Case Study: Storing, Querying, and Analyzing Performance: Pt. 3

Friday Dec 22nd 2000 by Bruce Szabo

We can now store and retrieve our multi-option responses in a single integer; but how efficient are our query routines? Time to step back and perform some good old-fashioned performance analysis.

Querying Performance

Using a novel method of storing data in a database has allowed survey data of different types to be held in a single table. The data can be queried and cross-referenced using fields from that one table, or joined to a second table when specific company information is needed. The last step in adopting any new methodology should be a look at its performance.

In my experience, most business environments overbuy on hardware, which has led me to be a tad lackadaisical when it comes to refining application performance. While writing this article, however, it became evident that a performance analysis was in order. My original method of analyzing performance was an ASP page that measured how long it took to query for specific articles. Because the focus of these articles is SQL, I decided to figure out how to analyze performance on the SQL server itself.

To analyze the methodology discussed in the previous two articles, I created two tables and populated each with 5,000 records. The first table has two columns, an ID column and an integer column (referred to as the integer table), while the second has 33 columns, an ID column and 32 logical columns (referred to as the logical table). Using Transact-SQL, I wrote the following queries to test performance. Each query is followed by its results.
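The original table definitions are not reproduced in the article, but the two layouts described above might look something like this sketch (the table and column names here are my assumptions, not the original schema):

```sql
-- Integer table: all 32 checkbox responses packed into one INT.
CREATE TABLE tblInteger (
    ID       INT IDENTITY PRIMARY KEY,
    Response INT NOT NULL            -- bit n set = option n checked
)

-- Logical table: one bit column per checkbox option.
CREATE TABLE tblLogical (
    ID    INT IDENTITY PRIMARY KEY,
    Opt1  BIT NOT NULL,
    Opt2  BIT NOT NULL,
    Opt3  BIT NOT NULL,
    -- ... columns Opt4 through Opt31 omitted for brevity ...
    Opt32 BIT NOT NULL
)
```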

Click here for code example 1.

Click here for code example 2.

The queries follow the same format. Two variables are declared to hold the start and stop times of the query. The start variable (@start) is set to the current time (getdate()) before the select statement; when the query finishes, the end variable (@finished) is set to the current time. Taking the difference between these two values in the final select statement yields the length of time the query took. As the results show, the logical bit query took less time. This was expected: finding a logical value directly should be quicker than performing a calculation on each integer. The 3 millisecond difference, however, seems a negligible price for the ease of use of the single-integer table.
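Since the linked code examples are no longer available, the timing harness described above can be sketched roughly as follows (the table and column names are my assumptions; DATEDIFF with the ms datepart returns the elapsed milliseconds):

```sql
DECLARE @start    DATETIME
DECLARE @finished DATETIME

SET @start = GETDATE()

-- Find every respondent who checked option 3 (bit value 4)
-- by masking the packed integer with bitwise AND.
SELECT ID
FROM   tblInteger
WHERE  Response & 4 = 4

SET @finished = GETDATE()

-- Elapsed time for the query, in milliseconds.
SELECT DATEDIFF(ms, @start, @finished) AS ElapsedMs
```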

When two bits were queried the results were more dramatic: the integer query took 33 milliseconds, while querying two bits from the table with logical fields took only 16 milliseconds. From a query perspective, then, it would make sense to use a table with logical fields to store the checkbox responses from the survey.
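As a sketch of the two-bit case (again assuming the hypothetical schemas above, with options 3 and 5 carrying bit values 4 and 16), the integer query must mask two bits at once while the logical query simply tests two columns:

```sql
-- Integer table: respondents who checked both option 3 and
-- option 5 (bit values 4 and 16, combined mask 20).
SELECT ID
FROM   tblInteger
WHERE  Response & 20 = 20

-- Logical table: the same question against two bit columns.
SELECT ID
FROM   tblLogical
WHERE  Opt3 = 1
  AND  Opt5 = 1
```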

It can be argued, however, that the space required to store the information is as important as query response time. The 5,000 records of the integer table required 12 pages, while the same information took 17 pages in the logical table. In SQL 7.0 a page is 8k in size, so this storage translates to 96k (8k * 12 pages) and 136k (8k * 17 pages), respectively. After adding another 5,000 records the ratio was similar: the integer table required 184k (8k * 23 pages) and the logical table 264k (8k * 33 pages).

In this three-part series I hope to have presented a method for translating data so it can be stored in a database (Storing Responses). Part 2 of the series discussed querying the data from the database (Querying Responses), while this final installment has examined the performance of the tables in the database.
