Database Benchmarking

Wednesday Jan 26th 2005 by Steve Callan

Steve Callan introduces you to what the benchmark tests are, who controls or regulates them, and shows how Oracle compares to other database systems.

If you have seen advertising literature from major database system vendors, invariably you have seen statements about how that database performed against other systems (benchmark results). All of the major vendors support and participate in benchmarking tests because, to some degree, they have to. The mindset must be along the lines of "Everyone else is doing it, so we'd look odd not participating in the tests ourselves - even though most analysts agree the tests are somewhat artificial and flawed."

A good introduction to database performance benchmarking can be found in an Oracle Magazine article (March-April 2001) by Fred Sandsmark. And despite what you may think before reading the article, Mr. Sandsmark does not give Oracle preferential treatment just because he wrote it for Oracle. Who is the referee or judge of database benchmarking? No one is going to believe that Oracle, Microsoft, or anyone else is impartial, and why should they be? They are businesses with the same end goal in mind: increased profitability for owners and shareholders. When Oracle, for example, claims it has the best-performing database on System X using Processor Y, that is probably true, and the results can be verified, but when you see just how many system/platform combinations appear in the benchmarking results, and what those results were, the value of the claim may be diminished.

One important concept to take away from this discussion is that there is no single, all-encompassing, definitive test that allows a vendor to claim its system is the best one out there, no ifs, ands, or buts. For Oracle, Microsoft, IBM, or Sybase to claim to be the best overall is simply not true. A particular system can be the best on a particular platform under certain conditions, but claiming to be the best overall makes a vendor's credibility suspect.

The purpose of this article - from an Oracle DBA standpoint - is to introduce you to what the benchmark tests are, who controls or regulates them, and how Oracle compares to other database systems. Three letters come into play over and over again when vendors compare themselves to one another: TCO (total cost of ownership). Defining what factors into TCO, and how exactly you measure it, is an issue in and of itself.

When someone (your management at budget time?) says, "Sybase rocks" or "SQL Server rules, just look at these benchmark results," it is useful to know what those results mean, where they came from, and how they were obtained (under what conditions).

The Transaction Processing Performance Council

Yes, it is another one of those somewhere-on-the-Internet entities that wields power and control over our lives. The TPC "defines transaction processing and database benchmarks and delivers trusted results to the industry." On the TPC's home page, you can see who the major members are, as shown in the picture below.

As you can readily see, all of the big names - platform, operating system and database system - are represented.

From the "About the TPC" page, the TPC "produces benchmarks that measure transaction processing (TP) and database (DB) performance in terms of how many transactions a given system and database can perform per unit of time, e.g., transactions per second or transactions per minute." Two commonly used metrics are variations of transactions per unit of time and cost per transaction.

The TPC benchmarks are divided into four categories: TPC-C, TPC-H, TPC-R, and TPC-W, and each category has its own set of metrics, as just mentioned. In the interest of space and time, let's focus on two of them: TPC-C and TPC-R.

The TPC-C Benchmark

From the TPC Web site, the TPC-C benchmark is defined as:

TPC-C simulates a complete computing environment where a population of users executes transactions against a database. The benchmark is centered around the principal activities (transactions) of an order-entry environment. These transactions include entering and delivering orders, recording payments, checking the status of orders, and monitoring the level of stock at the warehouses. While the benchmark portrays the activity of a wholesale supplier, TPC-C is not limited to the activity of any particular business segment, but, rather represents any industry that must manage, sell, or distribute a product or service.

TPC-C involves a mix of five concurrent transactions of different types and complexity either executed on-line or queued for deferred execution. It does so by exercising a breadth of system components associated with such environments, which are characterized by:

  • The simultaneous execution of multiple transaction types that span a breadth of complexity
  • On-line and deferred transaction execution modes
  • Multiple on-line terminal sessions
  • Moderate system and application execution time
  • Significant disk input/output
  • Transaction integrity (ACID properties)
  • Non-uniform distribution of data access through primary and secondary keys
  • Databases consisting of many tables with a wide variety of sizes, attributes, and relationships
  • Contention on data access and update
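
The weighted five-transaction mix described above can be sketched in a few lines. The percentages below approximate the spec's minimum mix requirements (Payment at least 43%, and Order-Status, Delivery, and Stock-Level at least 4% each, with New-Order making up the remainder); treat them as illustrative rather than authoritative:

```python
import random

# Approximate TPC-C transaction mix. New-Order is the measured transaction;
# the other percentages roughly reflect the spec's minimums (illustrative).
MIX = {
    "New-Order": 45,
    "Payment": 43,
    "Order-Status": 4,
    "Delivery": 4,
    "Stock-Level": 4,
}

def pick_transaction(rng):
    # Weighted random choice over the five transaction types.
    return rng.choices(list(MIX), weights=list(MIX.values()), k=1)[0]

rng = random.Random(42)
counts = {name: 0 for name in MIX}
for _ in range(100_000):
    counts[pick_transaction(rng)] += 1

# Only completed New-Order transactions count toward tpmC, even though
# the system under test must sustain the entire mix concurrently.
print(counts)
```

This is why the benchmark exercises "a breadth of system components": a system cannot ace tpmC by optimizing a single query path, because the measured transaction runs interleaved with the other four.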

TPC-C performance is measured in new-order transactions per minute. The primary metrics are the transaction rate (tpmC), the associated price per transaction ($/tpmC), and the availability date of the priced configuration.
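
The relationship between the two primary metrics can be shown with a small sketch. The system names and figures here are invented for illustration, not actual TPC submissions:

```python
# Hypothetical results: (system, tpmC, total priced-configuration cost in dollars).
# All figures are made up for illustration.
results = [
    ("System A", 1_200_000, 6_500_000),
    ("System B", 3_200_000, 16_700_000),
    ("System C",   700_000,  2_100_000),
]

# Price/tpmC is simply the priced configuration's cost divided by throughput.
for system, tpmc, cost in results:
    print(f"{system}: {tpmc:,} tpmC at ${cost / tpmc:.2f}/tpmC")

# Ranking by descending tpmC and by ascending $/tpmC can crown
# different "winners" -- exactly the point made later in this article.
best_throughput = max(results, key=lambda r: r[1])
best_value = min(results, key=lambda r: r[2] / r[1])
print("Highest tpmC:", best_throughput[0])
print("Lowest $/tpmC:", best_value[0])
```

In this toy data set, the fastest system is not the cheapest per transaction, which is why a single headline number rarely tells the whole story.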

TPC-C sounds like a reasonable test for a transactional database. However, if you look further into the TPC-C benchmark, you will see that there are multiple versions of the test, so results obtained under one version may not be comparable to the same system's results under the latest version (5.3). TPC-C is the most popular benchmark test, so its results are probably the ones you have seen Oracle advertise. As a test of this, search for "benchmark" on Oracle's home page, and go to the Benchmark Results - Transaction Processing link.

Let's drill down into the first bulleted item (TPC-C: Oracle and HP ... on Linux).

Reading the fine print at the bottom, Oracle cites the Transaction Processing Performance Council results, but look at the specifics of what it took to get those results. On that particular system/platform combination, using the specific test established by the TPC, Oracle exceeded one million transactions per minute. Again, this result is based on fairly exact conditions, and results may - and do - differ when the test is run on other system/platform combinations. Using only the tpmC metric, did anyone else exceed the one-million-transactions-per-minute mark?

Now that you know about the TPC and the results it lists, note that two of the top three tpmC results were obtained by none other than IBM - and one of those spots is first place. Visit the TPC results page and sort by descending tpmC. Oracle's results are impressive, but not nearly as impressive as those obtained with the IBM eServer/DB2-on-AIX combination, which posted nearly three times Oracle's tpmC. Further, the IBM competitor has a lower price per tpmC. However, don't try this test at home, boys and girls. If you click on the "IBM eServer p5 595 64p" link in the System column, you may be surprised to see how much this system costs (over 16 million dollars). The results shown at Oracle's Web site tell only part of the story - when using either tpmC or Price/tpmC, Oracle is not the best overall. Nevertheless, technically speaking, the results (using tpmC) shown by Oracle are the best on the market today, as the IBM "winner" isn't available until May 2005.

The TPC-R Benchmark

The other TPC benchmark category of interest is TPC-R. This benchmark is analogous to knowing the test questions ahead of time: in database terms, the vendor can front-load whatever structures help make the known queries run as fast as possible. The benchmark is controversial in that it is unclear what is really being measured and how one system's results can be compared to another's. Even the TPC has a disclaimer on the use of this benchmark ("The TPC believes that comparisons of TPC-R results measured against different database sizes are misleading and discourages such comparisons."). Only Oracle is shown on the results page.


However, on Oracle's Web site, there is an entire page dedicated to highlighting these results.


Take the TPC-R results with a grain of salt. The TPC itself discourages comparisons of these results, and Oracle shows how well it did by being the only contestant in the race. Is that a credibility problem, or just marketing/sales spin?

The Benchmark Tests

You can download the TPC-C specifications from the TPC Web site, along with FAQs and a PowerPoint presentation. Aside from its dated results, the PowerPoint presentation may be quite useful if you are the one tasked with explaining what benchmarking is. You can also use the information on the slides as a basis for recommending which database product to purchase (given a particular system) and which metrics to use.

For the TPC-R test, you can actually download the C files used to create ASCII files with delimited data - and this is something you can try at home. Watch your disk space, because the scale factor setting can use up all of the available space on your system. Your Dell Dimension 8200 running XP Professional with its 200MB database, unfortunately, was not what the TPC had in mind for testing high-end systems.
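
The idea behind those C programs is simple: emit delimited flat files whose volume grows with the scale factor. A toy Python sketch makes the disk-space warning concrete (the table layout, field names, and rows-per-unit figure below are invented, not the TPC's actual schema):

```python
import csv
import os
import tempfile

# A toy stand-in for the TPC's data-generation C programs: write
# pipe-delimited ASCII rows. Layout and row count are invented here.
ROWS_PER_SCALE_UNIT = 10_000

def generate(path, scale_factor):
    with open(path, "w", newline="") as f:
        writer = csv.writer(f, delimiter="|")
        for i in range(scale_factor * ROWS_PER_SCALE_UNIT):
            writer.writerow([i, f"customer_{i}", f"address_{i}", i % 100])

path = os.path.join(tempfile.gettempdir(), "toy_table.tbl")
generate(path, scale_factor=1)
size_sf1 = os.path.getsize(path)
generate(path, scale_factor=5)
size_sf5 = os.path.getsize(path)

# Output grows roughly linearly with the scale factor -- which is how
# an ambitious scale factor can fill a desktop machine's disk.
print(f"SF=1: {size_sf1:,} bytes; SF=5: {size_sf5:,} bytes")
```

Before running the real generator, it is worth doing this kind of back-of-the-envelope sizing: multiply a one-unit run's output by your target scale factor and compare it to your free disk space.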

In Closing

Whenever you see a claim by a database vendor about how its RDBMS can do great and wondrous things on a fill-in-the-blank system, you now know where to go to verify those claims for yourself, and, more importantly, you can see how that vendor's results compare to other database system/platform combinations. Oracle, for example, gives great press to its TPC results, but when you view them in comparison with how other combinations fared, you can develop a more informed opinion about what the results mean.

