Synopsis. Oracle Database 11gR1's new SQL Plan Management tool set gives any Oracle DBA the ability to capture and preserve the most efficient execution plans for any SQL statement. This article, the second in the series, explains how SQL Plan Management can be used during the upgrade of an existing Oracle 10gR2 database to an Oracle 11g environment, as well as during the deployment of brand-new application code, to effectively limit unexpected regression of SQL statement performance.
The previous article in this series provided a primer for Oracle Database 11g's new SQL Plan Management (SPM) features, including some rudimentary examples of how SPM tools help sift through different execution plans to identify and isolate only the best plans to improve SQL statement performance.
Since I've already explained and demonstrated the basic architecture of SQL Plan Management, I'll now shift our focus to two scenarios that every Oracle DBA has encountered: the upgrade of an existing Oracle database to a newer Oracle release, and the deployment of brand-new application code against an existing database. While the previous article demonstrated how to use Oracle 11g's new DBMS_SPM package to capture new SQL Plan Baselines, I'll use these two scenarios to illustrate the power of Oracle 11g Enterprise Manager Database Control's SQL Plan Control to capture new candidates for SQL Plan Baseline creation as well as manage existing SQL Plan Baselines.
SPM Scenario #1: Upgrading an Existing Database
In my humble opinion, the upgrade of an existing database to the next software release is one of the most stressful situations even an experienced Oracle DBA can undergo, because it can be extremely difficult to determine exactly which statements are performing poorly after the upgrade. In pre-Oracle 11g environments, I've found the best method to limit this uncertainty is to construct as close to a perfect duplicate of my production environment as possible on my QA server, capture an adequate SQL workload of the most critical statements for my applications, and capture EXPLAIN PLANs for those statements. Then, once I've thrown the switch to upgrade the QA database and environment to the next database release, I'd again generate EXPLAIN PLANs for these same statements and compare the results to find any regressing statements.
While this brute-force testing method has served me quite well prior to Oracle 11g, I've always hoped for a more reliable way to determine exactly what the impact of an upgrade would be upon the performance of existing SQL statements. As I demonstrated in the prior article series on SQL Performance Analyzer, it's now extremely simple to isolate any SQL statements whose performance would regress as a result of the upgrade, even for relatively minor intra-release upgrades (e.g. 11.1.0.5.0 to 11.1.0.6.0). Once all regressing SQL statements have been identified with SQL Performance Analyzer, I'll bring the full power of SQL Plan Management to bear by capturing those statements into a SQL Tuning Set (STS) before I perform the database upgrade.
Since an STS captures the statements' SQL text, bind variables, execution plans, and execution statistics, I'll retain them until just after the database version upgrade is completed, at which time I'll transform these statements' execution plans into SQL Plan Baselines. When these statements are executed for the first time against the upgraded database, the cost-based optimizer (CBO) detects that a SQL Plan Baseline is already available. If the plan the CBO generates is not among the accepted plans in that baseline, it will use the baselined plan instead. The end result is that a potentially serious SQL plan regression is completely avoided.
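Note that the optimizer consults the SMB only when the initialization parameter OPTIMIZER_USE_SQL_PLAN_BASELINES is set to TRUE, its default value in Oracle 11g. A quick sanity check, shown here as a simple sketch, confirms the relevant settings:

    -- Baselines are only used when OPTIMIZER_USE_SQL_PLAN_BASELINES is TRUE
    -- (the 11g default); automatic capture of new baselines is governed
    -- separately by OPTIMIZER_CAPTURE_SQL_PLAN_BASELINES (default FALSE).
    SELECT name, value
      FROM v$parameter
     WHERE name IN ('optimizer_use_sql_plan_baselines',
                    'optimizer_capture_sql_plan_baselines');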
Gathering a SQL Workload. To demonstrate these concepts, I'll first create a SQL workload against an Oracle 10gR2 database. I'll use the five queries against several tables in the Sales History (SH) schema shown in SPM_2_1.sql to simulate a SQL workload that would typically appear in a data warehousing application. Before I start the workload, however, I'll initiate the code shown in Listing 2.1. It uses DBMS_SQLTUNE.CAPTURE_CURSOR_CACHE_SQLSET to capture the workload's SQL statements into a SQL Tuning Set named STS_SPM_200.
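For reference, here is a minimal sketch of that capture step. The five-minute polling window, 60-second sampling interval, and the parsing-schema filter are illustrative assumptions; Listing 2.1 contains the version actually used:

    BEGIN
       -- Create an empty SQL Tuning Set to hold the captured workload
       DBMS_SQLTUNE.CREATE_SQLSET(
          sqlset_name => 'STS_SPM_200'
         ,description => 'Simulated SH data warehousing workload'
       );

       -- Poll the cursor cache for 5 minutes, sampling every 60 seconds,
       -- keeping only statements parsed in the SH schema
       DBMS_SQLTUNE.CAPTURE_CURSOR_CACHE_SQLSET(
          sqlset_name     => 'STS_SPM_200'
         ,time_limit      => 300
         ,repeat_interval => 60
         ,basic_filter    => q'[parsing_schema_name = 'SH']'
       );
    END;
    /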
Packaging and Exporting the SQL Tuning Set. Once I've captured the SQL workload into a SQL Tuning Set, I'll prepare to transfer it to an Oracle 11gR1 database. Listing 2.2 shows how to (a condensed sketch follows this list):
- Create staging tables as containers for the SQL Tuning Set STS_SPM_200
- Transfer the SQL Tuning Set into those staging tables via procedure DBMS_SQLTUNE.PACK_STGTAB_SQLSET
- Export those populated staging tables via DataPump Export into a dump set named DumpStagingTable.dmp
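Here is a condensed sketch of those three steps. The staging table name (STS_STAGING) and its owning schema (SPM_DEMO) are illustrative assumptions; Listing 2.2 shows the version actually used:

    BEGIN
       -- Build an empty staging table to hold the packed SQL Tuning Set
       DBMS_SQLTUNE.CREATE_STGTAB_SQLSET(
          table_name  => 'STS_STAGING'
         ,schema_name => 'SPM_DEMO'
       );

       -- Copy STS_SPM_200 into the staging table
       DBMS_SQLTUNE.PACK_STGTAB_SQLSET(
          sqlset_name          => 'STS_SPM_200'
         ,staging_table_name   => 'STS_STAGING'
         ,staging_schema_owner => 'SPM_DEMO'
       );
    END;
    /

    $ expdp spm_demo directory=DATA_PUMP_DIR dumpfile=DumpStagingTable.dmp tables=SPM_DEMO.STS_STAGING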
Transferring the SQL Tuning Set. After I've copied the DataPump dump set into the default DataPump directory of my target Oracle 11g database, I'll import the staging tables into the target Oracle 11gR1 database using Oracle DataPump Import and an appropriate parameter file. I'll then use the DBMS_SQLTUNE.UNPACK_STGTAB_SQLSET procedure to unpack the SQL workload stored in those staging tables. Listing 2.3 shows the details of the transfer process.
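A condensed sketch of the import and unpack steps follows; the staging table name (STS_STAGING, owned by SPM_DEMO) is the same illustrative assumption as above, and Listing 2.3 shows the actual process:

    $ impdp spm_demo directory=DATA_PUMP_DIR dumpfile=DumpStagingTable.dmp tables=SPM_DEMO.STS_STAGING

    BEGIN
       -- Recreate every SQL Tuning Set found in the imported staging table
       DBMS_SQLTUNE.UNPACK_STGTAB_SQLSET(
          sqlset_name          => '%'
         ,replace              => TRUE
         ,staging_table_name   => 'STS_STAGING'
         ,staging_schema_owner => 'SPM_DEMO'
       );
    END;
    /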
Loading the SQL Tuning Set Contents Into SPM. To complete the transformation of the statements stored in the SQL Tuning Set, I'll load these statements directly into the SQL Management Base. Instead of using DBMS_SPM procedures to accomplish this, I'll utilize Enterprise Manager Database Control's SQL Plan Control interface, the link to which is accessed from the Server page:
Figure 2.1. SQL Plan Control Home Panel
Note that the SQL Plan Baselines created during my prior experiments are listed here, including their current status and availability. As shown in Figure 2.2 below, I'll select SQL Tuning Set STS_SPM_200 from those available on this panel:
Figure 2.2. Loading a SQL Tuning Set Into the SMB
Once I've chosen the appropriate SQL Tuning Set and clicked the Load button, Oracle 11g automatically loads the SQL statements from the selected STS directly into the SMB, as shown in Figure 2.3 below:
Figure 2.3. Results of SQL Tuning Set Load
Note that the state of all five SQL Plan Baselines is ENABLED and ACCEPTED, which means that they are immediate candidates for use by the CBO when a SQL statement with a matching signature is encountered. I can also view the details of the corresponding EXPLAIN PLANs for each SQL Plan Baseline by clicking on the link in the Name column. Here's the result from selecting the baseline named SYS_SQL_685ea4c28ec1a586:
Figure 2.4.1. SQL Statement SPM_2_1.3 Explain Plan, Part 1
Figure 2.4.2. SQL Statement SPM_2_1.3 Explain Plan, Part 2
Proof of Concept. To prove that potential SQL statement performance regression is curtailed or eliminated, I'll now simply execute the same five SQL statements in SPM_2_1.sql and verify that the CBO is indeed choosing the pre-loaded SQL Plan Baselines instead of a newly parsed and less effective execution plan. The easiest way to determine this is to execute the DBMS_XPLAN.DISPLAY_SQL_PLAN_BASELINE function for these five statements while passing a value of TYPICAL +NOTE to the FORMAT parameter to request the display of the plan that the CBO has chosen. The +NOTE directive instructs the function to display a note if the CBO has indeed selected an existing SQL Plan Baseline for its execution plan. Listing 2.4 shows the results of executing this function for the five statements in SPM_2_1.sql.
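A simplified sketch of that verification follows; it assumes the five statements carry an SPM_2_1 comment tag in their SQL text, and Listing 2.4 contains the actual results:

    -- Display the baselined plan (and the CBO's note) for each matching statement
    SELECT t.plan_table_output
      FROM dba_sql_plan_baselines b
          ,TABLE(
             DBMS_XPLAN.DISPLAY_SQL_PLAN_BASELINE(
                sql_handle => b.sql_handle
               ,format     => 'TYPICAL +NOTE'
             )
           ) t
     WHERE b.sql_text LIKE '%SPM_2_1%';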
SPM Scenario #2: Deploying a New Application
Whenever I've deployed a new application against my production databases, I know that I'm essentially at the mercy of my application developers and quality assurance analysts. Even though I've insisted on tight database object naming standards, strict adherence to PL/SQL best practices, and extensive testing of SQL statements using various combinations of bind variable values, I also know that these guidelines are sometimes unrealistic. When my shop's application development team and QA resources can't seem to agree upon a set of standards, or if my senior IT management decides that we're spending too much time testing, I know there's a good chance that a newly developed in-house application will wreak havoc on the performance of a (hitherto) well-tuned Oracle database.
Another vector for potentially poor database performance is those marvelous third-party applications that we're sometimes forced to implement by executive fiat. Every experienced Oracle DBA has encountered this situation at least once: an outside application is causing extremely poor performance, and upon investigation, it turns out the application has been written using what I like to call "agnostic SQL" by an offshore development team whose marching orders are to ensure that the application can run on any database in any OS environment. The end result, of course, is SQL whose source code is unavailable for direct tuning because it's buried deep in the application layer.
The good news is that Oracle 11g does offer some hope for avoiding degraded performance when deploying a brand-new application in either circumstance:
- For an in-house application, I only need to capture that application's most typical SQL statements into a SQL Tuning Set while it's undergoing evaluation in the QA environment.
- For a third-party application, I can ask the vendor to send me all (or even just the most commonly executed) SQL statements before the application is implemented so that I can capture and evaluate them ahead of time. Alternatively, I can capture the third-party application's most common SQL statements directly from the Library Cache while they're being executed in a QA or development environment.
Once I've captured the new application's SQL statements, I can use the existing QA or development database to capture their corresponding SQL Plan Baselines, and then transfer those baselines directly to the production database's SMB. The end result? These execution plans will already be present when the application is eventually deployed, thus relieving the CBO of the responsibility for building the execution plans for these statements on the fly when the application first hits production. Best of all, if better execution plans do eventually arise for these statements, they will be captured in the plan history and can be evolved into the accepted baselines.
Preparing the Simulation. Before I can simulate this scenario, I have some setup tasks to perform. I've gathered them in Listing 2.5:
- I'll first clear any SQL Plan Baselines from the SMB of my current Oracle 11g database using function DBMS_SPM.DROP_SQL_PLAN_BASELINE. This function accepts as arguments the SQL handle and plan name and then removes the corresponding SQL Plan Baseline from the SMB. I've written an anonymous PL/SQL block (see the sketch after this list) that uses a cursor to capture all SQL Plan Baselines whose SQL text contains a comment like SPM_2, and then feeds the corresponding SQL handle / plan name combinations to the function. This method yields fine-grained control of the content of the SMB; I'll use it several times throughout these scenarios to selectively depopulate the SMB when necessary.
- I'll then construct the components that will form the basis for a new Sales Force Administration application. As shown in SFA_Setup.sql, I'll create a new schema owner (SFA), grant it appropriate system and object privileges, and finally create and populate several new tables within this schema.
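Here is a minimal sketch of that anonymous PL/SQL block, assuming the SPM_2 comment tag described above:

    DECLARE
       n PLS_INTEGER;
    BEGIN
       -- Drop every baseline whose SQL text carries the SPM_2 comment tag
       FOR r IN (SELECT sql_handle, plan_name
                   FROM dba_sql_plan_baselines
                  WHERE sql_text LIKE '%SPM_2%') LOOP
          n := DBMS_SPM.DROP_SQL_PLAN_BASELINE(
                  sql_handle => r.sql_handle
                 ,plan_name  => r.plan_name
               );
       END LOOP;
    END;
    /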
Capturing SQL Plan Baselines. I've illustrated the steps to capture a simulated SQL workload for this new application in Listing 2.6:
- To simulate the capture of a sample workload in a testing environment, I'll first ensure that the Library Cache and Database Buffer Cache of my current Oracle 11g database are empty before I execute the code shown in SPM_2_2.sql. These six queries, all labeled with a comment (SPM_2_2) for easier identification, show several different ways a user might decide to combine information about my sales force at the Region, District, and Territory level from the new application's SFA schema with historical sales information in the Sales History (SH) schema. Note that I'm also using bind variables generously in these queries so that I'll have the opportunity to evaluate other execution plans in future demonstrations.
- Once the workload is generated and these six queries have been parsed into the Oracle 11g database's Library Cache, it's relatively simple to capture their SQL Plan Baselines into the SMB via function DBMS_SPM.LOAD_PLANS_FROM_CURSOR_CACHE. This function's filtering capabilities allow me to capture only those SQL statements whose text contains the comment SPM_2_2, as the sketch after this list shows.
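A condensed sketch of that capture call, assuming the SPM_2_2 comment tag:

    DECLARE
       n PLS_INTEGER;
    BEGIN
       -- Load only those cursors whose SQL text contains the SPM_2_2 tag
       n := DBMS_SPM.LOAD_PLANS_FROM_CURSOR_CACHE(
               attribute_name  => 'SQL_TEXT'
              ,attribute_value => '%SPM_2_2%'
            );
       DBMS_OUTPUT.PUT_LINE(n || ' plan(s) loaded into the SMB.');
    END;
    /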
Exporting SQL Plan Baselines From a Test Environment. As shown in Listing 2.7, I'll export the captured SQL Plan Baselines from my simulated testing environment (a condensed sketch follows this list):
- First, I'll use procedure DBMS_SPM.CREATE_STGTAB_BASELINE to create a SQL Plan Management staging table.
- I'll then use function DBMS_SPM.PACK_STGTAB_BASELINE to populate that staging table with only those SQL Plan Baselines created by the SYS user.
- Finally, I'll invoke DataPump Export to export the metadata and contents of that staging table.
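Here is a condensed sketch of these three steps. The staging table name (SPM_STAGING), its owner (SFA), and the dump file name are illustrative assumptions; Listing 2.7 shows the version actually used:

    DECLARE
       n NUMBER;
    BEGIN
       -- Build an empty SQL Plan Management staging table
       DBMS_SPM.CREATE_STGTAB_BASELINE(
          table_name  => 'SPM_STAGING'
         ,table_owner => 'SFA'
       );

       -- Pack only those baselines created by SYS into the staging table
       n := DBMS_SPM.PACK_STGTAB_BASELINE(
               table_name  => 'SPM_STAGING'
              ,table_owner => 'SFA'
              ,creator     => 'SYS'
            );
    END;
    /

    $ expdp sfa directory=DATA_PUMP_DIR dumpfile=spm_staging.dmp tables=SFA.SPM_STAGING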
Importing SQL Plan Baselines Into a Production Environment. To conclude this scenario, I'll simulate the deployment of the SQL Plan Baselines to a production environment (see Listing 2.8; a condensed sketch follows this list):
- First, I'll simply import the SQL Plan Baseline staging table back into my target production environment.
- Since I'm using the same Oracle 11g database to simulate both test and production environments, I'll once again use function DBMS_SPM.DROP_SQL_PLAN_BASELINE to clear out any SQL Plan Baselines that contain a comment of SPM_2.
- I'll then reconstitute those baselines directly within my production database's SMB via function DBMS_SPM.UNPACK_STGTAB_BASELINE.
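A condensed sketch of the import and unpack steps, using the same illustrative names as above (the drop step is the same anonymous block sketched earlier):

    $ impdp sfa directory=DATA_PUMP_DIR dumpfile=spm_staging.dmp tables=SFA.SPM_STAGING

    DECLARE
       n NUMBER;
    BEGIN
       -- Recreate the baselines in the "production" SMB from the staging table
       n := DBMS_SPM.UNPACK_STGTAB_BASELINE(
               table_name  => 'SPM_STAGING'
              ,table_owner => 'SFA'
            );
       DBMS_OUTPUT.PUT_LINE(n || ' baseline plan(s) unpacked.');
    END;
    /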
Proof of Concept. I've verified the successful unpacking by querying data dictionary view DBA_SQL_PLAN_BASELINES. (I used the same query shown in Listing 1.3 in the prior article in this series.) The SQL Plan Baselines for my six target queries appear in the resulting output in Listing 2.9, tagged with an ORIGIN value of MANUAL-LOAD to indicate they originated via DBA intervention instead of the automatic SQL Plan Baseline capture method.
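A simplified version of that verification query might look like this:

    SELECT sql_handle, plan_name, origin, enabled, accepted
      FROM dba_sql_plan_baselines
     WHERE sql_text LIKE '%SPM_2_2%'
     ORDER BY sql_handle, plan_name;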
Next Steps
I've demonstrated how SQL Plan Baselines can effectively limit (if not eliminate) unexpected SQL statement regression during database upgrades, and how to capture SQL Plan Baselines before brand-new application code is deployed to limit plan regression as well. In the next and final article in this series, I'll illustrate another scenario in which SQL Plan Management can help prepare the way for an upcoming database upgrade without having to export SQL Plan Baselines between Oracle 10g and 11g databases. I'll also discuss how to:
- Control the automatic evolution of SQL Plan Baselines
- Employ manual methods to override plan evolution
- Manage and monitor the automatic purging of outmoded SQL Plan Baselines
References and Additional Reading
While I'm hopeful that I've given you a thorough grounding in the technical aspects of the features I've discussed in this article, I'm also sure that better documentation may have become available since it was published. I therefore strongly suggest that you take a close look at the corresponding Oracle documentation on these features to obtain a crystal-clear understanding before attempting to implement them in a production environment. Please note that I've drawn upon the following Oracle Database 11gR1 documentation for the deeper technical details of this article:
B28274-01 Oracle Database 11gR1 Performance Tuning Guide
B28279-02 Oracle Database 11gR1 New Features Guide
B28419-02 Oracle Database 11gR1 PL/SQL Packages and Types Reference