We have come a long way from Oracle's old Statspack reports.
Remember way back (OK, maybe not that far back) when we would run the old Statspack reports? A DBA would run begin-stats, wait a while, and then run end-stats. At the end of the end-stats routine, a report would kick out showing all the accumulated statistics for the period between begin-stats and end-stats. I know a lot of DBAs who just loved this. The information was fine, but what happened to the statistics after the report was generated? The data was gone, since begin-stats would truncate or delete all the tables that held the statistical data from the last time the procedures were executed. The data was very short-lived.
Enter the age of snapshots and what Oracle calls the Automatic Workload Repository (AWR). Now a DBA can schedule a collection procedure (a snapshot) that instructs Oracle to take the equivalent of end-stats. These statistics are stored in the database along with a predefined number of other snapshots. DBAs can simply give AWR more space and increase the retention period, and statistics can then be kept for as long as desired. This is a GREAT improvement, and I hope you can see the benefit. For a given database, DBAs can now go back in time and see exactly how it was performing at a particular point in time. In addition, DBAs can compare two different points in time to see what might have changed, or whether tuning attempts actually produced performance benefits. Very powerful!
The next logical step is to compare point-in-time snapshots from two different databases. Until now this was impossible; reports could only be run against a single database. This is where Oracle's transportable Automatic Workload Repository comes in. Now a DBA can take a set of statistics from one database and transport them into another. This could be a database that is currently doing some task (production/test/QA/stress) or, as I would suggest, a dedicated repository whose sole purpose is to house AWR data. You decide, but by creating a repository just for extracted AWR data, you are in essence able to perform detailed analysis without impacting your production systems.
Oracle Data Pump & DIRECTORY Objects
Before we get into the example, be aware that the utilities Oracle provides to transport AWR data use Data Pump and DIRECTORY objects. By default, after installation, Oracle has already created a DATA_PUMP_DIR directory object that can be used by these procedures, as the following query shows.
SQL> select * from DBA_DIRECTORIES where directory_name = 'DATA_PUMP_DIR';

OWNER DIRECTORY_NAME  DIRECTORY_PATH
----- --------------- ------------------------------------------------------
SYS   DATA_PUMP_DIR   /opt/app/oracle/product/10.2.0/db_1/admin/db01/dpdump/
For those who are security conscious and might have dropped this object, it can be re-created as follows.
SQL> CREATE DIRECTORY DATA_PUMP_DIR AS '/opt/app/oracle/product/10.2.0/db_1/admin/db01/dpdump/';
You can name your DIRECTORY object anything you wish and point it at any directory; the prior examples are just a starting point. Just keep in mind that you will need to grant the following authorizations to whoever uses Data Pump. This is not a big issue, as the transportable routines are supposed to be run as the SYS user, but I thought I would mention them here.
SQL> GRANT EXP_FULL_DATABASE TO <user>;
SQL> GRANT READ, WRITE ON DIRECTORY DATA_PUMP_DIR TO <user>;
How to Extract AWR Data
Oracle provides a script called awrextr.sql that prompts for the various pieces of information that identify a unique set of snapshots in the database. This script needs to be run as the SYS user, on a machine that has access to the $ORACLE_HOME/rdbms/admin directory. Simply log into the database as SYS and invoke the script. The banner at the beginning of awrextr.sql says it all about what is expected to complete the script successfully.
~~~~~~~~~~~~~
AWR EXTRACT
~~~~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
~  This script will extract the AWR data for a range of snapshots  ~
~  into a dump file.  The script will prompt users for the         ~
~  following information:                                          ~
~     (1) database id                                              ~
~     (2) snapshot range to extract                                ~
~     (3) name of directory object                                 ~
~     (4) name of dump file                                        ~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
During the run of the awrextr.sql script, Oracle helpfully displays the valid values you might want to use at each of the four prompts above. After you have entered this information, the extract routine takes on the look of a typical Data Pump run, showing you the objects being extracted along with row counts. Depending on the number of snapshots you plan to extract, this script could run for a while. If you plan to transport data between systems, I would suggest you do it in reasonable chunks.
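To give you a feel for it, a typical extract session looks something like the following. The DBID, snapshot IDs, and dump file name shown here are hypothetical, and the exact prompt wording may vary slightly between releases.

SQL> connect / as sysdba
SQL> @?/rdbms/admin/awrextr.sql
...
Enter value for dbid: 1234567890
Enter value for begin_snap: 100
Enter value for end_snap: 110
Enter value for directory_name: DATA_PUMP_DIR
Enter value for file_name: awrdata_100_110

From there, the script hands off to Data Pump and writes the dump file into the directory you named.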
How to Load AWR Data
Loading the extracted AWR data is just as easy as the extraction. Oracle provides a script called awrload.sql that also prompts for some basic information. This script, though, does not load directly into the SYS schema; it loads, via Data Pump, into a staging schema. At first I thought this was a bit silly, but I have grown to appreciate the fact that things can and will go wrong. If something does, you will be glad you used a staging area, as cleaning up AWR snapshot data is sometimes tricky. In addition, if your two databases don't share the same DATA_PUMP_DIR, you will have to copy the previously extracted file to the DATA_PUMP_DIR of the target database. Again, simply log into the target database as the SYS user and invoke the script. The banner at the beginning of awrload.sql says it all about what is expected to complete the script successfully.
~~~~~~~~~~
AWR LOAD
~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
~  This script will load the AWR data from a dump file. The   ~
~  script will prompt users for the following information:    ~
~     (1) name of directory object                            ~
~     (2) name of dump file                                   ~
~     (3) staging schema name to load AWR data into           ~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The one item not mentioned here is that you will be creating a new user for the staging schema. As such, you will be prompted for a DEFAULT TABLESPACE and a TEMPORARY TABLESPACE for this staging user. As a precautionary measure, check that the user name you choose for the staging schema is not already in use. There is a DROP USER at the end of this script, and I would hate for the script not only to load bogus information into an existing schema but also to drop the whole schema at the end.
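A typical load session looks something like this. The file name and staging schema shown are hypothetical; SYSAUX and TEMP are just common choices for the staging user's tablespaces, and the exact prompts may differ slightly between releases.

SQL> connect / as sysdba
SQL> @?/rdbms/admin/awrload.sql
...
Enter value for directory_name: DATA_PUMP_DIR
Enter value for file_name: awrdata_100_110
Enter value for schema_name: AWR_STAGE
Enter value for default_tablespace: SYSAUX
Enter value for temporary_tablespace: TEMP

Once the Data Pump import into the staging schema completes, the script moves the data into the AWR tables and then drops the staging user.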
Note: Don't put .dmp on the end of your file names when prompted for them; the scripts append that extension themselves.
What to DO NOW!
Most everyone is familiar with running AWR reports against a single instance. NOW we have multiple instances in a shared AWR repository. Yes, you can easily run the same single-instance reports you have grown to love, but it is much more interesting to compare various instances that might be used for the same purpose. Suppose you are in a development shop where you continually run an application through QA. Oftentimes you need to tune the database and rerun a QA test. After doing this and loading the AWR data into a repository, you can now compare the two runs to see whether there were any performance benefits. The script you want to use for this is called awrddrpi.sql. It will prompt you to specify two database instances (they could even be the same) and then prompt you for the snapshot periods to be compared. I have provided two snippets for your enjoyment to see the power of comparing different AWR information.
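Invoking the comparison report follows the same pattern as the other scripts; a session might start like this (the report type and instance/snapshot prompts that follow will depend on your repository's contents):

SQL> connect / as sysdba
SQL> @?/rdbms/admin/awrddrpi.sql

The script first asks for the report format (html or text), then for the DBID, instance number, and begin/end snapshot IDs of the first period, and finally the same details for the second period it should compare against.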
Where to go from here
Since all of these utilities are initiated from scripts in the $ORACLE_HOME/rdbms/admin directory, I always like to go and read those scripts to really figure out what they are doing. More often than not there are jewels to be had when you investigate how to call the same functionality from your own scripts, or create PL/SQL procedures that do it for you. The fact that these scripts extract and load through Data Pump means there is no reason you couldn't internalize the same routines in a set of PL/SQL packages tailored to your own environment. Here are a couple of the packages/procedures called; I am sure you can determine what they might do.
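If you read the scripts, you will find they hand off to the DBMS_SWRF_INTERNAL package. As the name suggests, this package is internal and undocumented, so treat the following as a sketch of what the scripts do rather than a supported API; the file names, snapshot IDs, and staging schema below are illustrative.

-- awrextr.sql ultimately calls something like:
SQL> exec DBMS_SWRF_INTERNAL.AWR_EXTRACT( -
>        dmpfile => 'awrdata_100_110', dmpdir => 'DATA_PUMP_DIR', -
>        bid     => 100,               eid    => 110);

-- awrload.sql loads into the staging schema, then moves the rows into AWR:
SQL> exec DBMS_SWRF_INTERNAL.AWR_LOAD( -
>        schname => 'AWR_STAGE', dmpfile => 'awrdata_100_110', -
>        dmpdir  => 'DATA_PUMP_DIR');
SQL> exec DBMS_SWRF_INTERNAL.MOVE_TO_AWR(schname => 'AWR_STAGE');

Wrapping calls like these in your own PL/SQL procedure is how you would automate regular extracts into a dedicated AWR repository.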
Honestly, I think the introduction of Oracle RAC really made Oracle think about how their previous utilities had been written. No longer do we as DBAs want to interact with only a single database instance. We want to compare, contrast, and do general investigative analysis across databases. I am personally glad RAC is around, if for nothing more than getting some more power into these utilities.
Note: See the Oracle Database 10g Licensing Information documentation for the licensing requirements around AWR.