Use the Reporting Services Execution
Log as a source for performance and access reporting. In this article, MSAS
Architect Bill Pearson leads practice in establishing the log as a data source
for administrative reporting.
About the Series ...
This article is a member of the series MSSQL Server 2000 Reporting Services.
The series is designed to introduce MSSQL Server 2000 Reporting Services ("Reporting
Services"), with the objective of presenting an overview of its features,
together with many tips and techniques for real-world use. For more
information on the series, as well as the hardware / software requirements
to prepare for the exercises we will undertake, please see my initial Database Journal article, A New Paradigm for Enterprise Reporting.
The assumptions underlying the series are that you have correctly installed
Reporting Services, including Service Pack 1, along with the applications
upon which it relies, and that you have access and the other rights /
privileges required to complete the steps we undertake in my articles. For
details on the specifics of the adjustments necessary to quickly allow full
freedom to complete the exercises in this and subsequent articles, as well as
important assumptions regarding rights and privileges in general, please see
earlier articles in the series, as well as the Reporting Services Books
Online. This article also relies upon sample files that are installed along with Reporting
Services in a "typical" scenario. If the samples have not been
installed in, or were removed from, your environment, the samples can be found
on the Reporting Services installation CD.
About the BlackBelt Articles ...
As I have stated in earlier BlackBelt articles, one of the greatest
challenges in writing tutorial / procedural articles is creating each article
to be a freestanding document that is complete unto itself. This is important,
because it means that readers can complete the lesson without reference to
previous articles or access to objects created elsewhere. When our objective
is the coverage of a specific technique surrounding one or more components of a
report, a given administrative function surrounding all reports, or another
scenario where the focus of the session is not the creation of reports
per se, this can be challenging, because a report or reports often have to be in place
before we can begin to cover the material with which the article concerns
itself. The BlackBelt articles represent an attempt to minimize the setup required in simply getting
to a point within an article where we can actually perform hands-on practice
with the component(s) under consideration. We will attempt to use existing
report samples or other "prefabricated" objects that either come
along as part of the installation of the applications involved, or that are
readily accessible to virtually any organization that has installed the
application. While we will often have to make modifications to the sample
involved (we will actually create a copy, to allow the original sample to
remain intact), to refine it to provide the backdrop we need to proceed with
the object or procedure upon which we wish to concentrate, we will still save a
great deal of time and distraction in getting to our objective. In some cases,
we will have to start from scratch with preparation, but my intention with the BlackBelt
articles will be to avoid this, if at all possible.
For more information about the BlackBelt articles, see the section
entitled "About the BlackBelt Articles" in BlackBelt
Components: Manage Nulls in OLAP Reports.
The optimization of Reporting Services' performance is, beyond argument, one of the more important
functions of the Administrator. In evaluating performance from various
perspectives at the Administrative level, one readily useful source of
information is the data we can obtain from the logs created by the system
itself. Reporting Services generates a number of log files to capture
information about server operations, status, and so forth. Within this group
of logs, which we will explore individually in prospective articles in
our series, the Report Server Execution Log is a great place to start in
setting up a basic performance and auditing analysis capability.
The Execution Log captures data specific to
individual reports, including when a given report was run, identification of
the user who ran it, where the report was delivered, and which rendering format was used.
In this session, we will:

- Overview Execution Logging in general, and the steps involved in creating a useful data source for reporting purposes from the log as it appears in its native form;
- Discuss potential complementary functions we can leverage from implementing an Execution Log reporting database to store the Execution Log data;
- Identify the location of the files provided with Reporting Services to assist us in transforming Execution Log data into a useful reporting data source;
- Open and execute the table creation script provided to create the schema for our new reporting database;
- Load and execute the DTS package provided to transform the Execution Log data and to populate the new database tables.
Prepare the Execution Log for Reporting
Objective and Business Scenario
The Execution Log contains data surrounding the reports that we execute on our server(s).
The information the log provides includes details that can assist us in
determining report efficiency and frequency of usage, together with myriad
facts that assist in resource planning, monitoring the activities of users, and
so forth. The information we can obtain from the Execution Log includes:
- The name of the Report Server Instance that handled a given request;
- The User Identifier;
- The Request Type (either User or System);
- The rendering Format of the report;
- The Parameter values used for the execution of a given report;
- The Start and Stop times of a given report process, from which duration is derived;
- The percentage of Processing Duration spent retrieving the data, processing the report, and rendering the report;
- The Source (including Live, Cache, Snapshot, or History) of the report execution;
- The Status of Processing for a given report (indicating either a successful process, or the code of the first Error Condition encountered);
- The Size of each generated report;
- The number of rows returned from the query(ies) underlying each report.
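For orientation, the details above correspond to columns of the internal ExecutionLog table in the report server catalog. A quick sketch query (assuming the default catalog name, ReportServer, and standard column names; adjust for your installation) might look like the following:

```sql
-- Peek at the raw Execution Log data (of limited use in this
-- pristine form, as noted). Assumes the default ReportServer catalog.
SELECT TOP 10
       InstanceName,        -- Report Server instance handling the request
       UserName,            -- user identifier
       Format,              -- rendering format
       TimeStart, TimeEnd,  -- start / stop times (duration derivable)
       Status,              -- success, or first error code encountered
       ByteCount,           -- size of the generated report
       [RowCount]           -- rows returned by the underlying query(ies)
FROM   ReportServer.dbo.ExecutionLog
ORDER BY TimeStart DESC
```

As we will see, the DTS package provided with Reporting Services relieves us of having to work with this table directly.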
Reporting Services logs
report execution details into an internal database table that, in its pristine
form, is of limited use to us in administrative reporting. The good news is
that, with the installation of Reporting Services, we are provided tools to
help us to extract the data we need to be able to report effectively on the
details we have mentioned. In this article, we will discuss these tools and
how we can use them to transform the raw data found in the Execution Log
into a form that we can easily access and query to produce various reports to
support us in the analysis and administration of Reporting Services.
For purposes of our
practice procedure, we will assume that information consumers at a client
location have expressed the need to monitor report activity, for purposes of
auditing and performance evaluation. The consumers have asked that we work
with them to create a database that is populated and maintained with this data
for many potential uses. We accept the project, and state that, while we have
created numerous relatively elaborate databases of this sort for clients in the
past, our initial pilot will include the Execution Log reporting capabilities
that can be implemented easily, from tools that accompany the installation of
Reporting Services. While we certainly
plan to expand the capabilities to many other measures for overall performance
and auditing monitoring, we like to begin with this step, we explain, because
Reporting Services provides a script to build the basic reporting data source
for the Execution Log that it generates, as well as a few basic sample
reports that we can run as soon as we transform data from the log to the newly
created database. We assure the client that we have found this to be a "quick
win" with regard to getting basic functionality in place, in many cases
(including this one) at a great time in the Reporting Services implementation -
just as we are beginning to write our first enterprise reports. The sample
reports serve several complementary functions in this scenario, including:
- Provision of a means for the Administrator to immediately see who is performing what actions within the new system;
- Provision of an excellent learning tool for new report authors, whereby they can obtain confirmation that their new reports execute;
- Presentation of basic report samples, upon which their ultimate users can base requests for customization. This
can save a great deal of time in obtaining the requirements from users who,
although they know they need these reports, may not know "where to start"
in providing a useful specification for their design.
- Facilitation of optimization of
reports and their underlying queries, both at times during and after creation,
and at various points in time as a part of ongoing system upkeep and
maintenance (tuning can be suggested, for example, as data sources grow in
size, as the number of information consumers increases, as security needs change, etc.)
Having imparted our
ideas to the client team, confirming our understanding of their needs as part
of the process, we begin the setup of the Execution Log reporting.
Considerations and Comments
Before we can work
with the Execution Log data, we need to take several preparatory steps
to make the data it contains useful from a reporting perspective. Our
preparation will include executing a DTS package that Reporting Services provides
to extract the data from the Execution Log and put it into a table
structure that we can easily query. As we noted earlier, the internal table in
the report server database does not present the data in a format that is
readily user-friendly. The DTS package included with Reporting Services
resolves this problem by collecting all of the data we need and putting it into
a table structure that is intuitive in its organization.
As part of our setup, we will be
copying files, creating a database, and performing other actions. To complete
the procedure you will need to have the appropriate access and privileges, at
the MSSQL Server 2000 level, within Reporting Services, MSSQL Server Enterprise
Manager, and elsewhere, to perform the respective actions. Setup is a one-time
event, unless we decide to recreate the table elsewhere, or to otherwise modify
the simple procedures to accommodate local requirements, and so forth.
As we have done in previous articles, we will make copies of the existing components
we use, so as to preserve the originals in a pristine condition, in case we
wish to perform (or are already performing) activities that require a "fresh start."
Create a Folder and
Populate it with the Components We Will Need
To prepare for our session with the Execution Log, we will create a folder
to house the DTS package to which we have referred earlier, along with
other files that we will need to complete our practice exercise. Working with
copies of the files will mean that we can comfortably proceed in a practice
environment without concern about "returning things to the way they were,"
should we later wish to perform similar exercises in our development or
production environments (where it would obviously be appropriate to maintain
proper source control over the respective objects). The folder can be created
anywhere on the PC that is convenient from a local perspective.
Launch Windows Explorer.
Navigate to a
convenient place to create the folder that will contain the "collateral"
for our practice session.
Create a new folder, named as follows:

RS013
The new folder appears similar to that depicted in Illustration 1.
Illustration 1: New
Folder in Place to House Lesson Components
Navigate to the following folder (or its equivalent, if your installation differs from the default):
C:\Program Files\Microsoft SQL Server\80\Tools\Reporting Services\ExecutionLog
The folder should contain
the following four files:
Select all four files noted above.
Right-click the highlighted selection.
Select Copy from the context menu that appears, as shown in Illustration 2.
Illustration 2: Select
Copy from the Context Menu
Navigate to, and enter, the RS013 folder we created above.
Right-click, and select Paste from the context menu that appears, as depicted in Illustration 3.
Illustration 3: Placing
Copies of the Components into the New Folder
Copies of the four files appear in the new folder.
We are now ready to create a database to house the data from the Execution Log.
In doing so, we will make use of the files we have placed in the folder we
created in this step.
Create a Database to Contain the Execution Log
Our next step is to create a database to contain the information from the Execution
Log. As we have discussed, Reporting Services creates a log to house this
information already - a log that we might simply access, and from which we
might report, without the need for an "intermediate" database. The
information that we find in the default database leaves a little to be desired
from the perspective of formatting, ease of reporting and so forth. We will
therefore take a gift as it is presented, and use the tools provided to create
a database and user-friendly schema to make our job easier. We can, after all,
modify the database as required in the future, should the need arise, based
upon client feedback or our own inclinations.
To begin the construction of our reporting data source, we will first create a new
database from the MSSQL Server Enterprise Manager, and then create the
constituent tables of the database with Query Analyzer. This step will
be facilitated by the table creation script that resides among the components
that we placed into the folder we created above.
Click the Start button, and navigate to the Microsoft MSSQL Server Enterprise Manager shortcut in the Programs
group, as appropriate.
The equivalent on my PC appears as shown in Illustration 4.
Illustration 4: Navigating
to the Enterprise Manager
Select Enterprise Manager from the menu.

The Enterprise Manager console opens.
Expand Microsoft SQL Servers by clicking the "+" sign to its
immediate left, as required.

The Enterprise Manager console appears as depicted in Illustration 5.
Illustration 5: The Enterprise Manager (Compressed) View
Expand the Databases
folder, so as to display its contents.
Right-click the Databases folder.

Select New Database ... in the context menu that appears, as shown in Illustration 6.
Illustration 6: Select
New Database ...
The Database Properties dialog appears, defaulted to the General tab.
Type the following into the Name box:

RSExecutionLog
The General tab of the Database Properties dialog appears
as depicted in Illustration 7.
Illustration 7: The
Database Properties Dialog - General Tab
Modify settings on the Data Files and Transaction Log tabs to place the
files in the file system location you desire (I left mine at default for this exercise).

Click OK to accept changes and to close the Database Properties dialog.
The dialog closes and the new database appears (after refreshing the Enterprise
Manager console, as required).
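Incidentally, the same database can be created directly from Query Analyzer with a single T-SQL statement (a sketch that accepts default file placement, rather than the dialog settings above):

```sql
-- Create the Execution Log reporting database with default file settings
CREATE DATABASE RSExecutionLog
```

Either approach leaves us with an empty database, ready for the table creation script that follows.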
Leaving Enterprise Manager open, click the Start button once again.
Navigate to the Microsoft MSSQL Server - Query Analyzer shortcut in the Programs
group, as appropriate.
The equivalent on my PC appears as shown in Illustration 8.
Select Query Analyzer from the menu.

The Connect to SQL Server dialog appears.
Select the appropriate server in the SQL Server selector box.
Click the radio button to the left of the appropriate authentication selection, supplying
login credentials as required.
The Connect to SQL Server dialog appears (with my settings for this exercise) as
depicted in Illustration 9.
Illustration 9: Connect
to SQL Server Dialog
Click the OK
button to log in to SQL Server via Query Analyzer.
A connection message briefly appears, and then the Query Analyzer console opens.
Select the RSExecutionLog database in the Database Selector atop the console.
Select File --> Open from the main menu.
Navigate to the folder we created above, RS013, in the Open Query File dialog that appears.
Select the following file within the RS013 folder:
The Open Query File dialog appears similar to that shown in Illustration 10.
Illustration 10: Selecting
the Table Creation Script
Click Open on the Open Query File dialog to open the script.
The table creation script opens, and appears in the Editor pane of the
Query window. This script, executed within the new RSExecutionLog
database, will create the tables that we will target for the Execution Log
data in later steps.
Click the Execute
Query button on the toolbar, as depicted (circled) in Illustration 11.
Illustration 11: Executing
the Query ...
NOTE: Had we named the database anything other than RSExecutionLog,
or if we were working with any but the local instance of MSSQL Server,
we would have needed to edit rsexecutionlog_update.ini for the
respective Execution Log database (referenced as "[Destination]"),
and / or Server identification, etc. Modifications can be accomplished
by accessing rsexecutionlog_update.ini (another of the files we copied
into the RS013 folder above), using Notepad or any other text editor, as
we shall see later in our exercises.
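For illustration only, the layout of rsexecutionlog_update.ini resembles the following sketch (the section names follow the note above, but the exact keys in your copy may differ, so edit the file itself rather than relying on this example):

```ini
; rsexecutionlog_update.ini -- illustrative sketch only
[Source]
SERVER=(local)
DATABASE=ReportServer

[Destination]
SERVER=(local)
DATABASE=RSExecutionLog
```

Because we accepted the default database name and the local server, no edits are required for our practice exercise.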
The script is executed, and we are informed of its successful completion, along
with the various events that have transpired, in the Messages pane of
the Query window, as shown in Illustration 12.
Illustration 12: Indication
of Successful Processing ...
At this point,
we can verify creation of the tables by shifting back to Enterprise Manager,
expanding the RSExecutionLog database we created earlier, and clicking
on Tables. Our tables appear as depicted in Illustration 13.
Illustration 13: Manual Verification
of Creation of Tables (Optional)
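As an alternative to the visual check, we can verify table creation from Query Analyzer itself with a simple metadata query (a sketch; run it with RSExecutionLog as the current database):

```sql
-- List the user tables created by the table creation script
USE RSExecutionLog
SELECT name
FROM   sysobjects
WHERE  xtype = 'U'   -- user tables only
ORDER BY name
```

The names returned should match the tables visible under the database in Enterprise Manager.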
Select File --> Exit to close Query Analyzer.
Our preparation of the target database is now complete. We are ready to populate
the tables with the contents of the Execution Log, which we will
accomplish in the next section.
Extract the Execution Log Data, and Populate the New Reporting Database
We will return to Enterprise Manager at this point, where we will extract
the data existing in the Execution Log. To achieve our objective, we
will use another tool, a DTS package, provided with Reporting Services in
the files that we copied into our RS013 folder earlier.
We will begin by returning to Enterprise Manager.
Return to the Enterprise
Manager console we left open earlier.
Select Open Package from the context menu that appears, as shown in Illustration 14.
Navigate to the folder we created above, RS013, in the Select File dialog that appears.
Select the following file within the RS013 folder:
The Select File dialog appears similar to that depicted in Illustration 15.
Illustration 15: Selecting
the Reporting Services Execution Log Update DTS Package
Select the DTS
package, and click Open on the Select File dialog to open it.
The Select Package dialog appears.
Select the RSExecutionLog_Update package, as shown in Illustration 16.
Illustration 16: Specifying
the RSExecutionLog_Update DTS Package
Click OK to open the DTS package.
The package appears in the DTS Designer, as depicted in Illustration 17.
Illustration 17: The RSExecutionLog_Update
DTS Package Appears
Select Package --> Properties from the DTS Designer main
menu, as shown in Illustration 18.
Illustration 18: Select
Package --> Properties ...
The DTS Package Properties dialog appears, defaulted to the General tab.
10. Click the Global Variables tab.
11. Click the Value box that corresponds to the Global Variable named sConfigINI.
12. Input the full path with file name for the rsexecutionlog_update.ini file that we copied, along with the other files provided by Reporting Services for Execution Log reporting, into our RS013 folder, in the Preparation phase of our exercise.
As an example, for my own environment, I enter the following path:
The DTS Package Properties dialog - Global Variables tab appears, with partially visible sConfigINI Value box entry, as depicted in Illustration 19.
13. Click OK to accept our addition, and to close the DTS Package Properties dialog.
We return to the DTS Designer.
14. Select Package --> Execute from the DTS Designer main menu, as shown in Illustration 20.
Illustration 20: Select Package --> Execute ...
The DTS Package executes, displaying step-by-step status messages within the Executing DTS Package viewer that appears next. Once the process is completed, we are greeted with a Package Execution Results message box, indicating successful completion, as depicted in Illustration 21.
Illustration 21: Success is Indicated - With Step-by-Step Statuses
15. Click the OK button on the Package Execution Results message box to close it.
16. Click Done to close the Executing DTS Package viewer.
We return to the DTS Designer.
17. Select Close under the Enterprise Manager icon in the upper left corner of the DTS Designer, as shown in Illustration 22 (NOT Exit under File in the adjacent main menu).
Illustration 22: Closing the DTS Designer to Return to Enterprise Manager
A DTS Designer message box appears, asking if we wish to save our changes to the DTS package, as depicted in Illustration 23.
Illustration 23: DTS Designer Message Box Asks If Changes Are to Be Saved ...
18. Click Yes to save our changes, and to close the message box and Executing DTS Package viewer.
We return to the Enterprise Manager console. A quick query or online browse of the new tables of the database that we have assembled will confirm the fact that they are, indeed, populated. The database we have created is "reporting ready," as we will see in our next session. A simple reverse-engineer of the database from MS Visio generates the schema shown in Illustration 24.
Illustration 24: Simple Database Diagram (MS Visio) of the New Reporting Database
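To give a feel for what "reporting ready" means, a query along the following lines summarizes execution counts and average durations by report (a sketch only; the table and column names are assumptions based upon the schema the table creation script generates, so adjust them to your actual schema):

```sql
-- Executions and average duration per report
-- (sketch; table / column names assumed from the generated schema)
SELECT   r.Name                                    AS ReportName,
         COUNT(*)                                  AS Executions,
         AVG(DATEDIFF(ss, e.TimeStart, e.TimeEnd)) AS AvgSeconds
FROM     ExecutionLogs e
         JOIN Reports r ON r.ReportID = e.ReportID
GROUP BY r.Name
ORDER BY Executions DESC
```

Queries of this sort form the basis of the sample reports we will publish in the next article.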
In our next article, we will examine the uses to which the new database might be put. We will perform setup and publication of the sample reports provided with Reporting Services as a "starter set," and then go beyond that set and create a customized report to show the ease with which we might help the information consumers we support to meet general and specific needs. We will propose other considerations that will add value to this already rich resource, and discuss ways in which we can leverage Execution Log reporting to make us better report writers from multiple perspectives.
In this article, we introduced Execution Logging, discussing its nature in general, and several ways in which it can assist us in understanding the performance of our reports, the actions of users, and a host of other details about the reports we create in Reporting Services. We conducted a brief preview of the steps involved in establishing the capability to perform Execution Log reporting, using the tools provided as samples with the Reporting Services installation. We then set the scene for our practice example, describing a hypothetical scenario where we determine that we can provide multiple benefits to the information consumers within a client organization by "installing" the Execution Log reporting capability.
We discussed reasons for creating a reporting database as opposed to simply using the Execution Log in its original state, then set about creating and populating the database. After copying the sample files included with Reporting Services to make this process easier for us, we opened and executed the table creation script to create the schema for our new reporting database. We then loaded and executed the DTS package provided to transform the Execution Log data and populate the new database tables, paving the way for exercises in reporting from the Execution Log data, which we will begin in our next article.
» See All Articles by Columnist William E. Pearson, III
Discuss this article in the MSSQL Server 2000 Reporting Services Forum.