Azure SQL Database Auditing

Contemplating the migration of on-premises systems to Microsoft Azure commonly raises security and confidentiality concerns, since it involves, at least to a certain degree, relinquishing some of the control inherent to environments you manage fully yourself. While a diminished level of control is certainly a factor to consider, especially with Platform as a Service resources such as Azure SQL Database, you have a range of techniques at your disposal for controlling and monitoring both the status of and access to your Azure-resident services, sufficient to alleviate most of these concerns. One of these techniques, SQL Database auditing, is the subject of this article.

You should note that this feature is relatively new, a reflection of the pace of change taking place in Azure. After a three-month preview period, it reached general availability on November 12, 2014, so it is supported in production deployments. However, it cannot be combined with legacy pricing tiers; if your SQL database has been configured as either Web or Business, you will first need to upgrade it to Basic, Standard, or Premium. This can be done directly from either of the two Azure portals (from the Scale tab in the classic portal or the Pricing tier blade in the Preview portal). Auditing's primary purpose is to track database-level, security-related events, allowing you to identify instances of unauthorized changes and access attempts while also addressing regulatory compliance requirements. Logging can be selectively turned on or off per event type; the available types include Data Access, Schema Changes, Data Changes, Security Failures, and Grant/Revoke Permissions.

Once enabled, entries representing these events are automatically recorded in a designated auditing table (named SQLDBAuditLogs20140928) within an Azure storage account of your choice, which implies an added monetary cost resulting from storage usage. There is also a slight latency impact (within the range of a few milliseconds), but no negative effect on database computational performance, since the auditing service employs its own, independent computing resources. When designating the storage account, you will have the option to specify whether its primary or secondary key should be used to facilitate secure access to its content. This setting accommodates scenarios where internal compliance or security procedures necessitate periodic rotation of the secrets protecting stored content, during which one of the keys gets regenerated while the other remains unchanged. This way you can ensure that auditing of security-related events continues without interruption. The underlying mechanism uses the key you selected to automatically generate a Shared Access Signature (SAS) token whose access is restricted to the SQLDBAuditLogs20140928 table with write-only permissions.
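The SAS token itself is generated and managed by the auditing service, but the underlying mechanism can be illustrated with a short sketch. The string-to-sign below follows the general pattern Azure Storage uses for table SAS tokens; treat the field order and values as an approximation for illustration (the exact layout is defined by the storage service version), and note that the account key shown is deliberately fake.

```python
import base64
import hashlib
import hmac

def sign_table_sas(account_key_b64: str, string_to_sign: str) -> str:
    """Compute the HMAC-SHA256 signature that Azure Storage expects in a SAS token."""
    key = base64.b64decode(account_key_b64)
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(digest).decode("utf-8")

# Illustrative string-to-sign for a table SAS scoped to the audit table with
# add-only (effectively write-only) permissions; field order is service-version
# specific, so consult the Azure Storage documentation before relying on it.
string_to_sign = "\n".join([
    "a",                                  # signed permissions: add
    "2014-11-12T00:00:00Z",               # signed start time
    "2015-11-12T00:00:00Z",               # signed expiry time
    "/myaccount/SQLDBAuditLogs20140928",  # canonicalized resource (table)
    "",                                   # signed identifier (none)
    "2014-02-14",                         # signed storage service version
])

# A fake base64-encoded key standing in for the primary or secondary account key.
fake_key = base64.b64encode(b"not-a-real-storage-account-key").decode("utf-8")
signature = sign_table_sas(fake_key, string_to_sign)
```

Because the signature is derived from the account key you selected, regenerating the other key during a rotation cycle leaves the auditing SAS valid, which is exactly what makes uninterrupted logging possible.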

Alternatively, it is also possible to specify the event types to be logged at the SQL Server level. The resulting settings can then be applied consistently to all SQL Databases hosted on the same SQL Server by selecting the database option to Inherit auditing settings from the server, which appears on the Auditing & Security tab in the classic portal and on the Auditing blade in the Preview portal. In either portal, you also have the ability to invoke the Save auditing as default command in the command bar of the classic portal (labeled Save as Default in the Preview portal), which resets the SQL Server auditing settings to match those assigned to the current database.

In addition, for auditing to actually take effect, you have to modify the connection strings that client applications and users employ to interact with the SQL Database. The appropriate format can be obtained directly from either portal by clicking the Show security enabled connection string link, which reveals the values for ADO.NET, ODBC (including Node.js), PHP, and JDBC-based methods. They differ from their original counterparts in one aspect only: the word secure is added to the portion of the string identifying the target SQL Server. For example, if the original ADO.NET connection string had the form Server=tcp:server_name.database.windows.net,1433;Database=AdventureWorks2012;User ID=login_name;Password=login_password;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30; (where server_name, login_name, and login_password represent, respectively, the SQL Server name, a login defined in the master database, and its password), then the new connection string subject to the auditing settings becomes simply Server=tcp:server_name.database.secure.windows.net,1433;Database=AdventureWorks2012;User ID=login_name;Password=login_password;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;. Keep in mind that audited connections, just like non-audited ones, must comply with firewall rules.
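The change amounts to a simple substitution on the server's DNS name, so it can be applied mechanically across a fleet of connection strings. The helper below is a hypothetical convenience function (not part of any Azure SDK) that makes the transformation explicit; the server and login names are placeholders.

```python
def to_security_enabled(connection_string: str) -> str:
    """Rewrite an Azure SQL connection string so that traffic is routed through
    the auditing endpoint, by inserting 'secure' into the server's DNS name."""
    return connection_string.replace(
        ".database.windows.net",
        ".database.secure.windows.net",
    )

original = (
    "Server=tcp:myserver.database.windows.net,1433;"
    "Database=AdventureWorks2012;User ID=mylogin;Password=mypassword;"
    "Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;"
)

audited = to_security_enabled(original)
# audited now targets myserver.database.secure.windows.net
```

Note that the function is idempotent: a string that already points at the secure endpoint no longer contains the plain .database.windows.net suffix, so applying it twice is harmless.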

Obviously, this alone does not suffice to ensure full compliance with your auditing requirements, since by default the original connection strings remain valid. To close this potential security gap, you can disable them by simply toggling the Security Enabled Access switch from Optional to Required in either of the two portals.

Once auditing has been enabled and enforced, events will start to accumulate in the SQLDBAuditLogs20140928 table in the storage account of your choice. You can view its raw content by leveraging any of the existing utilities that provide direct access to Azure table storage; a comprehensive listing is available on MSDN Blogs. In addition, you might want to take advantage of the preconfigured dashboard template, which you can download by clicking the Download audit logs report template link on the Auditing & Security tab in the classic portal or the Open in Excel link on the Auditing blade in the Preview portal. The template includes interactive and customizable Power View and PowerPivot-based reports, leveraging Business Intelligence capabilities in Excel 2013 that considerably simplify analysis of the audit log data. The template contains sample data representing a three-month period of auditing seven SQL Databases.

Once you download the template, you will also need to download and install the Power Query add-in for Excel. At that point, you can step through the process documented in the 02-Azure SQL DB Audit Logs Excel Report How-To.docx Word document referenced in the Excel workbook. This involves configuring the Source of events, pointing to the SQLDBAuditLogs20140928 table in the designated storage account, and potentially adjusting the record count limit (its default is 1 million, but the value can be increased to 6 million if needed). The workbook consists of multiple worksheets, including the following (note that only the first five are visible; the remaining ones can be displayed by using the Unhide entry in the context-sensitive menu of any visible worksheet):

  • Orientation – includes hyperlinked instructions enumerating the steps required to configure the template and a brief description of the reports associated with each of the visible worksheets.
  • Anomalies – displays a summary of events that warrant further investigation, such as large data set changes or invalid login attempts, along with rarely used security principals. It also provides a chart representing the percentage of data access failures, which might be indicative of improper application usage.
  • Drill Down – offers the ability to obtain more detailed statistics representing different aspects of audited events, such as event type, target database, or the security principal name used to access it.
  • Event Type Distribution – offers a comprehensive view of the distribution of audited event types along several dimensions, such as target database, month and day, weekday, or hour.
  • Event Time Analysis – assists with the analysis of event types over a period of time. By default, it consists of two charts: the first displaying the volume of data access, data change, and stored procedure execution events and the second referencing successful logons. The event types in scope can be easily added or removed by using event type slicers configurable independently for each chart.
  • Database Locations – contains the list of database locations (corresponding in our case to Azure data center locations). This must be filled out manually after the worksheet is unhidden.
  • Investigation View – generates a map-based visualization based on the Database Locations worksheet. It is intended to help identify trends in event types.
  • Time Analysis Upper Pivot and Time Analysis Lower Pivot – serve a supporting role for the Event Time Analysis worksheet and should not be modified.
  • Weekdays – serves a supporting role for the Event Type Distribution worksheet and should not be modified.
  • Events Over Time – provides total event counts over a period of time; useful in predictive analytics.
  • Data Access Over Time, Data Changes Over Time, Successful Logins Over Time, Schema Changes Over Time, and Stored Procedure Calls Over Time – provide, respectively, data access, data change, successful logon, schema change, and stored procedure call event counts over a period of time; useful in predictive analytics.

Note that it is relatively common for Web applications to establish connections to a database on behalf of their users by using a single set of credentials configured at the application level. While all such connections result in log entries representing the same security principal, you might be able to capture the name of the actual application user by incorporating it into the parameterized request sent to the SQL Database.
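One way to incorporate the end-user identity is to append it as an extra, always-true parameterized predicate, so it travels with every audited statement. The helper below is a hypothetical sketch: it only builds the SQL text and parameter tuple in the shape libraries such as pyodbc expect, and assumes the original statement already contains a WHERE clause to which the predicate can be appended.

```python
def build_audited_query(sql: str, params: tuple, app_user: str) -> tuple:
    """Append the application-level user name as an extra parameter so it is
    captured in the audit log alongside the shared database credentials."""
    # The added predicate compares the user name to itself, so it is always
    # true; its only purpose is to embed the end-user identity in the
    # parameterized statement recorded by auditing.
    audited_sql = sql + " AND ? = ?"
    return audited_sql, params + (app_user, app_user)

sql, params = build_audited_query(
    "SELECT * FROM Sales.SalesOrderHeader WHERE SalesOrderID = ?",
    (42,),
    "alice@example.com",
)
# With pyodbc or a similar driver: cursor.execute(sql, params)
```

Because the user name arrives as a parameter value rather than inline SQL text, this approach stays compatible with parameterized execution and does not open an injection path.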

This concludes our overview of SQL Database auditing functionality. In our upcoming articles we will be exploring other, recently announced enhancements to this PaaS service.


Marcin Policht
Being a long time reader, I'm proud to join the crowd of technology gurus gathered here. I have a fair share of Microsoft exams behind me, a couple of acronyms after my name - MCSE, MCSD, and MCT, decent familiarity with majority of MS BackOffice products (SMS, SQL, Exchange, IIS), programming and scripting languages (VB, C++, VBScript with wsh) and several years of practical experience with Windows environment administration and engineering. My focus these days is on SQL Server and Windows 2000, and I'll attempt to share the most interesting experiences with these products.
