Jan 02 2018

Every now and again as a Microsoft PFE, you get a chance to make a big difference for a customer. One such occasion happened recently when I was asked to help find a way to automate the daily checks the DBA had to perform every morning. The result was a PowerShell script that reduced that manual task from an hour to less than a minute.

You can read the full article here on MSDN.
https://blogs.msdn.microsoft.com/samlester/2017/12/29/sql-server-dba-morning-health-checks/

The PowerShell script can be downloaded from here.
https://github.com/PatrickKeisler/SQLMorningHealthChecks

Apr 18 2017

In February 2017, Microsoft announced the general availability of Blob Auditing for Azure SQL Database. While auditing features were available before in Azure, this is a huge leap forward, especially in having more granular control over what audit records are captured.

Before Blob Auditing, there was Table Auditing, which I like to equate to the C2 auditing feature of SQL Server: its only configurable options were ON or OFF. In reality, Table Auditing has a few more controls than that, but you get the idea. There was no way to audit actions against one specific table. Blob Auditing provides us with that level of granularity; however, controlling that granularity cannot be accomplished through the Azure Portal. It can only be done with PowerShell or the REST API.

In the image below, you can see that Blob Auditing is on, but we cannot see which actions are being collected.


Using PowerShell, we can easily see the default audit action groups.

Get-AzureRmSqlDatabaseAuditingPolicy `
  -ServerName 'imperialwalker' `
  -DatabaseName 'Lahman2015' `
  -ResourceGroupName 'MCPLABv2'


We can see there are three action groups listed: SUCCESSFUL_DATABASE_AUTHENTICATION_GROUP, FAILED_DATABASE_AUTHENTICATION_GROUP, and BATCH_COMPLETED_GROUP. Without even looking at the documentation, we can assume that we are auditing both successful and failed logins, as well as all successful batches against this database. When you compare the Azure action groups side-by-side with the box product, they line up almost exactly.

So how do we customize it further? Well, let's say our auditing requirements only need to capture changes to the structure of the database; for example, an ALTER TABLE. First, we need to remove BATCH_COMPLETED_GROUP and add DATABASE_OBJECT_CHANGE_GROUP. To accomplish this, we will use Set-AzureRmSqlDatabaseAuditingPolicy.

Set-AzureRmSqlDatabaseAuditingPolicy `
  -ServerName 'imperialwalker' `
  -DatabaseName 'Lahman2015' `
  -ResourceGroupName 'MCPLABv2' `
  -AuditActionGroup `
     'SUCCESSFUL_DATABASE_AUTHENTICATION_GROUP' `
    ,'FAILED_DATABASE_AUTHENTICATION_GROUP' `
    ,'DATABASE_OBJECT_CHANGE_GROUP'

To verify the changes were successful, we run Get-AzureRmSqlDatabaseAuditingPolicy again.


Now, we'll be able to collect audit records anytime a CREATE, ALTER, or DROP is executed against a database object. However, let's say we need something more granular. In our sample database, we have a table that stores salary data, and we need to audit anything that touches it. We are already covered for schema changes by the action group DATABASE_OBJECT_CHANGE_GROUP, but that doesn't audit DML changes. Adding BATCH_COMPLETED_GROUP would capture what we need, but that would cover all tables, and our requirement is for just one. This is where we can audit actions on specific objects. In the statement below, we add an audit action for SELECT, INSERT, UPDATE, and DELETE on the Salaries table.

Set-AzureRmSqlDatabaseAuditingPolicy `
  -ServerName 'imperialwalker' `
  -DatabaseName 'Lahman2015' `
  -ResourceGroupName 'MCPLABv2' `
  -AuditActionGroup `
     'SUCCESSFUL_DATABASE_AUTHENTICATION_GROUP' `
    ,'FAILED_DATABASE_AUTHENTICATION_GROUP' `
    ,'DATABASE_OBJECT_CHANGE_GROUP' `
  -AuditAction 'SELECT, INSERT, UPDATE, DELETE ON dbo.Salaries BY public'

To verify the changes were successful, we run Get-AzureRmSqlDatabaseAuditingPolicy again.


If you have multiple objects or actions to audit, simply separate them with commas, just like the AuditActionGroups parameter. The one key piece to remember is that you must specify all audit actions and action groups together with each execution of Set-AzureRmSqlDatabaseAuditingPolicy; there is no option to add or remove an individual audit item. This means if you have 24 actions to audit and you need to add one more, you have to specify all 25 in the same command.
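Because the policy is always replaced as a whole, one safe pattern is to read the current policy, append to its lists, and write everything back. The sketch below assumes the AuditAction and AuditActionGroup properties returned by Get-AzureRmSqlDatabaseAuditingPolicy; DATABASE_PERMISSION_CHANGE_GROUP is just an example group to add.

# Read the current policy, append one more action group, then
# re-apply the complete set of groups and actions together.
$policy = Get-AzureRmSqlDatabaseAuditingPolicy `
  -ServerName 'imperialwalker' `
  -DatabaseName 'Lahman2015' `
  -ResourceGroupName 'MCPLABv2'

$groups = $policy.AuditActionGroup + 'DATABASE_PERMISSION_CHANGE_GROUP'

Set-AzureRmSqlDatabaseAuditingPolicy `
  -ServerName 'imperialwalker' `
  -DatabaseName 'Lahman2015' `
  -ResourceGroupName 'MCPLABv2' `
  -AuditActionGroup $groups `
  -AuditAction $policy.AuditAction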

Now let’s run a few queries to test the audit. First, we’ll run a simple select from the Salaries table.

SELECT TOP 10 * FROM dbo.Salaries;
GO

Next, we’ll create a view that selects from the Salaries table.

DROP VIEW IF EXISTS dbo.PlayerSalaryByYear;
GO
CREATE VIEW dbo.PlayerSalaryByYear
AS
SELECT
   m.nameLast + ', ' + m.nameFirst AS 'Player'
  ,s.yearID
  ,t.name
  ,s.salary
FROM dbo.Salaries s JOIN dbo.[Master] m ON m.playerID = s.playerID
JOIN dbo.Teams t ON s.teamID = t.teamID AND s.yearID = t.yearID;
GO

Finally, we’ll select from that view.

SELECT * FROM dbo.PlayerSalaryByYear
WHERE Player = 'Jones, Chipper';
GO

Back in the Azure Portal, click the View button to see the captured audit records for the statements we just executed.


What is displayed is one line item for each audit record captured.


Selecting each record will open another blade with the contents of that record. In our example, we have one for the initial SELECT against the table, one for the CREATE VIEW statement, and one for the SELECT against the view which references the Salaries table.


While using the Azure Portal is a quick and easy way to view audit records, the best way to consume the records for reporting is to use the function sys.fn_get_audit_file(). This is the same function used in the box product; the only difference is we need to specify the Azure URL for the audit log. All audit logs are stored in a container named sqldbauditlogs. Inside that container, additional folders for the server name, database name, and a date/time stamp are created to further organize the logs. This is something to keep in mind if you plan to programmatically process the audit records.

SELECT *
FROM sys.fn_get_audit_file ('https://mcplabv2storage.blob.core.windows.net/sqldbauditlogs/imperialwalker/Lahman2015/SqlDbAuditing_Audit_NoRetention/2017-04-17/13_48_34_960_0.xel',default,default);
GO


If you don’t know the URL full path, you can use the Azure Storage Explorer to help find it.

As of this writing, there are two DMVs missing from Azure SQL Database: sys.dm_audit_actions and sys.dm_audit_class_type_map. These DMVs allow us to translate the action_id and class_type values into readable descriptions. Since they are not available in Azure, I have created my own versions of them as user tables within my database: dbo.audit_actions and dbo.audit_class_types. This allows me to join them against the audit function to produce a better report.
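Those lookup tables can be sketched as below. This is a minimal version: the column names mirror the box-product DMVs used in the join that follows, and the contents would need to be copied over from a box-product instance.

CREATE TABLE dbo.audit_actions (
    action_id varchar(4) NOT NULL,
    name nvarchar(128) NOT NULL
);

CREATE TABLE dbo.audit_class_types (
    class_type varchar(2) NOT NULL,
    class_type_desc nvarchar(128) NOT NULL,
    securable_class_desc nvarchar(128) NOT NULL
);
GO
-- On a box-product instance, the contents can be extracted with:
-- SELECT action_id, name FROM sys.dm_audit_actions;
-- SELECT class_type, class_type_desc, securable_class_desc
-- FROM sys.dm_audit_class_type_map;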

SELECT
   a.event_time
  ,aa.name AS 'action_name'
  ,c.securable_class_desc AS 'securable_class'
  ,c.class_type_desc AS 'class_type'
  ,a.statement
  ,a.client_ip
  ,a.application_name
FROM sys.fn_get_audit_file ('https://mcplabv2storage.blob.core.windows.net/sqldbauditlogs/imperialwalker/Lahman2015/SqlDbAuditing_Audit_NoRetention/2017-04-17/13_48_34_960_0.xel',default,default) a
LEFT JOIN dbo.audit_actions aa ON a.action_id = aa.action_id
LEFT JOIN dbo.audit_class_types c ON a.class_type = c.class_type;
GO


If you are familiar with auditing in the box product, then you might be aware that common properties like the client hostname (or IP address) and application name are not captured for each audit record. In Azure, however, they are collected and viewable in the client_ip and application_name columns. See the picture above.
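Those columns can also be used to filter a report. The query below is a sketch that reuses the audit-file URL from above; the application name shown is just an example value.

SELECT event_time, statement, client_ip, application_name
FROM sys.fn_get_audit_file ('https://mcplabv2storage.blob.core.windows.net/sqldbauditlogs/imperialwalker/Lahman2015/SqlDbAuditing_Audit_NoRetention/2017-04-17/13_48_34_960_0.xel',default,default)
WHERE application_name = 'Microsoft SQL Server Management Studio';
GO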

Next, let’s create a stored procedure that selects from that view and add an EXECUTE audit action for it.

DROP PROCEDURE IF EXISTS dbo.usp_PlayerSalaryByYear;
GO
CREATE PROCEDURE dbo.usp_PlayerSalaryByYear(@playerName varchar(100))
AS
SELECT * FROM dbo.PlayerSalaryByYear
WHERE Player = @playerName;
GO

Now to add the EXECUTE audit action via PowerShell.

Set-AzureRmSqlDatabaseAuditingPolicy `
  -ServerName 'imperialwalker' `
  -DatabaseName 'Lahman2015' `
  -ResourceGroupName 'MCPLABv2' `
  -AuditActionGroup `
     'SUCCESSFUL_DATABASE_AUTHENTICATION_GROUP' `
    ,'FAILED_DATABASE_AUTHENTICATION_GROUP' `
    ,'DATABASE_OBJECT_CHANGE_GROUP' `
  -AuditAction `
     'SELECT, INSERT, UPDATE, DELETE ON dbo.Salaries BY public' `
    ,'EXECUTE ON dbo.usp_PlayerSalaryByYear BY public'

Using Get-AzureRmSqlDatabaseAuditingPolicy, you can see the additional audit action that was added.


Now we need to execute the stored procedure to test.

EXECUTE dbo.usp_PlayerSalaryByYear 'McGriff, Fred';
GO

The Azure Portal shows the two audit records that were captured: one for the execution of the stored procedure and one for the SELECT against the underlying table.


Using the query below, we can extract some more useful data from the additional_information column. This will show the nested objects and we’ll be able to extract the parent object name.

SELECT
   a.event_time
  ,aa.name AS 'action_name'
  ,c.securable_class_desc AS 'securable_class'
  ,c.class_type_desc AS 'class_type'
  ,a.statement
  ,CONVERT(XML,a.additional_information).value('(/tsql_stack/frame/@database_name)[1]','varchar(100)')
  + '.' + CONVERT(XML,a.additional_information).value('(/tsql_stack/frame/@schema_name)[1]','varchar(100)')
  + '.' + CONVERT(XML,a.additional_information).value('(/tsql_stack/frame/@object_name)[1]','varchar(100)') AS parent_object_name
FROM sys.fn_get_audit_file ('https://mcplabv2storage.blob.core.windows.net/sqldbauditlogs/imperialwalker/Lahman2015/SqlDbAuditing_Audit_NoRetention/2017-04-17/17_32_29_811_0.xel',default,default) a
LEFT JOIN dbo.audit_actions aa ON a.action_id = aa.action_id
LEFT JOIN dbo.audit_class_types c ON a.class_type = c.class_type;
GO


As you can see, Blob Auditing for Azure SQL Database provides us with major improvements over Table Auditing, and gives us the flexibility and granular control that we are used to in the box product.

For more information on the auditing features for Azure SQL Database, follow these links.
https://docs.microsoft.com/en-us/azure/sql-database/sql-database-auditing
https://docs.microsoft.com/en-us/powershell/module/azurerm.sql/set-azurermsqldatabaseauditingpolicy
https://msdn.microsoft.com/library/azure/mt695939.aspx

Apr 13 2017

When working with the AzureRM PowerShell module, remember the module is constantly being updated to take advantage of new features added to Azure. To check the version of the AzureRM module currently installed on your system, use the following command.

Get-Module AzureRM -List | Select-Object Name, Version, Path

The screenshot below shows I am running version 1.4.0, followed by the output of the Get-AzureRmSqlDatabaseAuditingPolicy cmdlet.

Next, I install the latest AzureRM module.
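For example, from an elevated PowerShell session, the latest version can be pulled from the PowerShell Gallery (this assumes PowerShellGet is available on the machine):

# Install the newest AzureRM release from the PowerShell Gallery
Install-Module AzureRM -Force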

After the install is complete, I rerun the same commands. The screenshot below shows that I am now running version 3.8.0, followed by the same Get-AzureRmSqlDatabaseAuditingPolicy cmdlet.

Notice the difference in the output? There are several differences, but the newer version includes properties for AuditAction and AuditActionGroup, which were just recently added to Azure.

For more information on installing the AzureRM module, as well as running multiple versions side-by-side, follow these links.
https://docs.microsoft.com/en-us/powershell/azure/install-azurerm-ps
https://github.com/Azure/azure-powershell/releases

Sep 03 2013

As DBAs, we're often asked to troubleshoot performance issues for stored procedures. One of the most common tools at our disposal is the query execution plan cached in memory by SQL Server. Once we have the query plan, we can dissect what SQL Server is doing and hopefully find some places for improvement.

Grabbing the actual XML query plan for a stored procedure from the cache is fairly easy using the following query.

USE AdventureWorks2012;
GO
SELECT qp.query_plan FROM sys.dm_exec_procedure_stats ps
    JOIN sys.objects o ON ps.object_id = o.object_id
    JOIN sys.schemas s ON o.schema_id = s.schema_id
    CROSS APPLY sys.dm_exec_query_plan(ps.plan_handle) qp
WHERE ps.database_id = DB_ID()
    AND s.name = 'dbo'
    AND o.name = 'usp_MonsterStoredProcedure';
GO

From this point, we can open the XML query plan in Management Studio or Plan Explorer to start our investigation. But what happens if SQL Server returns NULL for the query plan?

Let's back up a little bit. We were pretty sure the query plan is still in cache, right? Let's verify it.

USE AdventureWorks2012;
GO
SELECT * FROM sys.dm_exec_procedure_stats ps
    JOIN sys.objects o on ps.object_id = o.object_id
WHERE o.name = 'usp_MonsterStoredProcedure';
GO

Sure enough. The query plan is still cached in memory, and we can even see the plan_handle.

So why did our first query not return the XML plan? Let's copy the plan_handle and manually run it through the sys.dm_exec_query_plan function.

SELECT * FROM sys.dm_exec_query_plan(0x05000500DD93100430BFF0750100000001000000000000000000000000000000000000000000000000000000);
GO

Why are we getting NULL returned for the XML query plan when we know it is in the cache? In this case, because the query plan is so large and complex, we're hitting an XML limitation within SQL Server: "XML datatype instance has too many levels of nested nodes. Maximum allowed depth is 128 levels."

Let's try to pull the text version of the query plan.

SELECT * FROM sys.dm_exec_text_query_plan(0x05000500DD93100430BFF0750100000001000000000000000000000000000000000000000000000000000000,DEFAULT,DEFAULT);
GO

It looks as though we have solved the issue; however, we didn't. Management Studio has a 65,535 character limit in grid results and an 8,192 character limit in text results, so our query plan has been truncated well before the end. Now it seems we are back to square one.

We still know the query plan is in cache; we just need a tool other than Management Studio to retrieve it. This is where PowerShell enters the picture.

With PowerShell, we can create a simple script to execute the sys.dm_exec_text_query_plan function and then output the data to a file. All we need is to pass two parameters: the first is the SQL Server name where the plan is cached, and the second is the plan_handle.

param (    
    [Parameter(Mandatory=$true)]
    [ValidateNotNullOrEmpty()]
    [string]
    $SqlInstance

   ,[Parameter(Mandatory=$true)]
    [ValidateNotNullOrEmpty()]
    [string]
    $PlanHandle
)

The final step will use System.IO.StreamWriter() to output the data to a file.
$stream = New-Object System.IO.StreamWriter($FileName)
$stream.WriteLine($QueryPlanText)

The PowerShell script will save the entire XML query plan to a file named output.sqlplan. As you can see below, the actual plan was over 5MB.

Finally, we're able to view the entire query plan in our favorite tool and see the complexity of the stored procedure.

This is just another example of why DBAs need to set aside some time to learn PowerShell. The entire script is posted below. Feel free to modify it as needed to fit your environment.

######################################################################################
#
#   File Name:    Get-QueryPlan.ps1
#
#   Applies to:   SQL Server 2008
#                 SQL Server 2008 R2
#                 SQL Server 2012
#
#   Purpose:      Used to retrieve an XML query plan from cache.
#
#   Prerequisite: Powershell must be installed.
#                 SQL Server components must be installed.
#
#   Parameters:   [string]$SqlInstance - SQL Server name (Ex: SERVER\INSTANCE)
#                 [string]$PlanHandle - Binary query handle
#
#   Author:       Patrick Keisler
#
#   Version:      1.0.0
#
#   Date:         08/30/2013
#
#   Help:         http://www.patrickkeisler.com/
#
######################################################################################

#Define input parameters
param ( 
  [Parameter(Mandatory=$true)]
  [ValidateNotNullOrEmpty()]
  [string]
  $SqlInstance
  
  ,[Parameter(Mandatory=$true)]
  [ValidateNotNullOrEmpty()]
  [string]
  $PlanHandle
  )

Write-Host "Script starting."

#Grab the path where the Powershell script was executed from.
$path = Split-Path $MyInvocation.MyCommand.Path

#Build the SQL Server connection objects
$conn = New-Object System.Data.SqlClient.SqlConnection
$builder = New-Object System.Data.SqlClient.SqlConnectionStringBuilder
$cmd = New-Object System.Data.SqlClient.SqlCommand

#Build the TSQL statement & connection string
$SqlCommand = "SELECT query_plan FROM sys.dm_exec_text_query_plan(" + $PlanHandle + ",DEFAULT,DEFAULT);"
$builder.psBase.DataSource = $SqlInstance
$builder.psBase.InitialCatalog = "master"
$builder.psBase.IntegratedSecurity = $true
$builder.psBase.ApplicationName = "Get-QueryPlan"
$builder.psBase.Pooling = $true
$builder.psBase.ConnectTimeout = 15
$conn.ConnectionString = $builder.ConnectionString
$cmd.Connection = $conn
$cmd.CommandText = $SqlCommand

try
{
 if ($conn.State -eq "Closed")
 {
  #Open a connection to SQL Server
  $conn.Open()
 }
 
 #Execute the TSQL statement
 [string]$QueryPlanText = $cmd.ExecuteScalar()

 #Write the output to a file
 $FileName = $path + "\output.sqlplan"
 $stream = New-Object System.IO.StreamWriter($FileName)
 $stream.WriteLine($QueryPlanText)

 if ($stream.BaseStream -ne $null)
 {
  #Close the stream object
  $stream.close()
 }

 if ($conn.State -eq "Open")
 {
  #Close the SQL Server connection
  $conn.Close()
 }
 
 Write-Host "Script completed successfully."
}
catch
{
 #Capture errors if needed
 if ($_.Exception.InnerException)
 {
  $Host.UI.WriteErrorLine("ERROR: " + $_.Exception.InnerException.Message)
  if ($_.Exception.InnerException.InnerException)
  {
   $Host.UI.WriteErrorLine("ERROR: " + $_.Exception.InnerException.InnerException.Message)
  }
 }
 else
 {
  $Host.UI.WriteErrorLine("ERROR: " + $_.Exception.Message)
 }
 
 Write-Host .
 Write-Host "ERROR: Script failed."
}