Monday, October 7, 2024

Azure SQL and MI - scripting out a database or just some objects, and putting it in Git.

I need this. You need this. We all need this. Source control all the things.



 <#

.SYNOPSIS

  Scripts a database or individual objects from SQL Server and commits/adds them to Git.

.EXAMPLE 

  sql_script_to_git.ps1 -server yourservername -Database yourdatabasename -ScriptType "FULL" -Author yourgitname -Comment "full checkin of database"


#>

# Purpose - given parameters, script out an object from a SQL Server, using SMO, then check it into git.

#Scripting code gleefully copied from Phil Factor - the great parts that work are his, the errors are mine

# https://www.simple-talk.com/sql/database-administration/automated-script-generation-with-powershell-and-smo/

#Other code from others copied and attributed below.


#mdb 2013/12/03 adding ScriptType so that we can add "create database"-specific changes ONLY. 

#     To do FULLs you must pass the -ScriptType 'Full' parameter. By default it will do objects.

#mdb 2013/12/05 trying to better bullet-proof the app against TF.exe failures, which appear to occur randomly

#mdb 2013/12/07 works great, though occasionally it can't script out an item. Trying to bulletproof that.

#mdb 2013/12/16 broke the individual-item portion, fixed. 

#mdb 2024/10/03 we're back, baby! Modifying to handle the sqlauth needed, the keyvault, and git instead of TFS.


#param has to be the first executable statement in the script.

param( 

        [Parameter(Mandatory=$true,Position=1)]

         [string]$Server, 

        [Parameter(Mandatory=$true,Position=2)]

         [string]$Database, 

        [string]$ScriptType ='Object',

        [string]$SchemaToScript, 

        [string]$ObjectToScript, 

        [string]$Author = 'powershellautomation0104', 

        [string]$Comment

      )


        #make sure SchemaToScript/ObjectToScript are empty strings (not null) when scripting out the entire DB; makes the checkin logic cleaner


if ($null -eq $SchemaToScript) {$SchemaToScript = ''}

if ($null -eq $ObjectToScript) {$ObjectToScript = ''}



#these are the parameters for testing

#$Server='myservername'# the server it is on

#$Database='mydbname' # the name of the database you want to script as objects

#$SchemaToScript = ''

#$ObjectToScript = ''

#$Author = 'mynameandemail'

#$ScriptType ='Full'

#$Comment = 'author test'


#setting up an error code for later

$myerror = 0


#getting the password from keyvault.

Connect-AzAccount -Tenant tenantnamegoeshere -Identity -AccountId acctnamehere -Subscription subnamehere

$sqlusername = "sqlloginhere"

$sqlpassword = Get-AzKeyVaultSecret -VaultName "keyvaulthere" -Name $sqlusername -AsPlainText


if ($comment -eq '')

{ $comment = "No Comment Given"}


if ($author -eq '')

{ $author = "authorhere"} 

#field is mandatory; if we don't know, use a known-valid value.


$ServerNameClean = "$(($server -split "\.")[0] -replace '[\\\/]','__')"

Clear-Host


#writing this out for logging and troubleshooting. These are all the parameters except ScriptType

write-host $Server, $Database, $SchemaToScript, $ObjectToScript, $Author, $Comment


#Git workspace folder - whatever you set it up as on your server

$DirectoryToSaveTo='c:\repos\DatabaseSchema' # the directory where you want to store them, within the git repo


#doing a pull prior to checking in, due to "error: failed to push some refs to"

set-location $DirectoryToSaveTo


#handling crappy git responses as per https://github.com/dahlbyk/posh-git/issues/109

$env:GIT_REDIRECT_STDERR = '2>&1'

& 'C:\Program Files\git\bin\git.exe' pull origin master




# Load SMO assembly, and if we are running SQL 2008 DLLs load the SMOExtended and SQLWMIManagement libraries

$v = [System.Reflection.Assembly]::LoadWithPartialName( 'Microsoft.SqlServer.SMO')

if ((($v.FullName.Split(','))[1].Split('='))[1].Split('.')[0] -ne '9') {

    [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMOExtended') | out-null

}

[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SmoEnum') | out-null

set-psdebug -strict # catch a few extra bugs

$ErrorActionPreference = "stop"

$My='Microsoft.SqlServer.Management.Smo'

$srv = new-object ("$My.Server") $Server # attach to the server


$conContext = $srv.ConnectionContext

$conContext.LoginSecure = $false

$conContext.Login = $sqlusername

$conContext.Password = $sqlpassword




if ($null -eq $srv.ServerType) # if it could NOT find the server

   {

   Write-Error "Sorry, but I could not find Server '$Server' "

   return

}


$scripter = new-object ("$My.Scripter") $srv # create the scripter

#Add the various options we care about

$scripter.Options.ToFileOnly = $true   

$scripter.Options.ExtendedProperties= $true # yes, we want these

$scripter.Options.DRIAll= $true # and all the constraints

$scripter.Options.Indexes= $true # Yup, these would be nice

$scripter.Options.Triggers= $true # This should be included

$scripter.Options.AppendToFile = $False

$scripter.Options.AllowSystemObjects = $False

$scripter.Options.ClusteredIndexes = $True


$scripter.Options.ScriptDrops = $False

$scripter.Options.IncludeHeaders = $False #so you do not get the one line "scripted at..." which would be NOW.

#$scripter.Options.ToFileOnly = $True

#$scripter.Options.Indexes = $True

$scripter.Options.Permissions = $True

$scripter.Options.WithDependencies = $False

$scripter.Options.Bindings = $true

$scripter.Options.IncludeDatabaseRoleMemberships = $true

#Phil Factor does a bit where he checks it in separately, but I'm doing it this way. The only downside is that SMO won't script out database-level properties.


#################################################

#First, script out the database "create" itself.# 

#################################################

if (($Database) -and $ObjectToScript -eq '')  #if database has a value but there is no object, script out the DB.

{

$db_scripter = new-object ("$My.Scripter") $srv # script out the database creation

$db_scripter.options=$scripter.options # with the same options

$db_scripter.options.filename="$($DirectoryToSaveTo)\$($ServerNameClean)\$($Database)\$($Database)_database_create.sql" # but pointed at the database-create file


#explicitly creating the path for the DB script here. We still need to do it for all the sub-types.

# Could probably move the $d declaration up here, but leaving it here for readability.

$SavePath="$($DirectoryToSaveTo)\$($ServerNameClean)\$($Database)"

# create the directory if necessary (SMO does not).

if (!( Test-Path -path $SavePath )) # create it if not existing

     {Try { New-Item $SavePath -type directory | out-null }

     Catch [system.exception]{

         Write-Error "error while creating '$SavePath' $_"

         return

      }

 }



#putting in a try/catch so we get error messages if it breaks, and it can continue.

try

    {

    $db_scripter.Script($srv.Databases[$Database]) # do it        

    }

Catch

    {  

    "Error Message trying to script out $SavePath\$Filename"

    $error[0].Exception

    }




###########################

## Scripting out Objects ##

###########################

# we now get all the object types except extended stored procedures


# first we get the bitmap of all the object types we want

# and we had to exclude a bunch in order to support Azure SQL Database/MI.

# tested by running stuff like this against every option in: https://learn.microsoft.com/en-us/dotnet/api/microsoft.sqlserver.management.smo.databaseobjecttypes?view=sql-smo-160 (SMO DatabaseObjectTypes Enum)

# $srv.databases[$Database].EnumObjects( [Microsoft.SqlServer.Management.Smo.DatabaseObjectTypes]::DatabaseEncryptionKey )

# -bxor in this case means "don't use these!". They are non-scriptable via SMO against Azure SQL DB, which makes sense, since Azure SQL DB doesn't support them. Unsure why SMO can't do DatabaseAuditSpecification, though.


$all =[long] [Microsoft.SqlServer.Management.Smo.DatabaseObjectTypes]::All `

-bxor [Microsoft.SqlServer.Management.Smo.DatabaseObjectTypes]::DatabaseEncryptionKey          `

-bxor [Microsoft.SqlServer.Management.Smo.DatabaseObjectTypes]::DatabaseAuditSpecification     `

-bxor [Microsoft.SqlServer.Management.Smo.DatabaseObjectTypes]::RemoteServiceBinding           `

-bxor [Microsoft.SqlServer.Management.Smo.DatabaseObjectTypes]::SearchPropertyList             `

-bxor [Microsoft.SqlServer.Management.Smo.DatabaseObjectTypes]::ServiceBroker                  `

-bxor [Microsoft.SqlServer.Management.Smo.DatabaseObjectTypes]::ServiceContract                `

-bxor [Microsoft.SqlServer.Management.Smo.DatabaseObjectTypes]::ServiceQueue                   `

-bxor [Microsoft.SqlServer.Management.Smo.DatabaseObjectTypes]::ServiceRoute                   `

-bxor [Microsoft.SqlServer.Management.Smo.DatabaseObjectTypes]::MessageType                    `

-bxor [Microsoft.SqlServer.Management.Smo.DatabaseObjectTypes]::Certificate                    `

-bxor [Microsoft.SqlServer.Management.Smo.DatabaseObjectTypes]::ExtendedStoredProcedure #we're ignoring this as per the original version, don't remember why.


# and we store them in a datatable

$d = new-object System.Data.Datatable






# get almost everything; skipping most service broker, information_schema, system_views, certificates (cannot be scripted)

#  there are other items that may need to be skipped, like SymmetricKeys in SSISDB

# Yes, I realize the irony in skipping SB given that it is powering this.  


        #putting in a try/catch so we get error messages if it breaks, and it can continue.

        #and note that we're excluding stuff that's above, but also other stuff that does enumerate but we don't want. 

        try

            {

            $d=$srv.databases[$Database].EnumObjects( $all) | `

                Where-Object {$_.Schema -ne 'sys'-and $_.Schema -ne "information_schema" -and $_.Schema -ne "INFORMATION_SCHEMA" -and $_.DatabaseObjectTypes -ne 'ServiceBroker' `

                -and $_.DatabaseObjectTypes -ne 'Certificate' `

                -and $_.DatabaseObjectTypes -ne 'MessageType' `

                -and $_.DatabaseObjectTypes -ne 'ServiceContract' `

                -and $_.DatabaseObjectTypes -ne 'ServiceQueue' `

                -and $_.DatabaseObjectTypes -ne 'ServiceRoute' `

                -and ($SchemaToScript -eq '' -or $_.Schema -eq $SchemaToScript) `

                -and (($ObjectToScript -eq '' -and $ScriptType -eq 'Full') -or $_.Name -eq $ObjectToScript) `

                -and (!(($_.DatabaseObjectTypes -eq 'Schema' -or $_.DatabaseObjectTypes -eq 'DatabaseRole') -and $_.Name  -match "db(_)*" -and $_.Name -ne 'dbo')) `

                -and ($_.Name -notmatch 'sp_MS*') }

            # mdb 2013/11/07 previous line skips replication objects. These comments sit below the code because comment lines would break the extended one-liner.

            # mdb 2024/10/03 previous line to that is because it would script out the db_datareader etc roles.

            }

        Catch

            {  

            "Error Message trying to enumerate the database - may be logshipped or being restored"

            $myerror = 1

            $error[0].Exception

            write-host $error

            }



# List every item that we are going to do

$d = $d | Sort-Object

# $d | select databaseobjecttypes, schema, name

 

if ($d.Count -gt 10000)

{

"skipping the database objects - more than 10000"

}


# Now write out each scriptable object as a file in the directory you specify

#it appears that an empty array never actually enters the FOREACH, leaving variables unset

# -and -$d.Count -ne 0

if ($myerror -eq 0 -and $d.Count -lt 10001) #20k of objects takes up 5gb of RAM in the PS script and causes problems

{

$d| FOREACH-OBJECT { # for every object we have in the datatable.

   #"" #blank line so each block of error messages is separated out

   $SavePath="$($DirectoryToSaveTo)\$ServerNameClean\$Database\$($_.DatabaseObjectTypes)"


   # create the directory if necessary (SMO does not).

   if (!( Test-Path -path $SavePath )) # create it if not existing

        {Try { New-Item $SavePath -type directory | out-null }

        Catch [system.exception]{

            Write-Error "error while creating '$SavePath' $_"

            return

         }

    }

    # tell the scripter object where to write it, and make sure it is actually writeable

    if ($_.schema)

    {

    $Filename = "$($_.schema -replace '[\\\/\:\.]','-').$($_.name -replace '[\\\/\:\.\ ]','-').sql";

    }

    else

    {

    $Filename = "$($_.name -replace '[\\\/\:\.]','-').sql";

    }


    $scripter.Options.FileName = "$SavePath\$Filename"

    $scripter.Options.FileName #print it out so we know what is being done


    $UrnCollection = new-object ('Microsoft.SqlServer.Management.Smo.urnCollection')

    $URNCollection.add($_.urn)



    try

    {

    $scripter.script($URNCollection)

    }

    Catch

    {  

    "Error Message trying to script out $SavePath\$Filename"

    $error[0].Exception

    }


    #"script done"



}

}

}



<# GIT checkin - somewhat easier than before #>


Write-Host "Checking in of Files to Git "


Set-Location "$DirectoryToSaveTo"


#force commit and push all files

#git add .

#git commit -m "$($Comment)" --author="$($Author)"

#git push -f --porcelain #the two switches don't work together.

#git push --porcelain


#doing it this way since the container(?) account doesn't have the path for git.

#and this is so incredibly finicky. can't use -m and -c, have to have values set for -c since it's a network service account, etc.

#you need to run, as a job step, 

#do NOT get clever with the path. Putting it in quotes so you can say "Program Files" causes failures. Maybe.

& 'C:\Program Files\git\bin\git.exe' add .

& 'C:\Program Files\git\bin\git.exe' commit -m "$($Comment)" --author="$($Author)"

& 'C:\Program Files\git\bin\git.exe' push --porcelain

# -m "$($Comment)" 


#go back to script directory

Set-Location $PSScriptRoot


Azure MI - Failed email -2147467259

A job step on an Azure Managed Instance that calls msdb.dbo.sp_send_dbmail (or possibly calling a linked server) can fail with error -2147467259.


The fix is to put "EXECUTE AS LOGIN = 'SQLAGENTUSER' " at the front of the code inside the job step. What's happening is that the login used by the agent (ContainerAdministrator) lacks permissions (in msdb for email, but also for linked servers).
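A hedged sketch of what the top of such a job step looks like with the fix in place; only the EXECUTE AS line comes from the actual fix, and the mail profile and recipient are placeholders:

```sql
-- First line of the job step: run as a login that has the msdb/linked-server perms
EXECUTE AS LOGIN = 'SQLAGENTUSER';

-- hypothetical mail call; profile name and recipient are placeholders
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'MyMailProfile',
    @recipients   = 'dba@example.com',
    @subject      = 'job step test',
    @body         = 'mail from Azure MI agent job';
```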

Thursday, June 6, 2024

Applying Query Store hints to fix cardinality estimation issues where you can't change the query.


This morning I had a performance issue on a piece of code that worked long ago on a different server, and they were trying to put it in place today.  It was SLOW. Like, 10 minutes slow. With the added bonus that it's done through a web app, so it never finishes, it just always times out. After dealing with various approaches, I finally tried using the old Cardinality Estimator, and it went from 10 minutes to 3 seconds. But the query is inside the application; it doesn't call a stored procedure. Which means the devs changing it is Non-Trivial. So I went to an updated version of an old trick - Query Store hints (the successor to Plan Guides).

In theory you can probably get enough info from the query store/plan cache to find the exact query without running a trace.


What I did: 

  • ran a profiler trace for that user/database. You could do this in Azure Data Studio for Azure SQL Database - I got "lazy", and since it was on a Managed Instance, used Profiler. Don't judge me. :)
  • found the exact statement that their code ran. Easiest way to get this (it needs to be EXACT down to the white space) is to right-click on the cell in profiler, extract events, save to a file, then copy/paste out.
  • found the AUDIT LOGIN from when it ran
  • launched SSMS
  • in one window, pasted all of the SET QUOTED_IDENTIFIER (etc.) statements from the AUDIT LOGIN, a newline, "go", another newline, then the copy/paste from that second step. NO trailing anything. Ran it. Waited the 10 minutes for it to finish. You need to do it this way because the application default settings are not the same as the SSMS defaults (for more info, read Erland's seminal work "Slow in the Application, Fast in SSMS?" https://www.sommarskog.se/query-plan-mysteries.html#defaultsettings, specifically section 2.5).
  • Expand that database in SSMS.
  • Query Store->Top Resource Consuming Queries
  • Mouse over and find the query in the left-side. Get the query_id (not the plan id). For me, it was 9605.
  • run this in that database: EXEC sys.sp_query_store_set_hints @query_id= 9605, @query_hints = N'OPTION(USE HINT(''FORCE_LEGACY_CARDINALITY_ESTIMATION''))';
  • have people test it and make sure it works.
  • later, go into Query Store->Tracked Queries, put in the number above (for me, 9605), and see if it's getting called, and how long it takes. Oddly, it doesn't show up in "Queries with forced plans". :( Unsure how to find queries that we've tuned this way; maybe Managing Query Store Hints - Simple Talk (red-gate.com)?
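On the "how do I find queries we've tuned this way" question, one option is that the hints themselves are queryable; a minimal sketch, using my query_id of 9605:

```sql
-- List every Query Store hint in this database
SELECT query_hint_id, query_id, query_hint_text, source_desc
FROM sys.query_store_query_hints;

-- And if the hint backfires, remove it:
EXEC sys.sp_query_store_clear_hints @query_id = 9605;
```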

Friday, May 10, 2024

ODBC OLEDB - returning the version of the actual driver using powershell.

 I have no idea why this is that difficult. The Get-OdbcDriver cmdlet's output is overly complicated to read, and doesn't return the actual DRIVER VERSION. The "DriverODBCVer" property, I believe, is the version of the ODBC specification that's supported.

odbc driver file versions (github.com)
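The gist below is the full version; the basic idea is a sketch like this, assuming Get-OdbcDriver's Attribute hashtable exposes the Driver DLL path (it did on the machines I tried):

```powershell
Get-OdbcDriver | ForEach-Object {
    # the Driver attribute is the path to the driver DLL; expand any %WINDIR%-style vars
    $dll = [Environment]::ExpandEnvironmentVariables($_.Attribute['Driver'])
    [PSCustomObject]@{
        Name        = $_.Name
        Platform    = $_.Platform
        DriverDll   = $dll
        # the file's VersionInfo holds the real driver version
        FileVersion = if ($dll -and (Test-Path $dll)) {
                          (Get-Item $dll).VersionInfo.FileVersion
                      } else { $null }
    }
} | Format-Table -AutoSize
```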

<script src="https://gist.github.com/mbourgon/05c75cda141745e4244bbe876cba5c10.js"></script>