Thursday, May 30, 2013

[SSIS] Finding your connection strings in SQL Server 2008/R2

"Bad artists imitate - great artists steal" Picasso Banksy

I initially wasn't sure if this was worth posting - if you have the same problem, you've probably come across the same links.  But since I don't see a soup-to-nuts version, here goes.

Problem: you're migrating servers, and need to see what SSIS packages might be affected, so you need the connection strings for all of your SSIS packages.  You could open up each package and check the connections - but that's... inefficient.

If you're storing the packages on disk, you could write a short (power)shell script to scan the SSIS package files.  But if they're in MSDB, you need to pull the package XML out of the system tables, then parse the XML.  This applies to 2008/R2 (and to 2012, if you're not using SSISDB).  In 2012, SSISDB holds the packages differently; I'm awaiting an answer from one of the MS people on the SSIS team (thanks to Jamie Thomson for forwarding my tweet to him).

Code cribbed from somebody, possibly Feodor Georgiev (I found two links, below), and the XML parse was a question I asked on StackOverflow.  Thanks to Davmos for the help!
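In the meantime, here's a soup-to-nuts sketch of the MSDB version.  This is my reconstruction for the 2008/R2 package format - the DTS namespace and property paths can vary by version, so verify the XPath against one of your own packages first:

```sql
-- Sketch: pull each package's XML out of msdb and shred out the connection managers.
-- sysssispackages stores the XML as image, so double-cast to get to xml.
WITH XMLNAMESPACES ('www.microsoft.com/SqlServer/Dts' AS DTS)
SELECT  p.name AS package_name,
        cm.c.value('(./DTS:Property[@DTS:Name="ObjectName"])[1]',
                   'varchar(200)')  AS connection_name,
        cm.c.value('(./DTS:ObjectData/DTS:ConnectionManager/DTS:Property[@DTS:Name="ConnectionString"])[1]',
                   'varchar(4000)') AS connection_string
FROM msdb.dbo.sysssispackages p
CROSS APPLY (SELECT CAST(CAST(p.packagedata AS varbinary(max)) AS xml) AS pkg_xml) x
CROSS APPLY x.pkg_xml.nodes('/DTS:Executable/DTS:ConnectionManager') cm(c);
```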

Wednesday, May 22, 2013

[Powershell] Grabbing and parsing the RSS feed for new SQL Server Releases

(update 2013/05/23 fixed the date format issue; uses smalldatetime now)

My boss asked me to scrape a web site for release/security info for SQL Server.  Okay.

First attempt required a stop at stackoverflow, where I was reminded about hashtables (thanks again to Grainer!).  And after several hours, I had found the correct Microsoft blog for security updates and had this:

$posts = @() # resets $posts between runs - without this, re-running in the ISE kept appending to the old results and broke things but good

[xml]$rss = Invoke-WebRequest

$rss.SelectNodes('//item') | % {
    $posts += New-Object psobject -Property @{
        Title = $_.Title.Trim()
        Thedate = [datetime]$_.pubDate
        Guid = $_.Guid.Trim()
        Content = $_.Encoded."#cdata-section".Trim()
    }
}

$posts | select-object # | where-object {$_.Title -like '*sql*' -or $_.Guid -like '*sql*'}
#and then insert into a table

Awesome!  I feel good - mission accomplished.  Worked on it at home (yes, nerd here, big shock).
Then the boss looks and says "no, I didn't want that, I wanted the patches and CUs".

Crap.  So let's go find a different blog.  Oh yeah, I have one specifically for SP/CUs!  (In Google Reader, naturally... RIP.  OT: go to Newsblur and CommaFeed!)

Fortunately, this blog is much easier to parse, so I don't need a massive hashtable - just two small calculated properties.  The first one below reformats the pubdate as a datetime; the second renames the "Description" field so that I don't end up with a field named after a semi-reserved word.  Use Chad Miller's awesome scripts to turn it into a datatable and write it to the database, and voila!  (Alternatively, you should be able to use my Extensible Powershell Repository, elsewhere on this blog, and just have a 2-line PS script that is invoked via the EPR.)

Now, I just have to use that with a MERGE statement to add only the new records, and email me when a new record comes in. 
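A sketch of that MERGE - the staging table name here is hypothetical (write the datatable there first), and I'm keying on link since the feed doesn't offer a cleaner natural key:

```sql
-- Sketch: insert only rows we haven't seen before.
MERGE myrepository.dbo.SQL_Server_Releases_from_RSS AS tgt
USING myrepository.dbo.SQL_Server_Releases_staging AS src
      ON tgt.link = src.link
WHEN NOT MATCHED BY TARGET THEN
    INSERT (title, link, pubdate, descript)
    VALUES (src.title, src.link, src.pubdate, src.descript);
-- then, if @@ROWCOUNT > 0, call sp_send_dbmail to alert on the new rows
```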

# CREATE TABLE SQL_Server_Releases_from_RSS (
#     title varchar(200),
#     link varchar(500),
#     pubdate smalldatetime,
#     descript varchar(MAX) -- I really don't want a field named "description"
# )
# Scripts from
. C:\sql_tools\out-datatable.ps1  #Chad Miller rocks!
. C:\sql_tools\write-datatable.ps1
. C:\sql_tools\add-sqltable.ps1

[xml]$rss = Invoke-WebRequest

$datatable = $rss.rss.channel.item | select title, link, @{Name="pubdate";Expression={get-date ($_.pubdate) -format "yyyy/MM/dd HH:mm"}}, @{Name="descript";Expression={$_."Description"}} | out-datatable

write-datatable -ServerInstance "sql_repository" -Database myrepository -TableName SQL_Server_Releases_from_RSS -data $datatable

Thursday, May 16, 2013

[Extended Events] list of predicates that you can use to filter actions

I've been using XE lately to get a list of servers that connect to a server I'm going to upgrade.  That way I know what the downstream effects are beforehand.

However, these are busy servers - even with connection pooling, we can get hundreds of connections a minute. Which means even minimal logging will get big, fast.

So there are two ways to prevent that:
  1. Predicates to filter before it hits the target
  2. Using a histogram to store in buckets.
However, a histogram can only store one value and a count.  I haven't figured out a way to combine fields (and according to Jonathan Kehayias, you can't), so I requested that Microsoft add a multi-field histogram - which got closed as Won't Fix.
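For reference, a histogram target bucketing logins by hostname would look something like this (fragment only; session and event names assumed, and source_type = 1 tells it the source is an action/predicate source rather than an event column):

```sql
-- Sketch: one bucket per client_hostname, each with a count - but only that one field.
ADD TARGET package0.histogram (
    SET filtering_event_name = N'sqlserver.login',
        source = N'sqlserver.client_hostname',
        source_type = 1)
```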

So, back to  #1, and looking at Predicates. 

Fortunately, I found a blog post with a lot of details about it (yup, a 2-year-old post that's now incredibly relevant):


SELECT name, description,
    (SELECT name FROM sys.dm_xe_packages WHERE guid = o.package_guid) package
FROM sys.dm_xe_objects o
WHERE object_type = 'pred_compare'

Which led me to:

From the blog post:
"Calling a pred_compare directly follows this syntax:
package_name.pred_compare(field/pred_source, value)
[...]ADD EVENT wait_info (WHERE package0.greater_than_max_int64(duration, 500)

Okay, I can do that.

              (sqlserver.like_i_sql_unicode_string(sqlserver.client_hostname,N'laptop%'))

which gives me everything from our laptops.  Close.  Can I do a NOT LIKE?

              (NOT sqlserver.like_i_sql_unicode_string(sqlserver.client_hostname,N'laptop%'))

Which, yes, works.  
So my query to look for logins not from our laptops, and excluding certain servers, looks like...

ADD EVENT sqlserver.login(
    WHERE ([sqlserver].client_hostname NOT LIKE '%-SERV-%')
      AND (NOT sqlserver.like_i_sql_unicode_string(sqlserver.client_hostname,N'laptop%')))
ADD TARGET package0.ring_buffer(SET max_memory = 4096)
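Putting it together, a fuller sketch of the session DDL - the session name and the ACTION list are my additions, and I've used the pred_compare form for both filters:

```sql
-- Sketch: full session, capturing hostname/app/user for each qualifying login.
CREATE EVENT SESSION [upgrade_login_audit] ON SERVER
ADD EVENT sqlserver.login (
    ACTION (sqlserver.client_hostname, sqlserver.client_app_name, sqlserver.nt_username)
    WHERE (NOT sqlserver.like_i_sql_unicode_string(sqlserver.client_hostname, N'%-SERV-%'))
      AND (NOT sqlserver.like_i_sql_unicode_string(sqlserver.client_hostname, N'laptop%')))
ADD TARGET package0.ring_buffer (SET max_memory = 4096);
GO
ALTER EVENT SESSION [upgrade_login_audit] ON SERVER STATE = START;
```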

And, typing this in... I see I'm already excluding using a LIKE.  Why'd I go through all this effort again?  Um, knowledge, I guess?  : )

Thursday, May 2, 2013

[SSIS 2012] Using SQL Sentry & custom code to send detailed email on failure

(update 2013/07/31 dealing with "cancelled")

This is a simple modification of my original post: it sends an email with specific job failure info, rather than the nondescript "To view the details for the execution, right-click on the Integration Services Catalog, and open the [All Executions] report", which is pretty freaking useless.

We use SQL Sentry, which makes this part simple (they have their own method of logging SSIS failures, but it has several prereqs I don't like - specifically, it changes the step from SSIS to CmdExec, and executes the parameter-laden string).  My original post/SP required you to add a job step to each job - this one gets set on the server.  Just add an SQL Sentry action ("SQL Server Job Agent: Failure") and add "Execute SQL", running the following code, and it takes care of every job on the server. 

exec yourdbname.dbo.ssis_job_failure_info_fromjobname @job_name = '<%ObjectName%>'
(where yourdbname is wherever you store the SP)

As a bonus, I use the SQL Sentry email SP, so it appears to come from SQL Sentry.  That could be useful for you, based on your filtering.

One note: if the job fails because of an SSIS timeout, no email is sent.  That's because, while the job failed, technically the package doesn't know it failed... or something like that.  Read the Connect item for more details.  Supposedly it's fixed in SP1, and it hasn't happened to me lately - but if it happens to you, there are a handful of fixes available.

  1. Add the 5 indexes as per
  2. Change Retention period as per: 
    1. Don't just set it to 90 if you have a lot of history! You need to iterate through days, running the maint job between each.
  3. Change DB size settings as per same post
  4. Ensure Database is in SIMPLE mode.
  5. (possibly turn on snapshot isolation)
  6. (I also turn on Async Statistics Update) 
  7. There is a fix from MS!  SP1 CU4 has one.  It says it just adds indexes, but it must also change the delete SP, since adding the indexes by hand did NOT fix it for me.

And here's the modified SP: