Friday, December 12, 2014

[Presenting] What do I present on next?

Title says it all.  At this point I've pretty much hit every SQLSaturday that's easily reachable from here.  I've had a ball doing it, and the coworkers I drag along with me have had a lot of fun as well.

But I've presented on that topic (Event Notifications) at them.  So what's next?

I need to come up with a new session.  I can obviously just GO (maybe as a volunteer), but I think I have more fun when I'm there to present.  I wouldn't have thought so when I first started doing it, but it's the case and I'm cool with it. 

I just don't know what to present on.  Mark V, in his session, said that any topic is a good topic - everyone brings their experiences to it, and everyone explains it differently.  That being said, there are people who (in my mind) own their topic - they're of the MVP/MCM level (which I'm not in one case, and can never be in the other case), and their topic is freaking fantastic.

So, I want to come up with something useful to many people, and preferably something not often covered. 

Here are some of the thoughts that have been going through my head:

  • The $25,000 (per core) Pyramid - what Enterprise features don't you use?
aka
  • ($25,000 per) CORE - What is it good for?
This would probably be a relatively high-level session.  Cover Peer-to-Peer replication, Hekaton, Columnstore/Clustered Columnstore, things-that-used-to-be-Enterprise-but-are-now-Standard, AlwaysOn (AO), online restores, online index rebuilds, mirrored backups, partitioning, compression, Distributed Replay, CDC, etc.  Could even do non-Engine features like the Tabular Model (BISM) and DQS.

---------------------------------


  • Save all the things!  
A session about saving out various things for later, and parsing them for info.  I've got a solution built in PowerShell called the EPR that's saved my bacon a bunch (I'm using it right now to figure out when code was last run on all my servers), but it could also cover things like pulling connection strings out of SSIS packages, etc.  I could also expand parts of it into using PowerShell, using SMO, etc.

---------------------------------
  • SP_Server_Diagnostics for fun and profit
This would be a decent amount of work - while I have code that will shred out the results, there's a LOT more to it than that.  And I could expand and cover system_health.  The one problem is that Kehayias had a decent session about it at PASS 2013.


---------------------------------


Aaaaaaand.... that's all I have right now.  Any thoughts greatly appreciated.

Tuesday, December 9, 2014

[Presenting] SQLSatDC (SQL Saturday #347) was a blast!

A gigantic thank you to the whole team at SQLSaturday347, aka SQLSatDC 2014.

I had an absolute blast.  Rubbed shoulders with a bunch of people I know of, chatted with some SQLSat friends, met a bunch of new people I want to see more of, and even managed to take in the sights (and some tapas!).

Let's see, what did I see/do?

Thursday/Friday - tourista!

Friday night - speaker's dinner.  A nice little Spanish/LatinAmerican restaurant.  After a bit of "how do we order", managed to get everything straightened out, and had a blast.  Tapas, Paella, lots of SQL and non-SQL conversations, etc.

Saturday morning - going by the vendor tables, some groups I haven't seen before.   Always nice to see.  Wow, a Linchpin People table! (First time I've seen it, though I've run across at least 5 of them at different events, and they've been uniformly amazing folk)

 8:15am - me!  Standing room only, which as a presenter is honestly pretty cool.  Some good questions and comments, which means I need to incorporate them as well as strip out other things.  At this point I've gotten my presentation up to right at 70 minutes, sans breathing, so it's time to strip out some of the less-important slides.


9:40 - David Dye, Dealing with Errors in SSIS 2012.  Some interesting stuff from a SSIS developer perspective, how to save errors out, prevention, dealing with them, etc, etc.  Less interesting for me as a DBA, though if my devs did those fixes now I'd know where to look.  As is, I have to keep using my own email notifier for SSIS Project Deployment Model failures.


11:05 - Jason Brimhall, Murder They Wrote.  It's the one-hour summary of the day-long version of a week-long training session.  *Whew*!  Very nicely done - while I knew many of the tips, he did a great job explaining them.  The only odd part was that very few people seemed to know what Clue was(!).  And anybody who can make a "boot to the head" reference (Yay Frantics!) is a hero to me.

Lunch: Fajitas!  And a head shot!  (Thanks, Chris!)  And chatting with one of the attendees for a bit, talking about EN, CI, and other two-letter acronyms.  And I ended up getting pinned down by one of the Redgate folk and looking at their SQLLitehouse tool, which does a lot of what EN (and therefore my presentation) does.  Wonder if I can steal any ideas?  Is that legit, if I'm doing this for fun?

1:30 - Jimmy May, Columnstore, Flipping the Faster Bit.  Apparently I missed the part where he did a handstand, but I'll tell you what, he was the most animated speaker I've seen in a while.  Very knowledgeable, some useful information (and I'm running CS/CCS!), and I got to pick his brain after.  My only suggestion (and I made it to him already) was that if we're in the session, we don't need as many "marketerization" slides (but he knew that already).  But it did explain some of the tuple mover issues I've seen, so... WIN!

2:55 - Samuel Vanga, SSIS Data Flow under the hood.  Something I haven't seen much of: a closer-to-the-metal session about tuning SSIS, almost from a DBA perspective.  I swear I've seen some of it before (Tim Mitchell, probably), but it's nice to see more internals covered.  Normally it's very much an "oh, everything works" kind of session, but this was more about low-level tuning, async vs non-async, spilling to disk, etc.

Anyhow, I'll definitely try and come back - they had a great turnout with some fantastic speakers!

Tuesday, December 2, 2014

[Presenting] SQLSatDC, aka SQLSat347, this weekend (Dec 6 2014)!

Come see me speak in Washington DC!  (Okay, fine, and another 29 speakers like Grant Fritchey, Allan White, Steve Jones, Jessica Moss, Randy Knight, Jason Brimhall, Jimmy May...).

There's even a pre-con on Friday - day-long training on BIMLScript (different venue, though, I think).

SQL Saturday DC 2014.
A mere $10 gets you in - and that includes lunch!
Doors at 7:30am, starts at 8:15am.
https://www.sqlsaturday.com/347/



Microsoft MTC, 5404 Wisconsin Ave, Chevy Chase, MD, 20815


[MSMQ] Using PowerShell to PEEK at several thousand records

A bit of a detour today - delving into stuff not easily done with T-SQL.

I needed to look in some MSMQ queues, without removing the messages, pulling out some details.  After multiple attempts (and a couple questions on StackOverflow) I got exactly what I was looking for. 

Basically:
  • PEEK 500 messages off a queue (PEEK leaves the messages in the queue, unlike a normal receive - for you T-SQL types, it's the difference between SELECT and RECEIVE in Service Broker)
  • Pull out the (binary) BodyStream
  • Convert the BodyStream to ASCII
  • Strip off the non-XML header
  • Get the ID from the message (pulling it from inside the XML and from the pre-XML header are both shown below)
  • Return it all to a grid.
Not elegant, but it works pretty well (and pretty fast for my limited set - 10 seconds for 500 messages).
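For the T-SQL types, here's that Service Broker analogy as a quick sketch (the queue name dbo.MyTargetQueue and the TOP counts are hypothetical, just for illustration):

-- SELECT reads messages but leaves them in the queue (like MSMQ PEEK)
SELECT TOP (10) CAST(message_body AS xml) AS body
FROM dbo.MyTargetQueue WITH (NOLOCK);

-- RECEIVE reads messages AND removes them (like a normal MSMQ receive)
RECEIVE TOP (10) CAST(message_body AS xml) AS body
FROM dbo.MyTargetQueue;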

 Here's what my header looks like, FWIW, since it drives several parts of the code (the split around the colon, for instance)
F:01.02.03:1234567890:Message:clear
#MSMQ's BodyStream is a stream of bytes, so this is our converter.
$enc = [System.Text.Encoding]::ASCII

#Get-MsmqQueue finds that one queue (it accepts wildcards).
$msmq = Get-MsmqQueue -Name "my_msmq"

#Now that we have the queue as an object, PEEK at the first 500 messages.
#Give up after 5 seconds (since it will hang or crash if there are none).
#We assign the end results to a variable so that we can return them in one set.
#If writing to a file, putting the write inside the loop might speed things up, and would definitely save memory.
$var = Receive-MsmqQueue -InputObject $msmq -Peek -Count 500 -Timeout 5000 -RetrieveBody | foreach {

    #BodyStream is a stream of bytes that we need to convert to an array, then a string.
    #Was originally two steps, but in this case I can combine them:
    # $sr = $_.BodyStream.ToArray()
    # $pr = $enc.GetString($sr)
    $pr = $enc.GetString($_.BodyStream.ToArray())

    #$clean is an XML version of the object, minus the crappy pre-XML header.
    #MANY thanks to Matt & Swonkie for the speed boost.
    #http://stackoverflow.com/questions/26812831/faster-way-to-delete-up-to-the-4th-colon-in-a-variablesemicolon
    [xml]$clean = $pr.Substring($pr.IndexOf('<'))

    #Parsing out the ID - the Select-Xml below works and takes about the same time,
    #but this shows another option via the pre-XML header.
    #This would work too: [string]$myid = Select-Xml -Xml $clean -XPath "/msmg/Data/myid"
    $myid = $pr.Split(":",5)[2]

    #The final select is optional.
    Select-Xml -Xml $clean -XPath "/msmg/Data" | foreach {$_.Node} |
        select @{Name="myid"; Expression={$myid}}, fielda, fieldb, fieldc, fieldd, fielde
}

$var | Out-GridView

Friday, November 21, 2014

[SQLSaturday] 2 more coming in Dec/Jan (DC, Austin)

Yup, there are two more SQL Saturdays coming up with me in it! 

"More of me to see" - 2 different SQL Saturdays, and... um.... I'm not able to ride my bike as often.  But the chocolate is still there anyhow.

SQL Saturday 347, December 6 2014: Washington D.C.

SQL Saturday 362, January 31 2015: Austin, TX

Hope to see y'all there!

Thursday, October 2, 2014

[Extended Events] converting results of rpc_completed , originally VARBINARY, into XML

Maybe there's a better way to do it.  After futzing with various combinations for an hour, I gave up.

What I'm trying to do:
we have a stored procedure called from a custom app.  The app pulls a message off a Queue, then runs the stored procedure using the contents of the message (which is XML).

When I have one message, it's really easy:

declare @msg varbinary(max)
set @msg = 0x300000000134985709sdg8fs0d9f8fa0s7df90 --(which would be valid hex in the real world)
select convert(xml, @msg)

However, I needed to do that for several hundred rows.

So, the conversion from the native format to varbinary(max) was failing - if you CAST or CONVERT it, the value actually seems to CHANGE... which means the field is no longer proper XML, and so the CAST to XML fails.

And thus.... the fix is..... Dynamic SQL (yeah, I cringed too)


[...code to extract the varbinary string removed - multiple CTEs]
create table finished_trace (data_value xml)

DECLARE @min INT, @max INT, @sql NVARCHAR(MAX)
SELECT @min = MIN(event_number), @max = MAX(event_number) FROM #t3
WHILE @min <= @max
BEGIN
    SET @sql = NULL
    --data_value here is the hex string ('0x...') extracted by the CTEs above
    SELECT @sql = 'declare @msg varbinary(max)
    set @msg = ' + data_value + '
    select convert(xml, @msg)' FROM #t3 WHERE event_number = @min
    INSERT INTO finished_trace (data_value)
    EXEC (@sql)
    SET @min = @min + 1
END
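For what it's worth, there may be a way to skip the dynamic SQL entirely on SQL Server 2008 and up: CONVERT to varbinary(max) with style 1 interprets a '0x...' string as the literal hex value instead of converting the character bytes.  A sketch, assuming #t3 holds the hex string in data_value (untested against my original data, so caveat emptor):

--Style 1 treats a string with a '0x' prefix as the hex value itself,
--rather than converting the character bytes (which is what mangled the XML above).
INSERT INTO finished_trace (data_value)
SELECT CONVERT(xml, CONVERT(varbinary(max), data_value, 1))
FROM #t3;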

Wednesday, September 17, 2014

[Event Notifications] my current slide deck

Thought I'd share this - the entire folder I run off of is shared, but here's the slide deck, embedded.  It should get new changes as I make them. 


Tuesday, September 16, 2014

[Event Notifications] adding a new box to your environment

Short and sweet - I just realized I don't already have this on here.  It makes several assumptions about my environment, and you can modify it as needed.  Yes, it's ugly.  Yes, there are security holes due to xp_cmdshell and the like.  You've been warned.

Practically: you run this on the box that already has the repository.  You tell it the name of the new server.  It connects to that server, makes sure each side can reach the other, then enables Event Notifications on that new server.  It ends by making a change, waiting several seconds, then selecting from your repository table.  If it shows, you're golden.  If not, time to roll up sleeves and figure it out.


--reminder, this runs on the central monitorING box, and uses xp_cmdshell and the like to connect to the "new" box.
/*
Event Notification - add server to monitoring. Probably simpler to do by hand, but I like this.
See http://thebakingdba.blogspot.com/ for more information, look for Event Notifications
Given a parameter of servername, WHEN RUN ON THE EN "repository", it will
a) try to determine the servername for the "sender"
b) get the service_broker_guid from that server for MSDB
c) get the service_broker_guid from the local host
d) create a return route on the repository for that host
e) on the far side, build the endpoint, route, and event notification, and send a test record
f) verify we are receiving that message
Please read Version Info for potential issues.
2012/08/28 mdb - version 1.0
2012/11/15 mdb - 1.1, deals with machines that already have service broker by using the existing TCP endpoint; more comments
2013/01/22 mdb - 1.11 fix for 1.1; only SB machines would have worked.
2013/02/12 mdb - 1.2 adding "test" vs "real" flag to actually deploy, or just tell you what to do; adding authentication perms
Permissions code from Arshad Ali @ http://www.databasejournal.com/features/mssql/understanding-sql-server-service-broker-authentication.html
Service code name from Namwar Rizvi http://sqltips.wordpress.com/2007/08/06/retrieve-sql-server-service-account-name-from-registry-through-tsql/
Please note BOTH MACHINES MUST BE IN THE SAME DOMAIN or there must be trust between the domains, otherwise you can't send messages.
Also assumes that the service account IS A DOMAIN ACCOUNT.
2013/02/27 mdb - 1.21 more perms work; if not the same account, then create necessary Login and Grant Connect. Assumes
2013/03/20 mdb - 1.22 fixed propertyvalue code, which showed that some events weren't being added properly.
2013/03/21 mdb - 1.25 lots of cleanups. Ping test from Watched Server, checking for Local Auth, etc.
2013/03/26 mdb - 1.26 added "UPDATE" to fix any problems, found bug with version info. Fixed.
2014/04/01 mdb - 1.27 added code to deal with AO & fix bug; these days there can be more than one tcp endpoint.
2014/07/01 mdb - 1.28 adding code
2014/09/11 mdb - 1.29 adding new group for SQL Server 2014.
2015/03/30 mdb - 1.30 removing the new group, since it's actually already included under security, so it throws a "dupe" error.
--note that because we're running XP_cmdshell to ping far side servers, connections must actually work. Trusted for delegation and kerberos and the like
*/
/*
Troubleshooting Service Broker Conversations
http://rusanu.com/2005/12/20/troubleshooting-dialogs/
http://rusanu.com/2007/11/28/troubleshooting-dialogs-the-sequel/ (ssbdiagnose)
http://blog.maskalik.com/sql-server-service-broker/troubleshooting-external-activation
*/
--run this code on the server that is your EN repository
use eventnotificationrec
go
declare @servername sysname,
@distinct_servername sysname,
@hostname sysname,
@sql nvarchar(2000),
@cmd nvarchar(2000),
@local_guid varchar(36),
@new_source_GUID varchar(36),
@port varchar(6),
@productversion VARCHAR(20),
@existing_port varchar(6),
@en_repository_server varchar(20) --name of the server that will receive all the data
, @deploy VARCHAR(10)
, @do_permissions BIT
, @repository_service_name VARCHAR(100)
, @watched_server_service_name VARCHAR(100)
if object_id('tempdb..#get_new_server_info') is not null
DROP TABLE #get_new_server_info
CREATE TABLE #get_new_server_info (id INT IDENTITY, [OUTPUT] varchar(255))
SET @en_repository_server = @@SERVERNAME--'WIN-NDTAUPQET7L'
set @servername = 'newservername' --the server you're adding
SET @deploy = 'update' --REAL to actually run all the commands; otherwise it just prints out statements.
--UPDATE to update the rules on boxes, using the current strings. Does NOT change Endpoints, etc.
--FAKE otherwise
SET @do_permissions = 1 --1 to do them, 0 to skip. Only needed if service accounts between Repository & Watched Server are different. Code will grab logins for both, and compare.
--default is 0 because INSTANCES DON'T CURRENTLY WORK. If no instances, 1 is fine. Will fix, but need this now. Probably use a WMI call to get all the instance names.
-- either way, we check the default instance's name, because we don't want to put it on a server with local authentication
--------------
--PING TEST - make sure the watched server can connect to the repository. For one-way routing issues.
--REQUIRES XP_CMDSHELL be enabled on the watched server; need to add code to turn it on/off as needed
--------------
SET @sql = 'xp_cmdshell ''ping ' + @en_repository_server + ''''
SET @cmd = 'sqlcmd -Q "' + @sql + '" -E -d master -S ' + @servername
PRINT @cmd
INSERT INTO #get_new_server_info
EXEC xp_cmdshell @cmd
IF (SELECT SUM (CHARINDEX('(0% loss)',[OUTPUT])) FROM #get_new_server_info) >0
BEGIN
PRINT 'ping test successful - can reach server'
END
ELSE
BEGIN
PRINT 'ping test failed - cannot connect! (xp_cmdshell off? login failed?) Setting @deploy to FAKE so it doesn''t run commands'
IF @deploy = 'real'
BEGIN
SET @deploy = 'fake'
END
SELECT * FROM #get_new_server_info
END
TRUNCATE TABLE #get_new_server_info
--Retrieve the Service accounts from registry. The person running the script must have Windows admin rights on the box(?)
IF @deploy = 'real' --changed from checking do_permissions, since we need this to make sure it will succeed
BEGIN
--get local SQL service's name; needed so we know whether we're adding perms
EXECUTE master.dbo.xp_instance_regread
N'HKEY_LOCAL_MACHINE',
N'SYSTEM\CurrentControlSet\Services\MSSQLSERVER',
N'ObjectName',
@repository_service_name OUTPUT,
N'no_output'
--get the service name for the Watched server; if different network, probably won't work.
-- if same domain, make sure it's given perms.
select @cmd = 'sqlcmd -Q "declare @repository_service_name varchar(100); EXECUTE master.dbo.xp_instance_regread N''HKEY_LOCAL_MACHINE'', N''SYSTEM\CurrentControlSet\Services\MSSQLSERVER'', N''ObjectName'', @repository_service_name OUTPUT, N''no_output''; select @repository_service_name" -E -S ' + @servername
INSERT INTO #get_new_server_info
EXEC xp_cmdshell @cmd
SELECT @watched_server_service_name = output FROM #get_new_server_info
WHERE OUTPUT NOT LIKE '(%'
AND OUTPUT NOT LIKE '--%'
AND OUTPUT IS NOT NULL
AND (LTRIM(RTRIM(output))) <> ''
TRUNCATE TABLE #get_new_server_info
END
--come up with a port for the server; use 4022 by default, 4023 if it's an instance.
if charindex('\',@servername) = 0
begin
select @port='4022'
end
else
begin
select @port = '4023'
end --need code for more than 1 named instance on a server
--we need three different names - the service name, a unique name for the route, and the hostname for tcpip
select @distinct_servername = replace(@servername,'\','__'),
@hostname = case charindex('\',@servername)
when 0 then @servername
when null then @servername
else left(@servername,charindex('\',@servername)-1)
end
--get our Service Broker's GUID so we can add it to their route.
select @local_guid = service_broker_guid from sys.databases where name = 'EventNotificationRec'
--trying to get the GUID for the far-side server. 3rd row should be it, but since it's unordered we're doing it this way
select @cmd = 'sqlcmd -Q "SELECT service_broker_guid FROM sys.databases WHERE name = ''msdb''" -E -S ' + @servername
INSERT INTO #get_new_server_info
EXEC xp_cmdshell @cmd
SELECT @new_source_GUID = [output] FROM #get_new_server_info where [output] like '%-%-%-%-%' and [output] not like '%--%'
--getting the productversion, so we can add/remove EN groups as needed
TRUNCATE TABLE #get_new_server_info
select @cmd = 'sqlcmd -Q "SELECT SERVERPROPERTY(''productversion'')" -E -S ' + @servername
INSERT INTO #get_new_server_info
EXEC xp_cmdshell @cmd
SELECT @productversion = [OUTPUT] FROM #get_new_server_info WHERE ISNUMERIC(LEFT(output,3))= 1
--last chance to set to fake before we start adding things.
IF @watched_server_service_name IN ('NT AUTHORITY\NetworkService', 'LocalSystem')
BEGIN
PRINT 'Warning - Watched Service is using local authentication.
Unless you have granted public access, this will not work!
setting @deploy = ''fake'''
SET @deploy = 'fake'
END
---------------------------------------------------------
--Find port if TCP endpoint already exists on machine. --
-- If it does, use it (there can be only one). --
-- For machines with cross-server SB already in use. --
-- If not, use the assigned @port --
---------------------------------------------------------
select @cmd = 'sqlcmd -Q "set nocount on;SELECT port FROM master.sys.tcp_endpoints where port>0 and type_desc = ''SERVICE_BROKER''" -E -S ' + @servername
TRUNCATE TABLE #get_new_server_info
INSERT INTO #get_new_server_info
EXEC xp_cmdshell @cmd
SELECT @existing_port = LTRIM(RTRIM([OUTPUT])) FROM #get_new_server_info
WHERE [OUTPUT] NOT IN ('port ','-----------')
AND [OUTPUT] IS NOT NULL
DROP TABLE #get_new_server_info
--create route on the Repository server first. If new server already has an endpoint, use that, otherwise use the @port
IF NOT EXISTS(SELECT * FROM sys.routes WHERE name = 'ENAudit_ReturnRoute_' + @distinct_servername)
BEGIN
select @sql = 'CREATE ROUTE [ENAudit_ReturnRoute_' + @distinct_servername + ']
WITH SERVICE_NAME = ''http://schemas.microsoft.com/SQL/Notifications/EventNotificationService'',
BROKER_INSTANCE = ''' + @new_source_guid + ''',
ADDRESS = ''TCP://' + @hostname + ':' + COALESCE(@existing_port,@port) + ''';'
PRINT @sql
IF @deploy ='REAL'
BEGIN
exec sp_executesql @sql --this one runs locally, so we use sp_executesql; WORKS and has been done
END
END
----------------------------------
--Setup remote "sender" endpoint--
--uses SQLCMD --
----------------------------------
--mdb 2013/01/22 adding ISNULL for @existing_port; prevented NON-service-broker DBs from getting an endpoint. Rookie mistake.
--mdb 2013/02/05 still not working right with existing SB servers
--IF @port <> ISNULL(@existing_port,0)
IF ISNULL(@existing_port,0) = 0 --if there is no existing SB
BEGIN
select @sql = null, @cmd = null
select @sql = 'CREATE ENDPOINT ENAudit_Endpoint STATE = STARTED '
+ 'AS TCP (LISTENER_PORT = ' + @port + ') '
+ 'FOR SERVICE_BROKER (AUTHENTICATION = WINDOWS,ENCRYPTION = SUPPORTED)'
select @cmd = 'sqlcmd -Q "' + @sql + '" -E -d master -S ' + @servername
PRINT @cmd
IF @deploy ='REAL'
BEGIN
exec xp_cmdshell @cmd
END
--------------------------------
-- Grant login and connect to the foreign endpoint, so they can converse.
-- Code assumes you're using domain accounts, not machine accounts.
-- In order to avoid certificates, both must be same domain or domains must be trusted by each other.
-- If your machines use the same service account, it will work without this.
-- If the service names are not the same between the Watched & Repository server,
-- the Repository needs connect perms to the Watched server & endpoint,
-- and the Watched service's login needs connect perms to the Repository.
-- More info at http://msdn.microsoft.com/en-us/library/ms166065(v=sql.90).aspx
--------------------------------
IF @do_permissions = 1 AND @repository_service_name <> @watched_server_service_name
AND @repository_service_name IS NOT NULL AND @watched_server_service_name IS NOT null
AND CHARINDEX('/', @repository_service_name)>0 AND CHARINDEX('/', @watched_server_service_name)>0 --make sure they are domain accounts
AND @watched_server_service_name <> 'NT AUTHORITY\NetworkService'
BEGIN
PRINT 'updating/adding perms'
select @sql = null, @cmd = null
--first do the far side
select @sql = 'create login [' + @repository_service_name + '] from WINDOWS;'
select @cmd = 'sqlcmd -Q "' + @sql + '" -E -d master -S ' + @servername
PRINT @cmd
IF @deploy ='REAL'
BEGIN
exec xp_cmdshell @cmd
END
select @sql = 'grant connect on endpoint::ENAudit_Endpoint to [' + @repository_service_name + ']'
select @cmd = 'sqlcmd -Q "' + @sql + '" -E -d master -S ' + @servername
PRINT @cmd
IF @deploy ='REAL'
BEGIN
exec xp_cmdshell @cmd
END
--Now the Repository
select @sql = 'create login [' + @watched_server_service_name + '] from WINDOWS;'
PRINT @sql
IF @deploy ='REAL'
BEGIN
exec sp_executesql @sql --this one runs locally, so we use sp_executesql; WORKS and has been done
END
select @sql = 'grant connect on endpoint::ENAudit_Endpoint to [' + @watched_server_service_name + ']'
PRINT @sql
IF @deploy ='REAL'
BEGIN
exec sp_executesql @sql --this one runs locally, so we use sp_executesql; WORKS and has been done
END
END
END
-----------------------------------
-- Set up the Route in MSDB, as --
-- it has SB enabled by default --
-----------------------------------
select @sql = null, @cmd = null
select @sql = 'CREATE ROUTE [ENAudit_DestinationRoute]'
+' WITH SERVICE_NAME = ''ENAudit_Service'','
+' BROKER_INSTANCE = ''' + @local_guid + ''' , '
+' ADDRESS = ''TCP://' + @en_repository_server + ':4022'';' --that's the listener port on the _Repository_ box
select @cmd = 'sqlcmd -Q "' + @sql + '" -E -d msdb -S ' + @servername
PRINT @cmd
IF @deploy ='REAL'
BEGIN
exec xp_cmdshell @cmd --this one runs against the remote box via sqlcmd; WORKS and has been done
END
-----------------------------
-- Now create the actual Event Notification.
-- Theoretically you can create this using the system tables, but it's different
-- between versions of SQL and so this is easier and only has to be updated each major SQL release.
-----------------------------
select @sql = null, @cmd = null
--if we're using UPDATE, drop the old one so we can add the new one.
select @sql = 'DROP EVENT NOTIFICATION [ENAudit_ServerDDLEvents] ON SERVER'
select @cmd = 'sqlcmd -Q "' + @sql + '" -E -d msdb -S ' + @servername
PRINT @cmd
IF @deploy ='Update'
BEGIN
exec xp_cmdshell @cmd --this one runs against the remote box via sqlcmd; WORKS and has been done
END
select @sql = null, @cmd = null
--You could just do DDL_SERVER_LEVEL_EVENTS and ALTER_SERVER_CONFIGURATION, but that includes stats.
--full list of EN groups: http://msdn.microsoft.com/en-us/library/ms180824(v=sql.90).aspx
--can also add a variety of traces: see http://msdn.microsoft.com/en-us/library/ms187476(v=sql.105).aspx
select @sql = 'CREATE EVENT NOTIFICATION [ENAudit_ServerDDLEvents] '
+'ON SERVER '
--If you want _everything_ (including statistics!), just use the below line. Simpler to type, but more impact.
-- You could filter them via the enaudit_exclusion_list, but it's still more impact on the initiator;
-- doubly so if you add the TRC events since they include stats as well.
--+ 'FOR DDL_EVENTS, ALTER_SERVER_CONFIGURATION '
-- Note that sp_rename doesn't fire EN!
--using the chart http://msdn.microsoft.com/en-us/library/ms180824(v=sql.90).aspx which is 2005,
-- but there's a query to get them all (note it changes each edition): http://msdn.microsoft.com/en-us/library/bb510452(v=sql.105).aspx
/*
--for a to-be-written dynamic creation script. Current version works on 2005/2008/r2/2012.
--Take this query, run it against the sender in order to get all the server-level events.
-- Exclude DDL_TABLE_VIEW_EVENTS, then add all the subcomponents for DDL_TABLE_VIEW_EVENTS except STATISTICS
WITH DirectReports(name, parent_type, type, level, sort, type_name) AS
(
SELECT CONVERT(varchar(255),type_name), parent_type, type, 1, CONVERT(varchar(255),type_name), type_name
FROM sys.trigger_event_types
WHERE parent_type IS NULL
UNION ALL
SELECT CONVERT(varchar(255), REPLICATE ('| ' , level) + e.type_name),
e.parent_type, e.type, level + 1,
CONVERT (varchar(255), RTRIM(sort) + '| ' + e.type_name),
e.type_name
FROM sys.trigger_event_types AS e
INNER JOIN DirectReports AS d
ON e.parent_type = d.type
)
SELECT parent_type, level, type, name, type_name
FROM DirectReports
ORDER BY sort;
*/
--Server-level events, including config; SQL Server 2005 and above
+'FOR DDL_SERVER_LEVEL_EVENTS,'
--DDL_TABLE_VIEW_EVENTS, minus Statistics which are constantly updated and constantly created, even on temp objects created by 3rd party tools
+'DDL_TABLE_EVENTS,DDL_VIEW_EVENTS,DDL_INDEX_EVENTS,'
--and the rest of the DDL_DATABASE_LEVEL_EVENTS
+'DDL_SYNONYM_EVENTS,DDL_FUNCTION_EVENTS,DDL_PROCEDURE_EVENTS,'
+'DDL_TRIGGER_EVENTS,DDL_EVENT_NOTIFICATION_EVENTS,DDL_ASSEMBLY_EVENTS,'
+'DDL_TYPE_EVENTS,DDL_DATABASE_SECURITY_EVENTS,DDL_SSB_EVENTS,'
+'DDL_XML_SCHEMA_COLLECTION_EVENTS,DDL_PARTITION_EVENTS'
--and the events added in SQL Server 2008
IF @productversion NOT LIKE '9.%'
SET @sql = @sql +
+',DDL_PLAN_GUIDE_EVENTS,DDL_DEFAULT_EVENTS,DDL_EXTENDED_PROPERTY_EVENTS'
+',DDL_FULLTEXT_CATALOG_EVENTS,DDL_FULLTEXT_STOPLIST_EVENTS'
+',DDL_RULE_EVENTS,RENAME'
--dupes because they are sub-events of ddl_server_level_events
--+',ALTER_INSTANCE,DDL_DATABASE_EVENTS'
--+',DDL_ENDPOINT_EVENTS,DDL_EVENT_SESSION_EVENTS,DDL_EXTENDED_PROCEDURE_EVENTS'
--+',DDL_LINKED_SERVER_EVENTS,DDL_MESSAGE_EVENTS,DDL_REMOTE_SERVER_EVENTS'
--+',DDL_RESOURCE_GOVERNOR_EVENTS,DDL_SERVER_SECURITY_EVENTS'
--and the 1 event added in SQL Server 2008 R2; using NOT LIKE to auto-include future versions
IF @productversion NOT LIKE '9.%' AND @productversion NOT LIKE '10.0%'
SET @sql = @sql + ',ALTER_SERVER_CONFIGURATION'
--and the events added in SQL Server 2012; using NOT LIKE to auto-include future versions
IF @productversion NOT LIKE '9.%' AND @productversion NOT LIKE '10%'
SET @sql = @sql + ',DDL_SEARCH_PROPERTY_LIST_EVENTS,DDL_SEQUENCE_EVENTS'
--and now 2014! Only one this time, DDL_DATABASE_AUDIT_EVENTS (oh, the irony); using NOT LIKE to auto-include future versions
--not needed - database audit events fall under the security group, which is already added.
--IF @productversion NOT LIKE '9.%' AND @productversion NOT LIKE '10%' AND @productversion NOT LIKE '11%'
--SET @sql = @sql + ',DDL_DATABASE_AUDIT_EVENTS '
--trace events - TRC_OBJECTS is just alter/create/drop, added since they have HOSTNAME
SET @sql = @sql
+ ',TRC_OBJECTS, ERRORLOG' --EVENTLOG ,TRC_DEPRECATION,.... Deprecation at your own risk. Be CAREFUL.
+' TO SERVICE ''ENAudit_Service'', ''' + @local_guid + ''';' --central dbid for EventNotification_Receiver
select @cmd = 'sqlcmd -Q "' + @sql + '" -E -d msdb -S ' + @servername
PRINT @cmd
IF @deploy IN ('REAL','UPDATE')
BEGIN
exec xp_cmdshell @cmd --this one runs against the remote box via sqlcmd; WORKS and has been done
END
WAITFOR DELAY '00:00:05'
---------------------------------
--create and drop test objects --
-- and ensure it's working. --
---------------------------------
select @sql = 'create table tempdb.dbo.test_for_audit (field_a int)'
select @cmd = 'sqlcmd -Q "' + @sql + '" -E -S ' + @servername
PRINT @cmd
IF @deploy IN ('REAL','UPDATE')
BEGIN
exec xp_cmdshell @cmd --this one runs against the remote box via sqlcmd; WORKS and has been done
END
select @sql = 'drop table tempdb.dbo.test_for_audit'
select @cmd = 'sqlcmd -Q "' + @sql + '" -E -S ' + @servername
IF @deploy IN ('REAL','UPDATE')
BEGIN
exec xp_cmdshell @cmd --this one runs against the remote box via sqlcmd; WORKS and has been done
END
ELSE
BEGIN
PRINT @cmd
END
IF @deploy IN ('REAL','UPDATE')
BEGIN
WAITFOR DELAY '00:00:10'
select top 100 * from ENAudit_Events WHERE PostTime > GETDATE()-.01 AND ServerName = @servername
END
/*
--sqlcmd -Q "CREATE EVENT NOTIFICATION [ENAudit_ServerDDLEvents] ON SERVER FOR DDL_SERVER_LEVEL_EVENTS,DDL_TABLE_EVENTS,DDL_VIEW_EVENTS,DDL_INDEX_EVENTS,DDL_SYNONYM_EVENTS,DDL_FUNCTION_EVENTS,DDL_PROCEDURE_EVENTS,DDL_TRIGGER_EVENTS,DDL_EVENT_NOTIFICATION_EVENTS,DDL_ASSEMBLY_EVENTS,DDL_TYPE_EVENTS,DDL_DATABASE_SECURITY_EVENTS,DDL_SSB_EVENTS,DDL_XML_SCHEMA_COLLECTION_EVENTS,DDL_PARTITION_EVENTS,TRC_OBJECTS, ERRORLOG TO SERVICE 'ENAudit_Service', '5E885B6D-D24D-4FEB-8E64-0F8D1DE16905';" -E -d msdb -S myservera
--sqlcmd -Q "CREATE EVENT NOTIFICATION [ENAudit_ServerDDLEvents] ON SERVER FOR DDL_SERVER_LEVEL_EVENTS,DDL_TABLE_EVENTS,DDL_VIEW_EVENTS,DDL_INDEX_EVENTS,DDL_SYNONYM_EVENTS,DDL_FUNCTION_EVENTS,DDL_PROCEDURE_EVENTS,DDL_TRIGGER_EVENTS,DDL_EVENT_NOTIFICATION_EVENTS,DDL_ASSEMBLY_EVENTS,DDL_TYPE_EVENTS,DDL_DATABASE_SECURITY_EVENTS,DDL_SSB_EVENTS,DDL_XML_SCHEMA_COLLECTION_EVENTS,DDL_PARTITION_EVENTS,TRC_OBJECTS, ERRORLOG TO SERVICE 'ENAudit_Service', '5E885B6D-D24D-4FEB-8E64-0F8D1DE16905';" -E -d msdb -S myserverb
THEY MATCH. So we can issue the same command and sync the events of each server.
sqlcmd -Q "drop EVENT NOTIFICATION [ENAudit_ServerDDLEvents] ON SERVER" -E -d msdb -S myserverb
sqlcmd -Q "CREATE EVENT NOTIFICATION [ENAudit_ServerDDLEvents] ON SERVER FOR DDL_SERVER_LEVEL_EVENTS,DDL_TABLE_EVENTS,DDL_VIEW_EVENTS,DDL_INDEX_EVENTS,DDL_SYNONYM_EVENTS,DDL_FUNCTION_EVENTS,DDL_PROCEDURE_EVENTS,DDL_TRIGGER_EVENTS,DDL_EVENT_NOTIFICATION_EVENTS,DDL_ASSEMBLY_EVENTS,DDL_TYPE_EVENTS,DDL_DATABASE_SECURITY_EVENTS,DDL_SSB_EVENTS,DDL_XML_SCHEMA_COLLECTION_EVENTS,DDL_PARTITION_EVENTS,TRC_OBJECTS, ERRORLOG TO SERVICE 'ENAudit_Service', '5E885B6D-D24D-4FEB-8E64-0F8D1DE16905';" -E -d msdb -S myserverb
backup database model to disk ='nul' --testing errorlog
sqlcmd -Q "SELECT service_broker_guid FROM sys.databases WHERE name = 'msdb'" -E -S VM2
8B85664B-24AB-4B5A-ABAA-0AE5301D4E2D
CREATE ROUTE [ENAudit_ReturnRoute_VM2]
WITH SERVICE_NAME = 'http://schemas.microsoft.com/SQL/Notifications/EventNotificationService',
BROKER_INSTANCE = '8B85664B-24AB-4B5A-ABAA-0AE5301D4E2D',
ADDRESS = 'TCP://VM2:4022';
sqlcmd -Q "CREATE ENDPOINT ENAudit_Endpoint STATE = STARTED AS TCP (LISTENER_PORT = 4022) FOR SERVICE_BROKER (AUTHENTICATION = WINDOWS,ENCRYPTION = SUPPORTED)" -E -d master -S VM2
sqlcmd -Q "CREATE ROUTE [ENAudit_DestinationRoute] WITH SERVICE_NAME = 'ENAudit_Service', BROKER_INSTANCE = '9CF39735-7E60-490F-92B6-26EBFEC19F5E' , ADDRESS = 'TCP://WIN-NDTAUPQET7L:4022';" -E -d msdb -S VM2
sqlcmd -Q "CREATE EVENT NOTIFICATION [ENAudit_ServerDDLEvents] ON SERVER FOR DDL_SERVER_LEVEL_EVENTS,DDL_TABLE_EVENTS,DDL_VIEW_EVENTS,DDL_INDEX_EVENTS,DDL_SYNONYM_EVENTS,DDL_FUNCTION_EVENTS,DDL_PROCEDURE_EVENTS,DDL_TRIGGER_EVENTS,DDL_EVENT_NOTIFICATION_EVENTS,DDL_ASSEMBLY_EVENTS,DDL_TYPE_EVENTS,DDL_DATABASE_SECURITY_EVENTS,DDL_SSB_EVENTS,DDL_XML_SCHEMA_COLLECTION_EVENTS,DDL_PARTITION_EVENTS,TRC_OBJECTS TO SERVICE 'ENAudit_Service', '9CF39735-7E60-490F-92B6-26EBFEC19F5E';" -E -d msdb -S VM2
-------------demo cleanup------------
--on vm2:
use master
go
drop event notification [ENAudit_ServerDDLEvents] ON SERVER
go
use msdb
go
drop route [ENAudit_DestinationRoute]
go
drop endpoint ENAudit_Endpoint
-----------------------------------
------ON LOCAL
use EventNotificationRec
drop route [ENAudit_ReturnRoute_VM2]
*/





Monday, September 15, 2014

[Event Notifications] Tackling Database Lifecycle Management and Database Drift (via Redgate)

First, I'd like to thank Bill Fellows, Megan Longoria, and the rest of the Kansas City crew for an amazing weekend at SQLSaturday #300, aka #SQLSatKC.  We hit 2 BBQ places with them, another BBQ place and Boulevard Brewing with some other attendees, then went to an amazing baseball game.  Saturday was a full day with some awesome training (everything from "What's a DBA" to collecting ultra-detailed memory-consumption information), even more awesome BBQ, and ended at the nearby amusement park. 

My presentation went quite well, aside from all 3(!) VMs deciding to BSOD on me 15 minutes before I went on.  There was a good number of attendees (always a concern for me, personally), I got some good questions, the audience participation & humor I've added landed well, and I had a good conversation about it afterwards.  Overall, a definite win.  Now I need a new presentation so I can come back next year!


Additionally...


Normally I wouldn't just link to someone else's article, but they're more eloquent than I am.  It's a good article on why you should track database changes in production (like I do with SQL Watchdog, aka "Master of All I Survey", aka "Evil Overlord").  Incidentally, I love the term "Database Drift", and I'm glad to see more companies looking into solving this issue.

https://www.simple-talk.com/blogs/2014/09/10/tackling-database-drift-with-sql-lighthouse/#.VBbX7AevYv4.twitter

Tuesday, September 9, 2014

[Event Notifications] Getting auto-TFS checkin set up for TFS 2013 (SQLWatchdog)

We recently upgraded to TFS 2013, so here are the steps (written as I do it) to get it running.

Aside: Since I'm using Powershell to run all this, why do I use the TF.cmd application instead of the Cmdlets?  Simplicity.  The Cmdlets appear to need the FULL version of a TFS client installed, plus you have to grab and install the cmdlets... and there are bugs if you don't use the right (x32 vs x64) version of the cmdlets with the right version of Powershell.  So I built it around the TF CLI (Command-Line Interface).  The original post was for 2008; the new one (plus the script below) is for 2013.

IMPORTANT:
Upgrading from 2008 to 2013 blows, since the config and cache can be screwed up.
What I had to do was completely drop the workfold and the workspace (PROTIP: first, list your workfold and workspaces and SAVE THOSE DETAILS), delete the entirety of the cache folder, delete everything from my EN_Workspace folder on disk, then rebuild from scratch.
One other oddity: before, from the root of my installation (c:\tfs_cli) I could run the script, and it could properly list the folders underneath it.  That appears to have changed for some reason; my Powershell script now CDs into the subfolder before it proceeds.  Not a major change... but took several hours to diagnose, since I was also fighting with other issues on the setup.  Caveat Emptor and all that.

Also for the TFS newbies: the default path you use for a LOT of things is:
http://myservername:8080/tfs/DefaultCollection .

Steps to get going:
  • Install Java.  Got it from Oracle, which doesn't appear to have the same cruftware that Sun's does. Version 7-something
  • Grab Team Foundation Everywhere 2013. Reading the docs, it works for TFS 2010, 2012, 2013, and Online. http://www.microsoft.com/en-us/download/details.aspx?id=40785
  • Unzip into one massive folder (I called mine "c:\TFS_CLI\APP_2013"), then try running (since I'm on Server 2012, in Powershell):
    • ".\tf.cmd" 
    • it works - success!  Sorta.
  • Found this page which seems to make the below setup easier: 
    • http://msdn.microsoft.com/en-us/library/hh873092.aspx
  • For the rest of this, I'm doing it as the Service account that SQL Server is running under on the box that will run the job. Oddly enough, it wasn't set that way when I built it on 2008, but it continued to work even though my password changed.  Note that nowhere in here do you actually put in a password.
  • Type ".\tf eula", then accept ("y"). 
  • muddle through the web site (tfsservername:8080/tfs) and add the SQL Service account as an admin to the branch I'm using (YourENProject).
  • Using the MS page above:
    • .\tf workspace -new TF_Workspace -collection:http://mytfservername:8080/tfs/DefaultCollection
      • the collection URL is pretty standard - it should work in 2010, 2012, and 2013.
    • .\tf workfold -map $/YourENProject -workspace:TF_Workspace c:\TFS_CLI\EN_Workspace
      • This line has several names.  Let's break them out.
      • $/YourENProject is the "Branch" in TFS.  Call it something like $/ProductionEN.  I created the branch using SSDT before I did any of this, but you can create one using TF.exe.
      • TF_Workspace is the workspace you created in the step above.
      • c:\TFS_CLI\EN_Workspace - the physical folder on your EN server (NOT the TFS server).  I named them differently to make it more obvious what each part of the command is doing.
    • now try and check something in.
      • Get to the workspace
        • cd c:\TFS_CLI\EN_Workspace
      • new-item test.txt -type file
      • &cmd /c "c:\tfs_cli\app_2013\tf add test.txt"
      • &cmd /c "c:\tfs_cli\app_2013\tf checkin /author:memyselfandi"
      • I use that weird syntax to catch errors and return them as text into Powershell, so the script can properly eat and pass on errors as necessary, instead of PS choking on the stderr output.
      • Pull and update something: this is where I had massive problems. (use script)
    • Finally verify it worked (I used my "real" TFS client on my desktop).
Now, set up the SQL Script (SP), the Powershell script, and a job that calls the SP.

SP:
USE eventnotificationrec
GO
CREATE PROCEDURE [dbo].[EN_TFS_Checkin]
AS
/*
Purpose - using Event Notifications, determines which objects have changed (sub-database-level, for now).
Once list has been generated, repeatedly call a custom powershell script that will
"get" and "checkout" the object from TFS if it exists, overwrite it with a current copy (scripted via SMO),
then "checkin" the new object with a comment based on the loginname
mdb 2013/10/04 1.00 Works well (albeit a little slowly) and handled multiple days' worth of checkins.
Error items were due to short-term objects that are created then dropped.
mdb 2013/10/11 1.10 Using TFSsecurity to find valid users. If user is valid, use their name to checkin.
mdb 2013/10/22 1.11 Changing INDEX to include table. Makes it easier to read in history.
mdb 2013/11/08 1.12 Let's handle extended properties!
mdb 2013/12/03 1.13 the PS code can now script out database creation. Adding things like ALTER_DATABASE,
and dealing with full-database checkins.
mdb 2013/12/04 1.14 fixing author bug
To be done still:
* Script server-level
* Database-level: perms and ALTER/CREATE_SCHEMA
* Logic for Drop and Renames. How do we want to handle that? (TF.exe RENAME, either to new name or "DROPPED - oldname")
* Better logic to skip events? We would conceivably need more table(s) to hold the exception types.
sample of items not yet done
add_role_member --must be at the database level(?)
ADD_SERVER_ROLE_MEMBER
ALTER_ASSEMBLY
ALTER_CREDENTIAL
*/
SET NOCOUNT ON
DECLARE @cmd VARCHAR(8000), @min INT, @max INT, @EventType VARCHAR(128), @Time_of_last_run datetime, @now DATETIME
, @loginname VARCHAR(128), @loginname_temp VARCHAR(128), @TargetObjectType VARCHAR(128)
--mdb 2013/11/08 we didn't add ObjectType as it would screw up our dupe-elimination, but I need TargetObjectType for ExtendedProperties
DECLARE @change_list TABLE (id INT IDENTITY, ServerName VARCHAR(128), EventType VARCHAR(128), LoginName VARCHAR(128)
, DatabaseName VARCHAR(128), SchemaName VARCHAR(128), ObjectName VARCHAR(128)
, TargetObjectName VARCHAR(128), TargetObjectType VARCHAR(128))
DECLARE @valid_tfs_logins TABLE (id INT IDENTITY, loginname VARCHAR(128))
IF OBJECT_ID('tempdb..#holding_tank_for_errors') IS NOT NULL
DROP TABLE #holding_tank_for_errors
CREATE TABLE #holding_tank_for_errors (id INT IDENTITY, resultant VARCHAR(800))
if object_id('tempdb..#tfssecurity') is not null
DROP TABLE #tfssecurity
CREATE TABLE #tfssecurity (id INT IDENTITY, resultant nVARCHAR(800))
--TFSSecurity. Ensures the user is valid by querying for "Team Foundation Valid Users" (though you may have other groups).
-- If valid, uses their name for checkin. If not, uses a default (set in PS) and adds it to the Comment
INSERT INTO #tfssecurity
(resultant)
EXEC master..xp_cmdshell 'C:\tfs_cli\app_2013\tfssecurity /imx "Team Foundation Valid Users" /collection:http://yourtfserver:8080/tfs/DefaultCollection'
--find only users, parsing into a comparable field.
INSERT INTO @valid_tfs_logins (loginname)
SELECT
SUBSTRING(resultant, CHARINDEX(']',resultant)+2, CASE CHARINDEX('(', resultant) WHEN 0 THEN 200 ELSE CHARINDEX('(', resultant) - CHARINDEX(']',resultant)-3 end)
FROM #tfssecurity
WHERE resultant LIKE '%[[]U]%'
--go back 1 day, the first time you run.
IF (SELECT COUNT(*) FROM eventnotificationrec.dbo.TFS_Last_Time_Run) = 0
BEGIN
INSERT INTO eventnotificationrec.dbo.TFS_Last_Time_Run VALUES (GETDATE()-1)
END
--find the last time we ran, so we're only getting in that range
SET @time_of_last_run = (select MAX(last_time_run)
FROM eventnotificationrec.dbo.TFS_Last_Time_Run)
--located here, as soon as possible before the query to get the list of events. We could change it to use the ID field instead.
SELECT @now = GETDATE()
SELECT @time_of_last_run AS last_run_time, @now AS now_run_time
-- can simplify this further.
-- Multiple changes to the same object don't need multiple checkins. Indexes, for instance, only need 1 checkin of the table.
-- will probably need a case statement since (for indexes, in this case) each would be a different object w/same targetobject
-- as of 1.1, each index change requires a separate checkin.
INSERT INTO @change_list
SELECT ServerName, EventType, LoginName, DatabaseName, SchemaName, ObjectName, TargetObjectName, TargetObjectType
FROM eventnotificationrec.dbo.ENAudit_View
WHERE EventType IN
(
--event types, grouped to match the handler blocks in the loop below; keeps this list easier to read.
'ALTER_FUNCTION','ALTER_PROCEDURE','ALTER_TABLE','ALTER_VIEW','CREATE_FUNCTION',
'CREATE_PARTITION_FUNCTION', 'CREATE_PARTITION_SCHEME', 'CREATE_PROCEDURE', 'CREATE_TABLE', 'CREATE_VIEW'
,'Create_Type', 'Alter_Type'
,'ALTER_TRIGGER', 'ALTER_INDEX', 'ALTER_SPATIAL_INDEX'
,'CREATE_TRIGGER', 'CREATE_INDEX', 'CREATE_SPATIAL_INDEX'
,'ALTER_EXTENDED_PROPERTY','CREATE_EXTENDED_PROPERTY','DROP_EXTENDED_PROPERTY'
,'ALTER_AUTHORIZATION_DATABASE','ALTER_DATABASE'
)
AND NOT (EventType = 'ALTER_INDEX' AND commandtext LIKE '%reorg%') --ignore reorgs
AND NOT (EventType = 'ALTER_INDEX' AND commandtext LIKE '%REBUILD%' AND commandtext NOT LIKE '%fillfactor%')
AND insert_datetime > @time_of_last_run
AND insert_datetime <= @now
GROUP BY ServerName, EventType, LoginName, DatabaseName, SchemaName, ObjectName, TargetObjectName, TargetObjectType
ORDER BY MAX(insert_datetime)
select count(*) AS number_of_items_to_do from @change_list
--now that we have a list to process, invoke the powershell script for each.
--The powershell script does the work; this just tells it what's changed and needs to be done.
SELECT @min = MIN(id), @max = MAX(id) FROM @change_list
WHILE @min <= @max
BEGIN
SET @EventType = NULL
SET @loginname_temp = NULL
SET @loginname = NULL
SET @TargetObjectType = NULL
--Using TFSSecurity block above, change the loginname to be the actual loginname if they're a valid user
SELECT @loginname_temp = loginname FROM @change_list WHERE id = @min
IF EXISTS (SELECT * FROM @valid_tfs_logins WHERE loginname = @loginname_temp)
BEGIN
SET @loginname = ' -author "' + @loginname_temp + '" -comment "'
END
ELSE
BEGIN
SET @loginname = ' -comment "' + ISNULL(@Loginname_Temp,'blank.username') + ' ----- '
END
--clear the error trap - we have 2, one transient and one permanent
TRUNCATE TABLE #holding_tank_for_errors
--easier to select this once; makes below code more readable.
SELECT @EventType = EventType, @TargetObjectType = TargetObjectType FROM @change_list WHERE id = @min
--basic call, comment has loginname & eventtype
IF @EventType IN ('ALTER_FUNCTION','ALTER_PROCEDURE','ALTER_VIEW','CREATE_FUNCTION',
'CREATE_PARTITION_FUNCTION', 'CREATE_PARTITION_SCHEME', 'CREATE_PROCEDURE', 'CREATE_TABLE', 'CREATE_VIEW')
or (@EventType IN ('ALTER_EXTENDED_PROPERTY','CREATE_EXTENDED_PROPERTY','DROP_EXTENDED_PROPERTY') AND @TargetObjectType = '')
BEGIN
SELECT @cmd = 'powershell -file c:\tfs_cli\sql_script_to_tfs.ps1 -server ' + servername
+ ' -database "' + databasename + '" -schematoscript "' + SchemaName + '" -objecttoscript "' + ObjectName + '"'
+ @LoginName + EventType + ' ' + ObjectName
+ CASE
WHEN TargetObjectName = '' THEN ''
WHEN TargetObjectName IS NOT NULL THEN ' on ' + TargetObjectName
ELSE '' END + '"'
FROM @change_List WHERE id = @min
END
--when we want to include the actual change in the comment, to get specifics like ALTER TABLE ADD COLUMN
ELSE
IF @EventType IN ('ALTER_TABLE')
BEGIN
SELECT @cmd = 'powershell -file c:\tfs_cli\sql_script_to_tfs.ps1 -server ' + servername
+ ' -database "' + databasename + '" -schematoscript "' + SchemaName + '" -objecttoscript "' + ObjectName + '"'
+ @LoginName + EventType + ' ' + ObjectName + '"'
FROM @change_List WHERE id = @min
END
--objects without schemas, so searching the SMO to match SchemaName won't find anything
ELSE
IF @EventType IN ('CREATE_TYPE', 'ALTER_TYPE')
BEGIN
SELECT @cmd = 'powershell -file c:\tfs_cli\sql_script_to_tfs.ps1 -server ' + servername
+ ' -database ' + databasename + ' -objecttoscript ' + ObjectName
+ @LoginName + EventType + ' ' + ObjectName + '"'
FROM @change_List WHERE id = @min --when scripted, they have no schema
END
--objects that are actually "part" of another object, and thus use the TargetObject to determine what to script out.
ELSE
IF @EventType IN ('ALTER_TRIGGER', 'ALTER_INDEX', 'ALTER_SPATIAL_INDEX', 'CREATE_TRIGGER', 'CREATE_INDEX', 'CREATE_SPATIAL_INDEX')
or (@EventType IN ('ALTER_EXTENDED_PROPERTY','CREATE_EXTENDED_PROPERTY','DROP_EXTENDED_PROPERTY') AND @TargetObjectType <> '')
BEGIN
SELECT @cmd = 'powershell -file c:\tfs_cli\sql_script_to_tfs.ps1 -server ' + servername
+ ' -database "' + databasename + '" -schematoscript "' + SchemaName + '" -objecttoscript "' + TargetObjectName + '"'
+ @Loginname + EventType + ' ' + ISNULL(ObjectName,'') + '.' + TargetObjectName + '"'
FROM @change_List WHERE id = @min
END
--database-level objects, such as CREATE or changes to a database
ELSE
IF @EventType IN ('ALTER_AUTHORIZATION_DATABASE','ALTER_DATABASE','CREATE_DATABASE')
BEGIN
SELECT @cmd = 'powershell -file c:\tfs_cli\sql_script_to_tfs.ps1 -server ' + servername
+ ' -database "' + databasename + '"'
+ @LoginName + EventType + ' ' + databasename + '"'
FROM @change_List WHERE id = @min
END
--run the powershell command, saving errors to a temp table
RAISERROR (@cmd, 0, 1) WITH NOWAIT
INSERT INTO #holding_tank_for_errors
EXEC master..xp_cmdshell @cmd
--if any errors during execution, save to a permanent table to look at later.
IF EXISTS
(
SELECT *
FROM #holding_tank_for_errors
WHERE resultant LIKE '%sql_script_to_tfs.ps1%'
)
BEGIN
INSERT INTO eventnotificationrec.dbo.TFS_Checkin_Errors (time_job_run, original_id, resultant)
SELECT GETDATE(), id, resultant
FROM #holding_tank_for_errors ORDER BY id
END
SET @min = @min + 1
END
--and now that we're successful, add a record so we know when we ran.
INSERT INTO eventnotificationrec.dbo.TFS_Last_Time_Run (last_time_run) VALUES (@now)
GO





 Powershell:
<#
.SYNOPSIS
Will script an object from SQL Server and CHECKIN/ADD to TFS.
.EXAMPLE
sql_script_to_TFS.ps1 -server yourservername -Database yourdatabasname -ScriptType "FULL" -Author yourTFSname -Comment "full checkin of database"
#>
# Purpose - given parameters, script out an object from a SQL Server, using SMO, then check it into TFS.
#Scripting code gleefully copied from Phil Factor - the great parts that work are his, the errors are mine
# https://www.simple-talk.com/sql/database-administration/automated-script-generation-with-powershell-and-smo/
#Other code from others copied and attributed below.
#mdb 2013/12/03 adding ScriptType so that we can add "create database"-specific changes ONLY.
# To do FULLs you will have to have -ScriptType = 'Full' parameter. By default it will do objects.
#mdb 2013/12/05 trying to better bullet-proof the app against TF failures, which appear to occur randomly
#mdb 2013/12/07 works great, though occasionally it can't script out an item. Trying to bulletproof that
#mdb 2013/12/16 broke the individual-item portion, fixed.
#mdb 2013/12/17 more cleanup and dealing with no objects found.
#mdb 2014/09/08 removing .exe references, and changing folder name, so 2013 works.
#mdb 2018/03/06 the -notmatch doesn't include usp_M* under certain circumstances. Changing to notlike
#param has to be the first line of the script.
param(
[Parameter(Mandatory=$true,Position=1)]
[string]$Server,
[Parameter(Mandatory=$true,Position=2)]
[string]$Database,
[string]$ScriptType ='Object',
[string]$SchemaToScript,
[string]$ObjectToScript,
[string]$Author,
[string]$Comment
)
#make sure ObjectToScript is blank if scripting out the entire DB; makes checkin better
#these are the parameters for testing
#$Server='sql_repository'# the server it is on
#$Database='model' # the name of the database you want to script as objects
#$SchemaToScript = 'dbo'
#$ObjectToScript = 'spy_tempdef'
#$Author = 'michael.bourgon'
#$ScriptType ='Object'
#$Comment = 'bourgon_test'
#setting up an error code for later
$myerror = 0
cd c:\tfs_cli\en_workspace
if ($comment -eq '')
{ $comment = "generic EN checkin"}
if ($author -eq '')
{ $author = "erxnetwork\sqlservice"}
#field is mandatory; if we don't know the author, use a known-valid default.
$ServerNameClean = "$($Server -replace '[\\\/]','__')"
clear
#writing this out for logging and troubleshooting. These are all the parameters except ScriptType
write-host $Server, $Database, $SchemaToScript, $ObjectToScript, $Author, $Comment
#TFS workspace folder - whatever you set it up as on your server
$DirectoryToSaveTo='C:\TFS_CLI\EN_Workspace' # the directory where you want to store them
# Load SMO assembly, and if we are running SQL 2008 DLLs load the SMOExtended and SQLWMIManagement libraries
$v = [System.Reflection.Assembly]::LoadWithPartialName( 'Microsoft.SqlServer.SMO')
if ((($v.FullName.Split(','))[1].Split('='))[1].Split('.')[0] -ne '9') {
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMOExtended') | out-null
}
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SmoEnum') | out-null
set-psdebug -strict # catch a few extra bugs
$ErrorActionPreference = "stop"
$My='Microsoft.SqlServer.Management.Smo'
$srv = new-object ("$My.Server") $Server # attach to the server
if ($srv.ServerType -eq $null) # it could NOT find the server
{
Write-Error "Sorry, but I could not find Server '$Server' "
return
}
$scripter = new-object ("$My.Scripter") $srv # create the scripter
#Add the various options we care about
$scripter.Options.ToFileOnly = $true
$scripter.Options.ExtendedProperties= $true # yes, we want these
$scripter.Options.DRIAll= $true # and all the constraints
$scripter.Options.Indexes= $true # Yup, these would be nice
$scripter.Options.Triggers= $true # This should be included
$scripter.Options.AppendToFile = $False
$scripter.Options.AllowSystemObjects = $False
$scripter.Options.ClusteredIndexes = $True
$scripter.Options.DriAll = $True
$scripter.Options.ScriptDrops = $False
$scripter.Options.IncludeHeaders = $False #so you do not get the one line "scripted at..." which would be NOW.
#$scripter.Options.ToFileOnly = $True
#$scripter.Options.Indexes = $True
$scripter.Options.Permissions = $True
$scripter.Options.WithDependencies = $False
$scripter.Options.Bindings = $true
$scripter.Options.IncludeDatabaseRoleMemberships = $true
#################################################
#First, script out the database "create" itself.#
#################################################
if (($Database) -and $ObjectToScript -eq '') #if database has a value but there is no object, script out the DB.
{
$db_scripter = new-object ("$My.Scripter") $srv # script out the database creation
$db_scripter.options=$scripter.options # with the same options
$db_scripter.options.filename="$($DirectoryToSaveTo)\$($ServerNameClean)\$($Database)\$($Database)_database_create.sql" # with the same options
#explicitly creating the path for the DB script here. We still need to do it for all the sub-types.
# Could probably move the $d declaration up here, but leaving it here for readability.
$SavePath="$($DirectoryToSaveTo)\$($ServerNameClean)\$($Database)"
# create the directory if necessary (SMO does not).
if (!( Test-Path -path $SavePath )) # create it if not existing
{Try { New-Item $SavePath -type directory | out-null }
Catch [system.exception]{
Write-Error "error while creating '$SavePath' $_"
return
}
}
#Use TF to see if the object exists on our TFS server.
# Optimization idea: DIR the entire subfolder on a FULL and compare all at once.
$tf = &cmd /c "c:\TFS_CLI\App_2013\TF dir $/Randolph/$ServerNameClean/$Database/$($Database)_database_create.sql 2>&1"
# Running the TF calls this way as per Simon Ejsing to ignore the error state and capture the actual error message
# http://stackoverflow.com/questions/2095088/error-when-calling-3rd-party-executable-from-powershell-when-using-an-ide
# However, that also means we need to parse the output text ourselves to detect failures.
#Note that if the database create has not changed, it will still attempt to CHECKIN, but TFS will ignore as it is the same.
if ($tf -like "No items match*" -or $tf -like "*is not found or not supported*")
{
"database script does not exist; scripting out and ADDing to TFS"
if(Test-Path -Path $db_scripter.options.filename)
{
#delete the file manually, since we have seen permission issues where the $script cannot overwrite.
$deleteme = "$SavePath\$($Database)_database_create.sql"
$deleteme
try
{
remove-item "$SavePath\$($Database)_database_create.sql" -force
}
catch
{
$error[0].Exception
}
}
#putting in a try/catch so we get error messages if it breaks, and it can continue.
try
{
$db_scripter.Script($srv.Databases[$Database]) # do it
}
Catch
{
"Error Message trying to script out $SavePath\$Filename"
$error[0].Exception
}
"database create script done"
$tf = &cmd /c "c:\TFS_CLI\App_2013\TF add $/Randolph/$ServerNameClean/$Database/$($Database)_database_create.sql 2>&1"
$tf
#use mass checkin at the end
}
else
{
"database script exists; get, check out, script to override, check in"
$tf = &cmd /c "c:\TFS_CLI\App_2013\TF get ""$/Randolph/$ServerNameClean/$Database/$($Database)_database_create.sql"" /noprompt 2>&1"
"database script GET results"
$tf
$tf = &cmd /c "c:\TFS_CLI\App_2013\TF checkout ""$/Randolph/$ServerNameClean/$Database/$($Database)_database_create.sql"" 2>&1"
"database script CHECKOUT results"
$tf
"database checkout done"
#If the file exists, manually delete; we have seen permission issues where $script cannot overwrite.
if(Test-Path -Path $db_scripter.options.filename)
{
$deleteme = "$SavePath\$($Database)_database_create.sql"
$deleteme
try
{
#bug exists with standard remove - if there are read-only items in the same folder, -force is required
remove-item "$SavePath\$($Database)_database_create.sql" -force
}
catch
{
$error[0].Exception
}
}
#putting in a try/catch so we get error messages if it breaks, and it can continue.
try
{
$db_scripter.Script($srv.Databases[$Database]) # do it
}
Catch
{
"Error Message trying to script out $SavePath\$Filename"
$error[0].Exception
}
"database script out done"
#use mass checkin at the end
}
}
###########################
## Scripting out Objects ##
###########################
# we now get all the object types except extended stored procedures
# first we get the bitmap of all the object types we want
$all =[long] [Microsoft.SqlServer.Management.Smo.DatabaseObjectTypes]::all `
-bxor [Microsoft.SqlServer.Management.Smo.DatabaseObjectTypes]::ExtendedStoredProcedure
# and we store them in a datatable
$d = new-object System.Data.Datatable
# get almost everything; skipping most service broker, information_schema, system_views, certificates (cannot be scripted)
# there are other items that may need to be skipped, like SymmetricKeys in SSISDB
# Yes, I realize the irony in skipping SB given that it is powering this.
#putting in a try/catch so we get error messages if it breaks, and it can continue.
try
{
$d=$srv.databases[$Database].EnumObjects([long]0x1FFFFFFF -band $all) | `
Where-Object {$_.Schema -ne 'sys'-and $_.Schema -ne "information_schema" -and $_.DatabaseObjectTypes -ne 'ServiceBroker' `
-and $_.DatabaseObjectTypes -ne 'Certificate' `
-and $_.DatabaseObjectTypes -ne 'MessageType' `
-and $_.DatabaseObjectTypes -ne 'ServiceContract' `
-and $_.DatabaseObjectTypes -ne 'ServiceQueue' `
-and $_.DatabaseObjectTypes -ne 'ServiceRoute' `
-and ($SchemaToScript -eq '' -or $_.Schema -eq $SchemaToScript) `
-and (($ObjectToScript -eq '' -and $ScriptType -eq 'Full') -or $_.Name -eq $ObjectToScript) `
-and ($_.Name -notlike 'sp_MS*') }
# mdb 2013/11/07 previous line skips replication objects. Comment placed below the code because comment lines would break the extended one-liner.
}
Catch
{
"Error Message trying to enumerate the database - may be logshipped or being restored"
$myerror = 1
$error[0].Exception
}
# List every item that we are going to do
$d = $d | sort -Property DatabaseObjectTypes,Schema,Name
$d | select databaseobjecttypes, schema, name
if ($d.Count -gt 10000)
{
"skipping the database objects - more than 10000"
}
# Now write out each scriptable object as a file in the directory you specify
#it appears that an empty array never actually enters the FOREACH, leaving variables unset
# -and -$d.Count -ne 0
if ($myerror -eq 0 -and $d.Count -lt 10001) #20k of objects takes up 5gb of RAM in the PS script and causes problems
{
$d| FOREACH-OBJECT { # for every object we have in the datatable.
"" #blank line so each block of error messages is separated out
$SavePath="$($DirectoryToSaveTo)\$ServerNameClean\$Database\$($_.DatabaseObjectTypes)"
# create the directory if necessary (SMO does not).
if (!( Test-Path -path $SavePath )) # create it if not existing
{Try { New-Item $SavePath -type directory | out-null }
Catch [system.exception]{
Write-Error "error while creating '$SavePath' $_"
return
}
}
# tell the scripter object where to write it, and make sure it is actually writeable
if ($_.schema)
{
$Filename = "$($_.schema -replace '[\\\/\:\.]','-').$($_.name -replace '[\\\/\:\.\ ]','-').sql";
}
else
{
$Filename = "$($_.name -replace '[\\\/\:\.]','-').sql";
}
$scripter.Options.FileName = "$SavePath\$Filename"
$scripter.Options.FileName #print it out so we know what is being done
$UrnCollection = new-object ('Microsoft.SqlServer.Management.Smo.urnCollection')
$URNCollection.add($_.urn)
############################
# TFS code for each object #
############################
#Use TF to see if the object exists on our TFS server
"checking to see if object exists"
$tf = &cmd /c "c:\TFS_CLI\App_2013\TF dir $/Randolph/$ServerNameClean/$Database/$($_.DatabaseObjectTypes)/$Filename 2>&1"
# Running all the TF commands this way as per Simon Ejsing to ignore the error state and capture the actual error message.
# http://stackoverflow.com/questions/2095088/error-when-calling-3rd-party-executable-from-powershell-when-using-an-ide
if ($tf -like "No items match*" -or $tf -like "*is not found or not supported*")
{
"no items match; scripting out and ADDing to TFS"
try
{
                $scripter.script($URNCollection)
            }
            catch
            {
                "Error Message trying to script out $SavePath\$Filename"
                $error[0].Exception
            }
            "script done"
            $tf = &cmd /c "c:\TFS_CLI\App_2013\TF add /noprompt ""$/Randolph/$ServerNameClean/$Database/$($_.DatabaseObjectTypes)/$Filename"" 2>&1"
            $tf
            #mdb 2013/11/07 only do ONE checkin at the end if we are doing an entire database; all will have the same comment
            if ($ObjectToScript -ne '')
            {
                $tf = &cmd /c "c:\TFS_CLI\App_2013\TF checkin /author:$author /comment:""$comment"" /noprompt 2>&1"
                $tf
            }
        }
        else
        {
            "item exists; get, check out, script to override, check in"
            #noprompt causes it to crash, virtually every time
            $tf = &cmd /c "c:\TFS_CLI\App_2013\TF get ""$/Randolph/$ServerNameClean/$Database/$($_.DatabaseObjectTypes)/$Filename"" 2>&1"
            $tf
            $tf = &cmd /c "c:\TFS_CLI\App_2013\TF checkout ""$/Randolph/$ServerNameClean/$Database/$($_.DatabaseObjectTypes)/$Filename"" 2>&1"
            $tf
            #Delete file before scripting; we have seen permission issues.
            if (Test-Path -Path $scripter.options.filename)
            {
                try
                {
                    Remove-Item "$SavePath\$Filename" -Force
                }
                catch
                {
                    $error[0].Exception
                }
            }
            try
            {
                $scripter.script($URNCollection)
            }
            catch
            {
                "Error Message trying to script out $SavePath\$Filename"
                $error[0].Exception
            }
            #mdb 2013/12/03 making this part only run if it is a specific object; that way we can rerun an entire database
            if ($ObjectToScript -ne '')
            {
                $tf = &cmd /c "c:\TFS_CLI\App_2013\TF checkin /author:$author /noprompt /comment:""$comment"" ""$/Randolph/$ServerNameClean/$Database/$($_.DatabaseObjectTypes)/$Filename"" 2>&1"
                $tf
            }
        }
    }
}
#If it is a mass add or a database-specific, CHECKIN now.
if ($ObjectToScript -eq '')
{
    "final mass checkin"
    $tf = &cmd /c "c:\TFS_CLI\App_2013\TF checkin /author:$author /comment:""$comment"" /noprompt 2>&1"
    #$tf we do not need this one here because it will be shown below
    #$tf
    "mass checkin done"
}
"--------------------------------------------------------------------------"
#if the checkin failed, UNDO so the next one does not make it worse.
#The next checkin would probably fix it, but I have had to go back and manually undo. Not fun.
#mdb 20131107 If there were any items to check in, get the results. Throws an error otherwise.
#using Stej code to verify the variable exists; if no objects, $tf is never set, so it bombs here
# http://stackoverflow.com/questions/3159949/in-powershell-how-do-i-test-whether-or-not-a-specific-variable-exists-in-global
if (Test-Path variable:local:tf)
{
    if ($tf -like "Changeset * checked in." -or $tf -like "There are no remaining changes to check in.")
    {
        #mdb 20131107 If there were any items to check in, get the results. Throws an error otherwise.
        if ((Test-Path variable:local:d) -and $ObjectToScript -eq '')
        {
            $tf
        }
    }
    else
    {
        "changes not made - errorlog follows"
        $tf
        "================================UNDO BEGINS=================================="
        $tf = &cmd /c "c:\TFS_CLI\App_2013\TF undo ""$/Randolph/$ServerNameClean/$Database/*.*"" /recursive 2>&1"
        $tf
        "=================================UNDO ENDS==================================="
    }
}
else
{
    "No objects found, nothing done"
}
#TFS rollback code, should look something like this
#c:\TFS_CLI\App_2013\TF undo $/Randolph/$ServerNameClean/VOAgent/DatabaseRole/*.*
#No files checked in due to conflicting changes. These conflicting changes have been automatically resolved. Please try the check-in again.
#rolling back code: c:\TFS_CLI\App_2013\TF undo $/Randolph/cm-01/VOAgent/DatabaseRole/*.*
#for some reason "override:" did not work.
#C:\TFS_CLI\EN_Workspace\server\file.sql
#item exists; get, check out, script to override, check in
#All files are up to date.


Friday, September 5, 2014

[Event Notifications] SQLWatchdog! (Coming to KC in a week)

I'm pleased as punch - presenting at SQL Saturday #300, Kansas City, in a week.

The title is "Master of All I Survey", because I wanted a grandiose title that would get butts in seats.  This topic was presented at 2 different PASSes, and even though I was the target audience, I skipped it both times because the title/abstract didn't sound relevant.  Sad to say, since I was the target audience!  Cue the "better" title. 

Alas, I seem to be the only one in love with it, so I changed it to "SQLWatchdog", based on a coworker's suggestion (HI, JASON).  It gets the point across - a Watchdog for SQL. 

If you look in this blog, you'll see it referenced as a couple of things, but the tag I try to use is "Event Notifications", so you can find them at http://thebakingdba.blogspot.com/search/label/Event%20Notifications


And as the KC SQLSat group appears to know how to have a good time, I'm ALSO going to fill my belly with BBQ, share it with the world after hitting roller coasters, and even sneak in a game.  A wonderful weekend to be sure.

See you there!
http://www.sqlsaturday.com/300/eventhome.aspx

Thursday, August 28, 2014

[Event Notifications] saves the day again...

Got the morning ERRORLOG email courtesy of EN - and with it, several hundred messages from ERRORLOG.  During a patch, the Agent XPs were disabled but not re-enabled.  So the Agent service was up, but didn't do anything.

Ironically enough, it wasn't just the ERRORLOGs that saved me - it was SQL Sentry, which hit (and failed on) the XPs once a minute.  Which, yay, caused an ERRORLOG message, but didn't actually TELL me it wasn't working.  (I've submitted a bug report for that.)
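For future reference, a quick sketch of the check and the fix - this is plain sp_configure, nothing specific to my environment:

```sql
-- See whether the Agent XPs are currently enabled (run_value = 1 means on)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Agent XPs';

-- If a patch left them off, turn them back on
EXEC sp_configure 'Agent XPs', 1;
RECONFIGURE;
```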

Just another of those "my god, how do I live without this" posts.

Reminder:
ALL my EN code is in this share:
https://drive.google.com/folderview?id=0B3a287PS_UJIcnY3Q1pvX3p1eEE&usp=sharing

Additionally, my Gists (source code of the code I use in production) are at:
https://gist.github.com/mbourgon

Wednesday, August 27, 2014

[SQLSaturday] Thanks for coming to see me at Oklahoma City!

Thanks again for all the generous hospitality, and the 28(!) people who came to my session.  It seemed like everybody enjoyed it!  I got a bunch of good reviews, and a couple suggestions I'll implement for SQL Saturday 300 (This. Is. Kansas City!) but feel free to let me know what I could have done to improve things - I know the room was warm and I saw people flagging near the end.

Matt Brimer did an amazing job, and I will definitely return.  Great venue, most of the speakers were good, food was awesome.  All around, a fantastic SQLSat.

Slides and everything are available off the SQL Saturday Web site, at:
http://www.sqlsaturday.com/viewsession.aspx?sat=309&sessionid=20771

Note that that's not only all my slides, but the scripts as well.  I have a newer, complete version of the Repository - I'll have that put up on here in the next day or so (updated: https://drive.google.com/folderview?id=0B3a287PS_UJIcnY3Q1pvX3p1eEE&usp=sharing , see "EN Full Repository Setup").  That should be everything required to run the code on your servers.  Installers, new databases, everything.

Michael

Tuesday, August 19, 2014

[Interesting] Linked Server is cached for a session

Since I spent 20 minutes trying to diagnose it this morning...

Say you have a session open.  That session invokes a query that hits a linked server.

Now change the linked server.  Drop and recreate, point it at a different server.

New sessions get the new definition, but existing sessions use the old definition.
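A minimal sketch of the repro - the linked server name and data source here are made up:

```sql
-- Session A: query through the linked server; the definition is now
-- cached for this session
SELECT TOP 1 name FROM LS_Test.master.sys.databases;

-- Elsewhere: drop and recreate the linked server against a different box
EXEC sp_dropserver @server = N'LS_Test', @droplogins = 'droplogins';
EXEC sp_addlinkedserver @server = N'LS_Test', @srvproduct = N'',
    @provider = N'SQLNCLI', @datasrc = N'SomeOtherServer';

-- Session A still talks to the ORIGINAL server until it disconnects;
-- only new sessions pick up the new definition.
SELECT TOP 1 name FROM LS_Test.master.sys.databases;
```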

Monday, August 18, 2014

[Presenting] Last minute - SQLWatchdog at the Fort Worth SQL Server User Group, Wednesday, August 20th

Just had a last-minute invite, and it seemed an awesome opportunity to present & hone it one more time before my forthcoming SQL Saturday presentations (OKC this weekend, KC next month)

http://www.fwssug.com/ for more details.  It'll be Wednesday, August 20th, 2014, at 7pm.

Yes, this is the presentation I did there almost 2 years ago, expanded, fleshed out, and with Cat Pics. Because if there's one thing a presentation needs, it's Cat Pics.

See you there!



Monday, July 21, 2014

[Presenting] Double your TheBakingDBA exposure! Come to the Kansas City SQL Saturday!

Just got the email - I'm presenting at SQL Saturday 300, Kansas City!

Despite the name, this is the same presentation I've been giving (aka SQL Watchdog) - I submitted this before changing my title.  We'll be adding a few other features as well, based on past suggestions from both audiences as well as my own internal folks.

It sounds like it'll be awesome - Bob Ward is speaking, they're going to have a BBQ Crawl on Friday, sounds like there's an event at the ballpark, there's a group going to a local theme park to ride the coasters...

Man.  Cannot wait.

http://www.sqlsaturday.com/300/eventhome.aspx

Sep 13 2014 
Cerner Corporation's Riverport Campus
6711 NE Birmingham Rd, Kansas City, MO 64117

Tuesday, July 8, 2014

[Presenting] Woot! I'm presenting at SQL Saturday 309, Oklahoma City!

Pleased and proud to have been selected to present at #SQLSatOKC, aka SQL Saturday 309, aka SQL Sat Oklahoma City.  Doing my presentation (continually updated) on Event Notifications, and capturing DDL changes (along with other fun stuff like Deadlocks and Errorlogs) in real time, and what you can do with that.

As always, I'm trying to figure out how to include all the details necessary to do it, not TOO many details, and add some fun stories on "heh, look how I saved the day". 

While actually remembering to breathe - a lot of details, and honestly, I get excited about presenting it.  The goal is to make the passion shine through, without seeming like a Spaz or newbie speaker. 

SQL Saturday Oklahoma City:
Aug 23 2014 at 13301 South Pennsylvania Avenue, OKC, OK 73170. 

http://www.sqlsaturday.com/309/eventhome.aspx

Monday, June 30, 2014

[Presenting] PASS Summit 2014 - Bummed and Congrats!

Got back from vacation on Tuesday.  Check twitter sometime that morning - and lo and behold, the PASS Summit 2014 sessions are picked!  Check email - nothing.  Check the schedule - well, I guess "not picked"  doesn't deserve an email.  Thanks.  Check on twitter (because honestly, a "why" would be nice) - no, I should have gotten something.  Long story short - 4 hours later, I get the email.  I already knew from looking at the schedule that I wasn't picked - which was a kick in the pants, having presented it 5 times at this point.

However, a silver lining - it appears I was rejected because someone else was presenting on something similar.  Indeed!  So a hearty congrats to Colleen Morrow!  While I personally think EN curb-stomps SQL Audit, I'm glad to see the subject of checking for code changes in the environment will be presented at PASS.  Colleen's already shown (via her chapter in Tribal SQL) that she knows quite a bit about all the options, and writes passionately on the subject, so I have no doubt it'll be a great session.  Wish I could see it.  : )


Friday, May 30, 2014

[WAT] more screwy date conversion

So, let's convert some dates

DECLARE @my_month DATETIME = '20140530'
SELECT SUBSTRING(CAST(@my_month AS VARCHAR(8)), 5, 2) + '/01/' + SUBSTRING(CAST(@my_month AS VARCHAR(8)), 1, 4)

Returns:
30/01/May

But what if we make that variable an int?
DECLARE @my_month int = '20140530'
SELECT SUBSTRING(CAST(@my_month AS VARCHAR(8)), 5, 2) + '/01/' + SUBSTRING(CAST(@my_month AS VARCHAR(8)), 1, 4)

Returns:
05/01/2014
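The why, for the record: CAST from DATETIME to character data uses the default style ('mon dd yyyy hh:miAM'), so VARCHAR(8) leaves you with 'May 30 2', and the SUBSTRINGs slice that up.  The int, on the other hand, converts to the digit string '20140530'.  If you want the digits from an actual DATETIME, convert explicitly:

```sql
DECLARE @my_month DATETIME = '20140530';

-- Default DATETIME-to-character conversion, truncated to 8 chars: 'May 30 2'
SELECT CAST(@my_month AS VARCHAR(8));

-- Style 112 (ISO yyyymmdd) gives the string the SUBSTRINGs were expecting
SELECT CONVERT(VARCHAR(8), @my_month, 112); -- '20140530'
```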


Monday, May 12, 2014

SQL Saturday 308 - SQL Watchdog - scripts and slide deck

Someone mentioned a problem with the scripts being corrupt on the SQL Saturday website.  Here's a copy of my most recent presentation.  All the scripts, including the repository creation scripts (zip-within-the-zip). 

Holler if you have any questions, thanks again to Nancy and Allen for allowing me to present and treating us like rockstars, and thanks to everyone who attended SQL Saturday Houston!

TBD

https://drive.google.com/file/d/0B3a287PS_UJIRDJBNWEtNVBJY0E/edit?usp=sharing

and the whole folder of goodies (old versions, warts and all) is available at:
https://drive.google.com/folderview?id=0B3a287PS_UJIcnY3Q1pvX3p1eEE&usp=sharing

Saturday, May 10, 2014

[Event Notifications] Hekaton prevents EN?!

Nooooooooooooo...

http://msdn.microsoft.com/en-us/library/dn246937(v=sql.120).aspx

Memory-optimized tables and natively compiled stored procedures cannot be created or dropped if there is a server or database event notification for that DDL operation. Remove the server and database event notifications on CREATE TABLE or DROP TABLE and CREATE PROCEDURE or DROP PROCEDURE.

What happens if you attempt to do it?

 Msg 10794, Level 16, State 124, Line 1
The feature 'EVENT NOTIFICATION' is not supported with memory optimized tables.
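For the curious, a sketch of the repro on SQL Server 2014 - this assumes a database with a MEMORY_OPTIMIZED_DATA filegroup and an event notification covering CREATE TABLE already in place (the table name is just a throwaway):

```sql
-- Blows up with Msg 10794 if a server or database event notification
-- covers CREATE TABLE
CREATE TABLE dbo.z_delete_HekatonTest
(
    id INT NOT NULL PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1024)
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_ONLY);
```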



Why is this a big deal?  Because, honestly, I'm scared that if they're not supporting things like this, EN is on its way out.  I use it extensively (see my presentations on it, coming soon to a SQL Saturday near you), and it's IMHO a must-have feature.  It also means that whenever we want to do something with Hekaton, we have to drop the EN, make the Hekaton-based changes, then re-create the EN - which means we're less likely to use either Hekaton or EN.  

Please vote on my Connect item:

https://connect.microsoft.com/SQLServer/feedback/details/874185/event-notification-please-make-in-memory-oltp-tables-natively-compiled-sps-work-with-en

Wednesday, April 23, 2014

[Event Notifications] Using Certificates for Authentication instead of AD/Kerberos

My normal environment that I run is pretty simple.  One trusted domain.  So security for Event Notifications is pretty simple too - I just use Kerberos.  It's simple, easy to read, nothing fancy involved, and I wanted to make sure that people weren't going to look at the prerequisites and immediately say "I don't want to deal with keys and certificates and all that - so I'm not going to do it". 

A former coworker isn't as lucky - he has multiple, untrusted domains.  So he had to get it working.  And much to my surprise, it was considerably easier than I'd thought, from reading about Service Broker certificates (literally, one of the demos in the book I read used _7_ certificates).  Yet again, I dig what MS has done with EN, since it's a simple use-case of Service Broker.


Note that we're creating master keys and certificates here.  Over the years I've heard that you may need these keys in order to restore the database - though from the sound of it, you technically don't, since nothing in the database itself is encrypted.  Still, I'd be really paranoid and test this out on a test server first, specifically the whole "restore the databases without the keys" part. 

TL;DR - Script is below; test restores before putting it in production.  You'll create 1 key & 1 cert on your Repository, and then on each monitored server you'll create 1 key & 1 cert, swapping certificate backups with the Repository. 

Many thanks to James for doing all the hard work on this!

--For the repository server I only have one cert and one key.
--For the new servers I have to create a new certificate and key for each one.
-------------------------------------------------------------
--Part 1, Run on new server adding to Event Notifications--
-------------------------------------------------------------
USE master
GO
--Creating master key and a user which will be used by the certificate
CREATE MASTER KEY ENCRYPTION BY PASSWORD = N'';
--Create certificate that will be used for encrypting messages.
CREATE CERTIFICATE EN_Cert_ServerName
WITH SUBJECT = N''
,EXPIRY_DATE = '';
--shouldn't be needed, according to James...
-- "This is not needed, since the Master Key is doing the encryption and not the Symmetric Key."
--CREATE SYMMETRIC KEY EN_SymmetricKey_ServerName
--WITH ALGORITHM = AES_256
--ENCRYPTION BY CERTIFICATE EN_Cert_ServerName
--SELECT * FROM sys.symmetric_keys
--SELECT * FROM sys.certificates
--Backup certificate
BACKUP CERTIFICATE EN_Cert_ServerName --add server name
TO FILE = ''
WITH PRIVATE KEY (FILE = '',
ENCRYPTION BY PASSWORD = '');
--Create end point
CREATE ENDPOINT EN_Endpoint STATE = STARTED AS TCP ( LISTENER_PORT = 4022 )
FOR SERVICE_BROKER ( AUTHENTICATION = CERTIFICATE EN_Cert_ServerName );
--check endpoints
SELECT *
FROM sys.endpoints
--create login to be used by Service Broker
CREATE LOGIN EN_User WITH PASSWORD = N'';
CREATE USER EN_User FOR LOGIN EN_User;
--Grant permissions on endpoint to login
GRANT CONNECT ON ENDPOINT::EN_Endpoint TO EN_User;
--Get SB GUID for msdb for 2nd script
SELECT service_broker_guid
FROM sys.databases
WHERE name = 'msdb'
--get IP address if needed
EXEC xp_cmdshell 'ipconfig'
-------------------------------------------
--Part 2, run on EN repository server--
-------------------------------------------
USE master
GO
--Creating master key and a user which will be used by the certificate
CREATE MASTER KEY ENCRYPTION BY PASSWORD = N'';
--Create certificate that will be used for encrypting messages.
CREATE CERTIFICATE EN_Rep_Cert_ServerName
WITH SUBJECT = N'Certificate For EN Repository server'
,EXPIRY_DATE = '1/1/2199';
CREATE SYMMETRIC KEY EN_Rep_SymmetricKey_ServerName
WITH ALGORITHM = AES_256
ENCRYPTION BY CERTIFICATE EN_Rep_Cert_ServerName
BACKUP CERTIFICATE EN_Rep_Cert_ServerName
TO FILE = N''
WITH PRIVATE KEY (FILE = '',
ENCRYPTION BY PASSWORD = '');
ALTER ENDPOINT EN_Endpoint STATE = STARTED AS TCP ( LISTENER_PORT = 4022 )
FOR SERVICE_BROKER ( AUTHENTICATION = WINDOWS CERTIFICATE EN_Rep_Cert_ServerName, ENCRYPTION = SUPPORTED )
CREATE LOGIN EN_User WITH PASSWORD = N'';
CREATE USER EN_User FOR LOGIN EN_User;
GRANT CONNECT ON ENDPOINT::EN_Endpoint TO EN_User;
--Get SB GUID for 1st script
SELECT service_broker_guid
FROM sys.databases
WHERE name = 'EventNotificationRec'
--get IP address if needed
EXEC xp_cmdshell 'ipconfig'
---------------------------------------------------------------------------------
--Copy over backed up certificates between EN repository server and new server.--
---------------------------------------------------------------------------------
--------------------------------------------------------------------------------------------------------------
--Part 3 - run on new server.
--------------------------------------------------------------------------------------------------------------
--Restore target certificate copied from the EN repository server, add EN rep servername
CREATE CERTIFICATE EN_Cert_Repository_ServerName --add server name
AUTHORIZATION EN_User
FROM FILE = N''
WITH PRIVATE KEY (FILE = '',
DECRYPTION BY PASSWORD = '')
USE msdb
go
CREATE ROUTE [ENAudit_DestinationRoute]
WITH SERVICE_NAME = 'ENAudit_Service',
BROKER_INSTANCE = '', --Service Broker GUID
ADDRESS = 'TCP://IP or ServerName:4022' --add IP address or server name
CREATE EVENT NOTIFICATION [ENAudit_ServerDDLEvents] ON SERVER
FOR DDL_SERVER_LEVEL_EVENTS, DDL_TABLE_EVENTS, DDL_VIEW_EVENTS, DDL_INDEX_EVENTS,
    DDL_SYNONYM_EVENTS, DDL_FUNCTION_EVENTS, DDL_PROCEDURE_EVENTS, DDL_TRIGGER_EVENTS,
    DDL_EVENT_NOTIFICATION_EVENTS, DDL_ASSEMBLY_EVENTS, DDL_TYPE_EVENTS,
    DDL_DATABASE_SECURITY_EVENTS, DDL_SSB_EVENTS, DDL_XML_SCHEMA_COLLECTION_EVENTS,
    DDL_PARTITION_EVENTS, AUDIT_LOGIN_FAILED, DEADLOCK_GRAPH
--and any other events you want to watch.
TO SERVICE 'ENAudit_Service', ''; --Service Broker GUID
SELECT * FROM sys.server_event_notifications
--test by creating and dropping table
IF OBJECT_ID('master.dbo.z_delete_ENtest') IS NOT NULL
DROP TABLE master.dbo.z_delete_ENtest
CREATE TABLE master.dbo.z_delete_ENtest ( id INT IDENTITY )
DROP TABLE master.dbo.z_delete_ENtest
--verify transmission queue is empty
SELECT *
FROM msdb.sys.transmission_queue
SELECT * FROM sys.certificates
------------------------------------------
--Part 4 - run on EN Repository Server--
------------------------------------------
--Restore certificate from new server added to EN
CREATE CERTIFICATE EN_Cert_ServerName
AUTHORIZATION EN_User
FROM FILE = N''
WITH PRIVATE KEY (FILE = '',
DECRYPTION BY PASSWORD = '')
USE EventNotificationRec
go
CREATE ROUTE ENAudit_ReturnRoute_ServerName
WITH SERVICE_NAME = 'http://schemas.microsoft.com/SQL/Notifications/EventNotificationService',
BROKER_INSTANCE = '', ADDRESS = 'TCP://IP or ServerName:4022';
SELECT * FROM sys.routes
SELECT TOP 10 * FROM EventNotificationRec..ENAudit_Events ORDER BY id DESC
SELECT * FROM master.sys.certificates WHERE name LIKE 'EN_%'


Tuesday, April 15, 2014

#SqlSat308 I'm pleased to present in Houston on May 10th, 2014. "SQL Watchdog"


Woohoo!

I'm honored to have been chosen to present at SQL Saturday #308, Houston, on May 10th 2014.

This will be my Event Notifications presentation, more refined, running on SQL Server 2005 & SQL Server 2014.  We'll cover concepts all the way through a multi-server environment and more.  You'll see the advantages of running Event Notifications - not only can you know within a second that code was run that might compromise or break your environment, but you can do other things with it, from TFS checkin of production changes (for later issues), ERRORLOG monitoring, killing unauthorized backups and more.

Naturally, in addition to me, there's a ton of big SQL people that'll be there.  Aaron Bertrand, Tim Costello, Tim Mitchell, John Sterrett, Denny Cherry, Lori Edwards, Chris Bell and more!  Sixty sessions in all!


http://www.sqlsaturday.com/308/eventhome.aspx

Wednesday, March 19, 2014

DB_Ops Evolved - Be The Proactive Hero

Since I love to share...

https://docs.google.com/presentation/d/1EtLdhoewsmCsZ28PbqR0ELzVzlHlTAt-CCenRspUvtI/edit?usp=sharing

Feel free to ask any questions.

Michael

Friday, February 14, 2014

[Wat] Leading Dots work in SQL queries? Well, that's some interesting syntax there, Lou.

Wound up looking at a failing piece of code.  The first thing was how it was called. 

EXEC .dbo.the_sp_name

...but it turns out, that part works.  As does this query: SELECT * FROM .sys.tables.  Had never seen that before.
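Best I can tell, it works because an empty part in a multi-part name just falls back to the default - here, the current database:

```sql
-- Equivalent: the empty database part before the first dot defaults
-- to the current database
SELECT COUNT(*) FROM sys.tables;
SELECT COUNT(*) FROM .sys.tables;

-- The same resolution applies to EXEC .dbo.the_sp_name
```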

Thursday, January 30, 2014

[Presenting] Changing session title - what should I call my EN presentation?

So, SQL Saturday #271 (Albuquerque 2014) is over.  It was a blast.  We got to drive through Amarillo, see the Cadillac Ranch, get our Kicks On Route 66, etc, etc.  The venue was awesome, they treated us like kings, and it looked like a good turnout. 

I felt I did pretty good on my presentation, though there are a few things I need to tweak.

Interestingly, though, while everybody who went loved it, it looks like a decent number of people didn't come because of the title.  My current title is (take a deep breath):
"Master of All I Survey - track code changes and more across your environment".


Part of the reason for that overly-grandiose title was because, while I am definitely the target audience for the presentation, I didn't see either of the 2 times it was presented at PASS - and I was at both!  But since my title isn't working either, what should I change it to?

Here are the titles from those two PASS presentations:
  • "Auditing Events in SQL Server 2005" (PASS 2005)
  • "Using Event Notifications in SQL Server 2005/2008"  (PASS 2011)

The abstracts (and presentations) are both great - but IMHO the titles could be stronger.  I saw "Auditing Events" and said "I don't do auditing"; for "Using Event Notifications" I thought "not familiar with those, not sure what it is, read abstract later".


I found some others online:
Monitoring and Recording DDL changes on SQL 2005 (NARC)
Auditing DDL Changes in SQL Server databases


Current ideas.
Evil Overlord - know when your minions are changing SQL code/tables/etc in production
Who changed that code? Finding production changes in real-time using Event Notifications.
Real Time Spy - being proactive by knowing when code changes in production
DB_Ops Evolved - Be the Proactive Hero by knowing when things change!
DB_Ops Evolved - be proactive by knowing when DDL changes occur

The question becomes - what do I focus on?  I thought the current title did pretty well - catchy bit, then it tells you what it does (track code changes) and that it focuses on multiple machines. 

What else could I focus on? 
  • Real-time - I really think this is a big benefit over the other trace methods.
  • Trace Events like Failed Logins, ERRORLOG, Deadlocks, etc
  • Being Proactive  
  • SOX-type stuff - failed logins, code changes, etc.
I'm working on a couple of presentations under the aegis of "DB_Ops Evolved" (to introduce advanced concepts and a healthy respect for the engine), hence that last title.

I'm wide fricking open, here.  And at a loss.