Thursday, December 10, 2015

[T-SQL] Basic Pivot automation script

There's a guy out there who has a great dynamic pivot CLR.  This is a much simpler version.  I _think_ this is original to me.  I've been using it in various projects on this blog, but I don't see where I posted just this code.

Say you have records from your monitor that happens to track event logs.
We'll call it EventLog_Tracking for argument's sake (hint hint: http://thebakingdba.blogspot.com/2015/05/powershell-eventlogtracking-capturing.html) and you want to look at trends over time.

First, you'd need a grouped set.

SELECT sourcename, CONVERT(DATE, TimeGenerated) AS the_date, COUNT(*) AS the_count
INTO #temp
FROM EventLog_201512
WHERE sourcename LIKE 'microsoft-windows%'
GROUP BY SourceName, CONVERT(DATE, TimeGenerated)
ORDER BY CONVERT(DATE, TimeGenerated), SourceName

And you want it to look like a crosstab instead: dates down the side, source names across the top, and the counts in the cells.

Here's the base pivot code.  For this example, replace "down" with "the_date" and "across" with "sourcename".

If you want a name or something instead of the number, then do something like "max(yourname)" instead of "sum(the_count)".
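
A minimal sketch of that kind of dynamic pivot, assuming the #temp table above ("down" = the_date, "across" = sourcename, sum(the_count) as the value) - this is just the shape of it, not the full script from the gist:

-- Build the column list from the distinct "across" values, then pivot.
DECLARE @columns NVARCHAR(MAX), @sql NVARCHAR(MAX);

SELECT @columns = STUFF((
        SELECT ', ' + QUOTENAME(sourcename)
        FROM #temp
        GROUP BY sourcename
        ORDER BY sourcename
        FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)'),
    1, 2, '');

-- One row per "down" value, one column per "across" value.
SET @sql = N'
SELECT the_date, ' + @columns + N'
FROM (SELECT the_date, sourcename, the_count FROM #temp) AS src
PIVOT (SUM(the_count) FOR sourcename IN (' + @columns + N')) AS pvt
ORDER BY the_date;';

EXEC sp_executesql @sql;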


[Hey Kids] @@servername and BCP and ultra-long statements.

Hear me now and listen to me later.

When you take that ultra-complex 6000-character string for BCP and wonder why it's getting truncated to 4000 (which SSMS has problems showing anyhow), remember that @@servername is sysname, which is an alias for NVARCHAR(128)... and concatenating it onto a varchar converts the whole expression to nvarchar, which silently caps at 4000 characters unless you're using MAX.

I'll never get those hours of my life back.  Be smarter than me.
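
A quick repro sketch, if you want to watch it happen (the x's stand in for your giant BCP command string; nothing here beyond built-ins):

DECLARE @cmd VARCHAR(8000) = REPLICATE('x', 6000);
SELECT LEN(@cmd) AS before_concat;                        -- 6000

-- @@SERVERNAME is sysname (NVARCHAR(128)); concatenating it promotes the whole
-- expression to NVARCHAR, and non-MAX NVARCHAR expressions cap at 4000 characters.
SELECT LEN(@cmd + ' -S' + @@SERVERNAME) AS after_concat;  -- 4000

-- The fix: go to (N)VARCHAR(MAX) before you start concatenating.
DECLARE @cmd_max NVARCHAR(MAX) = CONVERT(NVARCHAR(MAX), @cmd) + ' -S' + @@SERVERNAME;
SELECT LEN(@cmd_max) AS with_max;                         -- the full length again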

Wednesday, December 9, 2015

[System_Health] pulling all the data from the saved-to-disk files

TL;DR - imports the 3-4 system_health files that are automatically saved to disk.

I had a problem this week with a deadlock.  I went looking in the ring buffer for the system_health session, but it had already aged out. 

So, remembering that the system_health session has a file target, I went looking for it (Jason Brimhall has code for it - thanks!).  The file went back further than the ring buffer, but not all the way.  Since the event definition says to keep a couple of older files, I use the current file to find the path, list all the files in that path (xp_dirtree; doesn't require xp_cmdshell), then walk through and import all of them. 

You could simplify it a lot and use fn_xe_file_target_read_file and tell it to read all files like system_health*0.xel in the path, but with my luck those ending numbers will sometimes end in a non-0 value.  I use the 0 in the mask because I make local copies of the files occasionally, so that weeds out " - Copy".
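
For reference, a rough sketch of that simpler approach (2012+ syntax; treat it as a starting point rather than the full import script):

-- Grab the full path of the current system_health file from the running session...
DECLARE @path NVARCHAR(260);

SELECT @path = CAST(t.target_data AS XML).value('(EventFileTarget/File/@name)[1]', 'NVARCHAR(260)')
FROM sys.dm_xe_sessions s
JOIN sys.dm_xe_session_targets t
    ON s.address = t.event_session_address
WHERE s.name = 'system_health'
  AND t.target_name = 'event_file';

-- ...swap the specific file name for the mask described above...
SET @path = LEFT(@path, LEN(@path) - CHARINDEX('\', REVERSE(@path)) + 1) + 'system_health*0.xel';

-- ...and read every matching file in one shot.
SELECT object_name, CAST(event_data AS XML) AS event_xml, file_name
FROM sys.fn_xe_file_target_read_file(@path, NULL, NULL, NULL);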

When I get some free time, I'll get my current sp_server_diagnostics/system_health parsers to work with it (see http://thebakingdba.blogspot.com/2015/04/systemhealth-2-dynamic-and-full-parsers.html and http://thebakingdba.blogspot.com/2012/12/spserverdiageventparser-make.html).  I started on that, but it's cartesianing on me and I don't have time to fix it right now. 

Enjoy!








   

Wednesday, October 21, 2015

[Presenting] WMI and your sanity

Here's my current slide deck; the link below is for the SQL Saturday Kansas City page, which has the code.

http://www.sqlpass.org/EventDownload.aspx?suid=10852

Thursday, October 8, 2015

[Tools] The various flavors of SSMS I have on my box right now

First, yes, there are reasons to have the various versions. 2005 can open DTS, for instance.

But this is what I get when I use "choose default program" with .SQL files. Big hugs to Microsoft for never including the version number of SSMS in there.  It makes it lots of fun when I wind up accidentally resetting it.

Putting it in here so hopefully I won't have to do this 10-minute exercise next time.







Top left: 2005
Middle left:  2014
Bottom left (highlighted): 2016 CTP2
Middle right: 2012
Bottom right: 2008 R2

Monday, October 5, 2015

[Presenting] So far, so good...

Wow! I just realized that this past weekend makes two years that I've been presenting at SQL Saturdays!

First, this past weekend.  It was a blast, and reminds me why I love doing this.  The drive to Kansas City was long (longer on the way back, of course), but the BBQ crawl was stellar, the location for everything was, in all seriousness, crazy perfect (the hotel, BBQ crawl meetup, speaker dinner, and after-parties were all within a 5 minute walk of each other, and it would've been trivial to catch a ride to the venue), the venue was spectacular (that's a cool college with some really modern presentation facilities), the speaker selection was great, and the locals were amazing (special shout out to Dave for taking us around on Friday and showing us all SORTS of cool stuff we wouldn't have known about otherwise - I'm bringing my family back to do some of that).

Bill, Megan, & company were fantastic hosts, and since so many speakers came to the BBQ crawl, I was able to talk at length with many of them, which actually meant that I got to talk in depth about a ton of topics, learned some other cool things, and even changed which sessions I was going to.  What's the saying?  "Don't be the smartest guy in the room".  I most assuredly wasn't, and so I hope/think that some of that rubbed off on me.  Plus, not to be a putz or schmuck, but it's nice when you can talk to new people who live & breathe this stuff - it's refreshing and it reinvigorates me.  Finding out that you are not the only person doing X, and now you have cards from others who also do it and can help you.  It's amazing.  (That Service Broker class, too - I'd never seen so many people using it)

After-party - now that I've been to #SQLKaraoke, I see why people like it.  When the person can sing it's really good, and when they can't... it's still interesting.  Mostly, it worked. I'd swear there were a couple of ringers in there, except I saw them all at SQLSat.



Lessons learned: if I could recommend two things to any of the other SQL Saturday groups:
  1. Put everything nearby.  I don't know how many cities can do it, but being able to walk out of the hotel and do everything except be at the venue was freaking great.  Yes, ideally you want the hotel next to the venue, but I'd pass on that in order to have all the other associated things (dinner, day-before stuff, afterparty) all near each other.  The Westport neighborhood in Kansas City fit that to a T.  I could easily & safely walk to several really good restaurants, & there was plenty of nightlife nearby.  Runner-up goes to Denver for having the after-party at the venue itself - not something you can do most places, but it worked really well. (Though I missed the after-after party... and given what happened on Saturday night, I'm _really_ interested in seeing next year's.)
  2. Extended pre-time/cool events.  I'll be honest, the BBQ crawl was a good reason to come.  And, maybe surprisingly, not just because of the food.  It was because I got to sit and talk to Russ at length about his ETL-with-Hekaton.  I got to chat with Brandon about the sucky parts of Table Partitioning.  I got to shoot the bull with Bill about Cleartrace and Event Notifications.  Bugging people about how to get a MVP (hey, if you know someone...).  Hanging out with Andre and talking some of the things he was doing in his Powershell session - and then finding out he'd downloaded my deck for the same reason.  Zipping around town with Dave learning about great coffee places (PTs!), Always Encrypted, the plan cache, and the architecture and history of KC.  You can't get that out of a normal 2-3 hour dinner, much though I wish you could.  In this case, popping between multiple restaurants worked REALLY well, as it meant you weren't sitting next to the same people all afternoon, and so you got exposed to a ton of different conversations and topics.   Just wish I knew how you could do that in more cities.  Plus, as an out-of-towner, it was cool actually seeing more than just the venue itself.  It's easy to go around and find cool stuff during the 3-5 days of PASS.  It's much harder when you get in at 9pm on Thursday after a 9-hour drive, and are leaving early Sunday.

Now on to the presentation (I mean, that's why I went, right?): I think it went really well.  Thanks to Jason & Zane who were watching, I learned some changes I need to make, places to cover in more depth (or add demos), and parts to cover less.  I still massively dig the topic, and want people to do more with it, which means I need to work on adding some more PRACTICAL use cases.  But I find it funny/awesome that there are a couple of "throw-away" bits (powershell unrestricted, power plans, etc) that were probably more useful to some than the main topic.  Which reminds me that I need to read through the reviews.


So, two years in.
Where have I been so far?
  • Dallas 2013
  • Albuquerque 2014
  • Houston 2014
  • OKC 2014
  • KC 2014
  • DC 2014
  • Austin 2015
  • KC 2015
  • DBA Virtual Chapter
  • PASS 24 HOP 2015
  • FWSSUG (3 times thus far, I think)  
  • Denver 2015 (attendee only, but 12 hour drive means I'm at least mentioning it)
What's left/next?
I want to present at PASS on EN, so badly I can taste it.  Doing 24 HOP was one of those "holy crap, I'm there" moments.  (So was going and having dinner with a bunch of the other speakers on Saturday - walking up to a table full of people wearing the speaker shirts, and realizing I both Fit In and was One Of Them, was amazing).  I need to come up with more presentations.  I know a decent number of presenters who write an abstract, and then come up with a topic.  My presentations, for better or worse, are because I have to solve a particular problem - and then I write a post-mortem that becomes my presentation.  Next couple topics I have planned: "SAVE ALL THE THINGS" which will probably encompass my EPR, WMI, EN, and probably CDC or CT.  The one after that is probably going to be cloud-based.  But I think an easy one would just be the simple things that most people don't have set right (and now that I think about it, I saw this type of thing in DC in 2014, by Jason Brimhall).  That one, though, I might convince one of my coworkers to do.

What else?
I'd like to thank all the respective teams for having me, my awesome coworkers for coming with me on these (truly, this would've been impossible without them), my other coworkers for covering while I'm out, my boss for letting me attend, and my wife for not just letting, but actually encouraging me to go.

Thursday, September 24, 2015

[Presenting] 9 days until WMI at #SQLSatKC !







Reminder: I will be presenting on SQL Saturday Kansas City on October 3rd, at Rockhurst University. 

I'm doing a new presentation!  (Which means my coworkers get to sleep through me doing a Lunch & Learn).  Eventlogs, WMI, WQL, and why you as a SQL Server DBA should care.  We'll start with Event Logs and go down the rabbit hole of WMI.  We'll cover Eventing, WQL alerts inside SQL Server, and more.  You'll find new abilities inside Windows & SQL Server you didn't know existed.


It looks like it ought to be another stellar event - the reason I'm driving the 12 hours is because of how well last year's went (plus, hey, BBQ crawl). 

And there's a ton of other awesome speakers, even if you don't see mine.  Kathi Kellenberger talking about window function performance, Bill Graziano talking about modern high-speed OLTP, Jason Horner on Dimensional Design, Bill Fellows on BIML, Wendy Pastrick on indexes... and another 29 sessions!


Monday, September 7, 2015

[T-SQL Tuesday] It's time to discuss "Strategies for Managing An Enterprise"

T-SQL Tuesday



Jen McCown is hosting this month's T-SQL Tuesday.  The topic for T-SQL Tuesday #70 is "Strategies for Managing An Enterprise".

That's easy: relentlessly use automation to know & improve your environment.  
Thanks, I'll be here all week.  Try the veal.  

In all seriousness, though, that's been my career - being a Lazy DBA (spending far too many hours to accomplish that) via automation.  I actually got fired from a job because they didn't know what I did - it all Just Worked, and I wasn't running around like things were on fire, so I must not be doing anything important.  (Yes, seriously, and the skills I gained meant that the next gig was a LOT more fun and a lot better overall).  It honestly doesn't matter what you use.  Over the years I've used batch files, Perl, shell scripts, SSIS, PowerShell, T-SQL, other people's tools... anything and everything.  The tools don't matter - it's the mindset.  Why are you spending 10 minutes a day doing X?  Spend an hour automating it, and then take those 10 free minutes a day and find more things to automate.  Seriously, it's a win-win.  You learn new skills, you gain experience with different topics, you get to know your environment better (why won't THIS server return the data the same way?), it makes your job less repetitive and more interesting, and therefore life is more fun.  

There are even a couple of different XKCDs that cover different aspects of this (though this one can also hold true: https://xkcd.com/1319/ ).

In fact, it's the question I try to ask when interviewing: what have you automated lately?  Why?  How?  Heck, even if it's just automating server installs, it's a game-changer.  Not just for speed, but for consistency.  No more wondering if that setting was changed on ALL your servers, for instance. 


Now, fortunately for you, there are plenty of people on the Internet who relentlessly automate things, and have the graciousness/conviction to share.  So what things could/should you already be automating for free?
  • SP_Blitz - written by Brent Ozar and associates, it's a one-stop shop for "you're doing it wrong".  Runs quickly and easily, and tells you what it thinks you're doing wrong.  
  • Cleartrace - written by Bill Graziano, this tool has a command-line option, so you can import traces into a database automatically.  Why do that?  Well, say you have a customer-facing site - now you can look back historically and see what queries were running against it and when.
  • SQL Server Maintenance Solution - written by Ola Hallengren; throw away your crappy MS-built "maintenance plans" and get something with some OOMPH.  Back up all your databases simply, automatically reindex/defragment, and do integrity checks.  Just customize the jobs (and, to be honest, most people could just set up schedules and be done - that's how good the code is).
How can you go wrong?  


Finally, I'm not above self-promotion.  And hey, more free tools.  What automation do I have on this blog? 
1) Event Notifications: http://thebakingdba.blogspot.com/search?q=event+notifications
What's this do?  Tells you what code has changed in your environment, how, by who, and when.  I love this ability - we've used it for everything from tracking down failed logins to checking code into TFS so that devs can see what production _actually_ looks like.  (Oh, and it's a native SQL Server feature, works on anything from SQL Server 2005 Standard Edition on up... and it's free!).  I use this almost every single freaking week.  Which is either awesome or sad - but I'm going with AWESOME.  It improved my skills by at least 1 point on a 10-point scale, both by getting it set up, and because it means you're not playing detective trying to figure out what has changed.  Cannot say enough good things about this.  If you've never set one up, there's a bare-bones sketch just after this list.

2) The EPR (Extensible Powershell Repository).  http://thebakingdba.blogspot.com/2013/04/servers-extensible-powershell.html
How's this?  Create a T-SQL/Powershell script, drop it in a folder, and within a few minutes you have the results from all your servers, saved to a table.  Bonus: options to keep history.  We've been using one of the reports lately to decommission processes.  (though I have a couple dozen in production, all of which help with specific cases).  The best part of it, honestly, is that you wonder "huh, I wonder how bad a problem this is", write the script in 10 minutes, drop it in the folder, and come back in a week/month and see how things look.  And then realize that knowing more helps you have a smoother environment.

3) Eventlog_Tracking: http://thebakingdba.blogspot.com/2015/05/powershell-eventlogtracking-capturing.html
The latest, and I'm presenting on it in Kansas City on October 3rd!  (Now I just have to finish writing it...)  Using PowerShell and WMI, it captures the latest entries from the Windows Event Logs.  In our environment it winds up helping with both the SQL and non-SQL sides of things.  (Yes, there are apps.  No, I didn't want to muck with them.)
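
And here's that bare-bones Event Notification sketch promised back in #1.  The queue/service names are just placeholders, it assumes Service Broker is enabled in the database you run it in, and the real setup (including the activation proc that parses and saves the XML) is in the linked posts:

CREATE QUEUE dbo.DDLAuditQueue;

CREATE SERVICE DDLAuditService
    ON QUEUE dbo.DDLAuditQueue
    ([http://schemas.microsoft.com/SQL/Notifications/PostEventNotification]);

-- Server-scoped: every DDL statement on the instance lands in the queue as XML,
-- including the login, the time, and the T-SQL that was run.
CREATE EVENT NOTIFICATION DDLAudit_AllDDL
    ON SERVER
    FOR DDL_EVENTS
    TO SERVICE 'DDLAuditService', 'current database';

-- Peek at what's been captured (a production setup uses an activation procedure instead).
SELECT CAST(message_body AS XML) AS ddl_event
FROM dbo.DDLAuditQueue;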

Anyhow, thanks again for the idea, Jen!  

--tbd

Thursday, August 27, 2015

[Presenting] WMI in KC in Oct at SQLSat!

Got the nod.  So, I'm happy to announce I'll be presenting at SQL Saturday 444, Aka SQL Sat KC 2015 (#SQLSatKC, @KCSQLSaturday), on October 3rd, 2015, at Rockhurst University. 

My presentation is a new one, and so I'm still finding neat new things to show everyone.  The subject is WMI, which most SQL folk don't know.  Which makes sense, as it's mostly a Windows technology, meant to give sysadmins info about their boxen.  HOWEVER!  We'll be tackling why you care about it, as well as cool things you can do with it.

More details forthcoming.

But overall, if you're anywhere near Kansas City, I heartily encourage you to attend: flights there are cheap, the day-before BBQ is EPIC, and they have a really good selection of speakers.  In part it's because the KC folk are awesome, partly because their SQLSat is well organized, and finally because they love having speakers from far and wide, which encourages people like me to drive 10 hours to present there.  Oh, and BBQ.  MmmmmMMmmmm BBQ.


http://www.sqlsaturday.com/444/eventhome.aspx






Offtopic: code is available on github

Just an FYI, since I ran into the problem today. 

Many of my posts contain code.  In order to make it easier to view/copy the code, and so I have a backup of it somewhere, I store my code on GitHub.  Specifically, I use gists.

So, most of my newer code is available via https://gist.github.com/mbourgon.

Additionally, I use inline scripts to make this code available.  So if you're using a cross-site-blocker such as RequestPolicy, you'll need to enable github.com in order to view the code.


Wednesday, July 29, 2015

[Eventlog_Tracking] Send a report out when more than 100 messages in a minute

I'm trying to curb the excess eventlog messages in my environment.  Now that I'm using WQL/WMI/PoSH to capture it (writing the presentation now - tentative name is "WMI for the DB Guy/WQL for the DB Gal"), and filtering it as it comes in, I need a way to *AHEM* encourage people to suggest filters, and get rid of error messages. 

So, they're about to get emails with details when it's over 100 in a minute.  With the most frequent errors listed.  It's got a @threshold parameter so you can turn it down if you don't have too many errors.
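
The general shape of it, if you want to roll your own - table and column names below are guesses based on the earlier EventLog_Tracking posts, the email address is a placeholder, and it assumes Database Mail is already set up; the real proc has more polish:

DECLARE @threshold INT = 100;
DECLARE @body NVARCHAR(MAX);

-- Did any single minute in the last hour blow past the threshold?
IF EXISTS (
    SELECT 1
    FROM dbo.EventLog_Tracking
    WHERE TimeGenerated >= DATEADD(HOUR, -1, GETDATE())
    GROUP BY DATEADD(MINUTE, DATEDIFF(MINUTE, 0, TimeGenerated), 0)
    HAVING COUNT(*) > @threshold
)
BEGIN
    -- List the noisiest sources for the email body.
    SELECT @body = (
        SELECT TOP (20) SourceName, COUNT(*) AS messages
        FROM dbo.EventLog_Tracking
        WHERE TimeGenerated >= DATEADD(HOUR, -1, GETDATE())
        GROUP BY SourceName
        ORDER BY COUNT(*) DESC
        FOR XML PATH('row'), ROOT('top_sources')
    );

    EXEC msdb.dbo.sp_send_dbmail
        @recipients = 'dba-team@example.com',   -- placeholder address
        @subject    = 'Eventlog messages over threshold in the past hour',
        @body       = @body;
END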