Over the past few months, we've been working on learning more about our servers. Besides sp_blitz, there's a lot of data we want: DMVs, WMI info, etc. So a coworker and I had a challenge going: which would be the faster way to query our 80+ production servers, SSIS or PowerShell? He had his running faster, but when I asked him to increase the number of simultaneous threads, it was a chore for him to change; for me, it meant altering one line of code in a text file. And it seemed easier for me to add more scripts.
So I decided to make mine, while not (necessarily) the fastest, the easiest to use. It uses PowerShell and some PS scripts originally written by Chad Miller, Arnoud Jansveld, and several other people smarter than me; I just put it all together.
TL;DR: Drop a query in a folder, and it runs several threads in parallel against all your servers, saving the details out to a table, overwriting the old data if you want.
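To make the moving parts concrete, here's a rough sketch of the dispatcher's shape. This is *not* the actual repository_extensible.ps1 (that's linked below); server names, table names, and the exact Split-Job parameters here are placeholders, and passing variables into the Split-Job runspaces is elided:

```powershell
# Hypothetical sketch of the dispatcher loop -- names and parameters are
# illustrative, not the real script. Assumes the helper scripts below are
# saved in c:\sql_tools.
. c:\sql_tools\invoke-sqlcmd2.ps1
. c:\sql_tools\write-datatable.ps1
. c:\sql_tools\split-job.ps1

$repo    = 'REPOSITORYSERVER'   # placeholder: your central repository instance
$servers = Invoke-Sqlcmd2 -ServerInstance $repo -Database repo `
             -Query 'SELECT server_name FROM dbo.servers_to_monitor'

foreach ($file in Get-ChildItem 'c:\sql_tools\repository_scripts_keep\*.sql') {
    $table = $file.BaseName     # the file's base name becomes the destination table
    # Split-Job fans the server list out across parallel runspaces;
    # raising the thread count is the "alter one line" mentioned above.
    $servers | Split-Job -MaxPipelines 10 {
        % {
            $rows = Invoke-Sqlcmd2 -ServerInstance $_.server_name -InputFile $file.FullName
            if ($rows) {
                Write-DataTable -ServerInstance $repo -Database repo `
                    -TableName $table -Data $rows
            }
        }
    }
}
```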
Pros:
- Trivial install - 6 scripts in a folder, a table with a list of servers, 1 or 2 empty subfolders, and 1 job.
- Low overhead - a sample (simple) script ran on 80+ servers in under 6 seconds.
- Easy to add a new collection - just drop a SQL or PS script in the folder, named for the target table.
- Need more servers done faster? Up the threads.
- Skips servers it can't connect to.
- No PowerShell extensions needed - just those 6 scripts. The script is written for PS3; modify the one line for PS2.
Cons:
- Does not deploy code; just runs scripts. (so sp_blitz, for instance, needs to be installed separately)
- No failure info. That's on my to-do list, but it's not nearly as easy as I'd like.
- Datatypes for new tables need to be tweaked if the script creates the table - text fields default to varchar(1000)
To install:
- Create a table with a list of servers to monitor.
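An example of that server-list table; the table and column names here are placeholders (the script just needs one row per instance to query), so use whatever fits your environment:

```sql
-- Placeholder names: match these to whatever your dispatcher script queries.
CREATE TABLE dbo.servers_to_monitor (
    server_name sysname NOT NULL PRIMARY KEY
);

INSERT dbo.servers_to_monitor (server_name)
VALUES ('SERVER01'), ('SERVER02\INSTANCE1');
```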
- Create 3 folders: c:\sql_tools (or wherever; if you change it, change the folder name in the script), and underneath it two script subfolders - one for scripts whose data you want to keep accumulating, and one for scripts whose data you want replaced each run.
- Grab add-sqltable.ps1, write-datatable.ps1, and invoke-sqlcmd2.ps1 from Hey Scripting Guy or poshcode. Put in c:\sql_tools.
- http://blogs.technet.com/b/heyscriptingguy/archive/2010/11/01/use-powershell-to-collect-server-data-and-write-to-sql.aspx
- http://poshcode.org/3679 (Split-Job)
- Save the below script to a file called "repository_extensible.ps1", in c:\sql_tools.
- Create a job with 2 job steps, both as Type: "Operating System (CmdExec)":
- powershell "& c:\sql_tools\repository_extensible.ps1 c:\sql_tools\repository_scripts_keep 0"
- powershell "& c:\sql_tools\repository_extensible.ps1 c:\sql_tools\repository_scripts_delete 1"
- The first step runs the scripts whose records accumulate - each run appends to the table. The second deletes each server's old records and replaces them with the new ones.
- Toss a couple sample scripts in the folders
- For SQL, I recommend Paul Randal's excellent wait stats and latch stats scripts
- http://www.sqlskills.com/blogs/paul/wait-statistics-or-please-tell-me-where-it-hurts/
- http://www.sqlskills.com/blogs/paul/survey-most-prevalent-latch-waits-code-to-run/
- For PowerShell, you will need to tell the script which computer to run against, because we're not using something like SQLCMD, which has a server parameter for the entire script.
- Note the " -computername $args[0]" - that allows us to tell it which server to query.
- Get-WmiObject Win32_LogicalDisk -computername $args[0]
- Name each script tablename_you_want.sql - the base name becomes the table name
- Run the job once.
- Go into your repository database and fix up the datatypes on the new tables; by default, strings become varchar(1000). Not great, but it works for now.
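Fleshing out the WMI one-liner above into a complete drop-in collection script - this is a hypothetical example (the file name "free_space.ps1" and the column choices are mine), but it shows the $args[0] pattern the dispatcher relies on:

```powershell
# free_space.ps1 -- hypothetical collection script; the base name "free_space"
# becomes the destination table. The dispatcher passes the target server
# as the first argument, so capture it before using it in scriptblocks.
$computer = $args[0]

Get-WmiObject Win32_LogicalDisk -ComputerName $computer -Filter 'DriveType = 3' |
    Select-Object @{n='server_name'; e={$computer}},
                  DeviceID,
                  @{n='size_gb'; e={[math]::Round($_.Size / 1GB, 1)}},
                  @{n='free_gb'; e={[math]::Round($_.FreeSpace / 1GB, 1)}}
```

Note the `$computer = $args[0]` line: inside the calculated-property scriptblocks, `$args` refers to that scriptblock's own (empty) arguments, so the server name has to be captured in a variable first.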
2 comments:
Hello, this looks a great technique.
I note the post is quite old though, have you modified the solution for new powershell versions?
Thanks
Rich
Alas, no. It's been humming along for 4 years now and has been pretty stable. I know there are things I could do with it in newer versions of PowerShell, but I don't think there's much I'd change. IIRC, the Split-Job I use already uses runspaces, so there wouldn't be any real advantage there.
If I were to make changes, they would be around add-datatable: sizing the varchars properly, and adding a clustered index when it creates the table. I'd also be tempted to use Chrissy LeMaire's code to insert rows instead (since it's super-fast), so it wouldn't use as much RAM for variables while it runs.
But for what I'm doing, it works well, so I'm busy solving other problems instead. Thanks for asking, though!