Wednesday, June 9, 2021

SQL Workbench/J & Aurora & MariaDB driver getting "could not load system variables" - FIXED!

Using the 2.7.3 MariaDB Java connector with SQL Workbench/J, I was trying to reach my Aurora instance but kept getting this error:

could not load system variables

Lots of hits, none of which had the fix:
  • https://community.dremio.com/t/mysql-source-creation-fails/6738/2
  • https://github.com/sysown/proxysql/issues/2523
  • https://github.com/sysown/proxysql/issues/2009
  • https://stackoverflow.com/questions/63283846/issue-integrating-mariadb-client-with-mysql-db
Finally found an answer here:
  • https://jira.mariadb.org/browse/CONJ-824
As per Diego Dupin:

Aurora's proxy has a known race condition issue that results in skipping other queries in the proxy buffer; the connection might hang waiting for a query response. During authentication, the socket has a timeout, which will result in throwing the error you describe.

So if `usePipelineAuth` or `useBatchMultiSend` is enabled, it might work ... or not. All pipeline options (`usePipelineAuth` and `useBatchMultiSend`) must be disabled when using Aurora. The problem has been reported to Aurora a few times without correction.


What worked for us was appending this to the JDBC URL, after the DB name:

usePipelineAuth=false&useBatchMultiSend=false

like this:

jdbc:mysql://127.0.0.1:3307/mydbname?usePipelineAuth=false&useBatchMultiSend=false

Friday, June 4, 2021

Powershell - using ODBC and the Simba Driver to query AWS Athena

I was running into issues with the Linked Server lopping off long JSON that I'm having to pull out of the raw files. I can't explain it - it doesn't appear to be SSMS. See the previous post.


But I needed to automate this, rather than using SQL Workbench, saving to "Excel" (it was actually XML), then opening it again and re-saving it so that instead of 250mb, it's 30mb. The script below runs against the previous month, one day at a time (walking the partitions), and then saves to a file. You got your Athena, your ODBC, your Export-Excel...



#This has two requirements:
# 1 - Simba installed as an ODBC driver, with a DSN called "your_simba_athena_dsn_name_here".  Free from Amazon for this use.
# 2 - ImportExcel.  Requires Powershell V4 (V5?), but for sure V3 won't work.

#Get-ODBC-Data2 is just a slight modification of Anders' code that includes the UID/PWD. Thanks!
#https://www.andersrodland.com/working-with-odbc-connections-in-powershell/
#Modified because we have to pass in user/password to Athena.
function Get-ODBC-Data2{
    param(
    [string]$dsn,
    [string]$param,
    [string]$query=$(throw 'query is required.')
    )
    $conn = New-Object System.Data.Odbc.OdbcConnection
    $conn.ConnectionString = "DSN=$dsn;$param"
    $conn.open()
    $cmd = New-Object System.Data.Odbc.OdbcCommand($query,$conn)
    $ds = New-Object System.Data.DataSet
    #Fill the dataset; Out-Null so the row count doesn't pollute the function's output.
    (New-Object System.Data.Odbc.OdbcDataAdapter($cmd)).Fill($ds) | Out-Null
    $conn.close()
    #Return just the result table. (Note: a bare $query statement here would
    #also get emitted into the output stream - I had one and it caused grief.)
    $ds.Tables[0]
}


#Set start and end.  Those can be whatever, but I made it one month for simplicity.
[datetime]$start_date_yyyymmdd = (get-date).addmonths(-1).tostring("yyyy-MM-01")
[datetime]$end_date_yyyymmdd = $start_date_yyyymmdd.addmonths(1).tostring("yyyy-MM-01")
#[datetime]$end_date_yyyymmdd = (get-date).addmonths(0).tostring("yyyy-MM-01")
$filename = "Myfile_" + $start_date_yyyymmdd.tostring("yyyyMM")

#Set a variable so that as we walk through, we have an array to save everything to.
$formattedresults=@()

 

#Loop through the days.
while ($start_date_yyyymmdd -lt $end_date_yyyymmdd) {
    $startdate = $start_date_yyyymmdd.tostring('yyyy-MM-dd')
    "starting $startdate"
    $query = @"
select json_extract_scalar(myfield, '$.fieldjsonparsed') as fielda, dt,
fieldb, fieldc
from myschema.mytablename
where dt='$startdate'
"@

    $results = Get-ODBC-Data2 -query $query -dsn "your_simba_athena_dsn_name_here" -param "uid=youraccesskeyhere;PWD=yoursecretkeyhere"
    #Save the non-System.Data.DataRow version to our main array.
    $formattedresults += $results|select-object fielda,fieldb,fieldc
    $start_date_yyyymmdd = $start_date_yyyymmdd.adddays(1)
}

 

#Export using the ImportExcel module, which unfortunately needs at least higher than V3.
$formattedresults| export-excel -AutoSize -BoldTopRow -FreezeTopRow -TableName Test -TableStyle "Light13" -WorkSheetname Test -Path c:\temp\$filename.xlsx


EXECUTE AT against Athena via Linked Server and "does not support the required transaction interface"

 TLDR - using EXECUTE AT with Athena via linked server.  I used https://www.mssqltips.com/sqlservertip/6191/query-aws-athena-data-from-sql-server to get it set up and it works!  But there are some issues.



1) Partition your tables in Athena.  Better performance if you have a decent amount of data.

2) This means you can't just do "select * from <linkedserver>...tablename", as that doesn't tell Athena to do partition elimination.

3) Once you're using OPENQUERY, if you run into issues because it doesn't know what to expect, you can use CAST or CONVERT to force metadata (aka put "cast(fielda as varchar(500))" if it thinks the field is only 255 characters).

4) That doesn't help with over varchar(8000).  You need to move to EXECUTE AT instead, but that seems better than OPENQUERY as it appears to just pass the data back - I got rid of my CASTs in the Athena query. Plus, dynamic SQL is easier! You will have to enable RPC OUT for the Linked Server.

5) I couldn't do INSERT INTO ... EXECUTE AT until I disabled Distributed Transactions in that Linked Server. (See the sketch after this list.)
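
Here's a minimal sketch of both approaches - the linked server name (ATHENA_LINKED) and the table/field names are made up, so adjust to taste:

-- OPENQUERY: force metadata with CAST so the driver doesn't assume varchar(255)
SELECT fielda, dt
FROM OPENQUERY(ATHENA_LINKED,
    'select cast(fielda as varchar(500)) as fielda, dt
     from myschema.mytablename
     where dt = ''2021-06-01''');

-- EXECUTE AT: dynamic SQL passed straight through to Athena.
-- Requires RPC OUT enabled, and INSERT INTO ... EXECUTE AT needed
-- Distributed Transactions disabled on the linked server.
CREATE TABLE #results (fielda varchar(max), dt varchar(10));

DECLARE @sql nvarchar(max) =
    N'select fielda, dt from myschema.mytablename where dt = ''2021-06-01''';

INSERT INTO #results (fielda, dt)
EXECUTE (@sql) AT ATHENA_LINKED;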



Update 2021/06/04 - and there's something screwy with EXECUTE AT.  It runs without the CAST statements and doesn't complain, but it doesn't appear to save everything - I have a handful of rows (out of 100k, mind you) that don't match up; one of the fields is lopped off. It's the most annoying thing, since my other option is SQL Workbench, which WORKS but isn't easily automated.  After futzing with it for a bit, I wound up moving over to a pure Powershell solution - see the post above.

Thursday, March 25, 2021

System_health - parsing out queryProcessing node

Needed this today, my pain = your gain.


System_health has a TON of useful info.  However, it's buried in XML, and my shredder (see other posts with the system_health tag) doesn't go this deep.  

sp_server_diagnostics runs every 5 minutes, and there are other posts on getting some of the data out.  But I needed more, and I needed more than just "right now".  System_health keeps the last 32 or so results, in 5 minute increments.

This is not complete.  But you do get max workers, max in use, max idle, memory allocations, and non-preemptive wait types, both by count and duration.

Enjoy!


if object_id('tempdb..#systemhealthsessiondata') is not null
    DROP TABLE #systemhealthsessiondata

SELECT CAST(xet.target_data AS XML) AS XMLDATA
INTO #SystemHealthSessionData
FROM sys.dm_xe_session_targets xet
JOIN sys.dm_xe_sessions xe
ON (xe.address = xet.event_session_address)
WHERE xe.name = 'system_health'

SELECT C.query('.') EventXML
, T.C.value('@timestamp', 'datetime') as event_timestamp
--queryProcessing basics
, T.C.value('(data/value/queryProcessing/@maxWorkers)[1]','varchar(255)') as maxWorkers
, T.C.value('(data/value/queryProcessing/@workersCreated)[1]','varchar(255)') as workersCreated
, T.C.value('(data/value/queryProcessing/@workersIdle)[1]','varchar(255)') as workersIdle
, T.C.value('(data/value/queryProcessing/@tasksCompletedWithinInterval)[1]','varchar(255)') as tasksCompletedWithinInterval
, T.C.value('(data/value/queryProcessing/@pendingTasks)[1]','varchar(255)') as pendingTasks
, T.C.value('(data/value/queryProcessing/@oldestPendingTaskWaitingTime)[1]','varchar(255)') as oldestPendingTaskWaitingTime
, T.C.value('(data/value/queryProcessing/@hasUnresolvableDeadlockOccurred)[1]','varchar(255)') as hasUnresolvableDeadlockOccurred
, T.C.value('(data/value/queryProcessing/@hasDeadlockedSchedulersOccurred)[1]','varchar(255)') as hasDeadlockedSchedulersOccurred
, T.C.value('(data/value/queryProcessing/@trackingNonYieldingScheduler)[1]','varchar(255)') as trackingNonYieldingScheduler
--non-preemptive waits, from the byCount top-10
, T.C.value('(data/value/queryProcessing/topWaits/nonPreemptive/byCount/wait[@waitType="MEMORY_ALLOCATION_EXT"]/@waits)[1]','varchar(255)') as MEMORY_ALLOCATION_EXT_ct
, T.C.value('(data/value/queryProcessing/topWaits/nonPreemptive/byCount/wait[@waitType="RESERVED_MEMORY_ALLOCATION_EXT"]/@waits)[1]','varchar(255)') as RESERVED_MEMORY_ALLOCATION_EXT_ct
, T.C.value('(data/value/queryProcessing/topWaits/nonPreemptive/byCount/wait[@waitType="OLEDB"]/@waits)[1]','varchar(255)') as OLEDB_ct
, T.C.value('(data/value/queryProcessing/topWaits/nonPreemptive/byCount/wait[@waitType="WRITELOG"]/@waits)[1]','varchar(255)') as WRITELOG_ct
, T.C.value('(data/value/queryProcessing/topWaits/nonPreemptive/byCount/wait[@waitType="PAGEIOLATCH_SH"]/@waits)[1]','varchar(255)') as PAGEIOLATCH_SH_ct
, T.C.value('(data/value/queryProcessing/topWaits/nonPreemptive/byCount/wait[@waitType="ASYNC_NETWORK_IO"]/@waits)[1]','varchar(255)') as ASYNC_NETWORK_IO_ct
, T.C.value('(data/value/queryProcessing/topWaits/nonPreemptive/byCount/wait[@waitType="IO_COMPLETION"]/@waits)[1]','varchar(255)') as IO_COMPLETION_ct
, T.C.value('(data/value/queryProcessing/topWaits/nonPreemptive/byCount/wait[@waitType="PAGELATCH_EX"]/@waits)[1]','varchar(255)') as PAGELATCH_EX_ct
, T.C.value('(data/value/queryProcessing/topWaits/nonPreemptive/byCount/wait[@waitType="PAGEIOLATCH_EX"]/@waits)[1]','varchar(255)') as PAGEIOLATCH_EX_ct
, T.C.value('(data/value/queryProcessing/topWaits/nonPreemptive/byCount/wait[@waitType="CXPACKET"]/@waits)[1]','varchar(255)') as CXPACKET_ct
--non-preemptive waits, from the byDuration top-10 (still pulling @waits; see the sample XML below for averageWaitTime/maxWaitTime)
, T.C.value('(data/value/queryProcessing/topWaits/nonPreemptive/byDuration/wait[@waitType="SQLTRACE_WAIT_ENTRIES"]/@waits)[1]','varchar(255)') as SQLTRACE_WAIT_ENTRIES_ms
, T.C.value('(data/value/queryProcessing/topWaits/nonPreemptive/byDuration/wait[@waitType="CLR_AUTO_EVENT"]/@waits)[1]','varchar(255)') as CLR_AUTO_EVENT_ms
, T.C.value('(data/value/queryProcessing/topWaits/nonPreemptive/byDuration/wait[@waitType="HADR_FILESTREAM_IOMGR_IOCOMPLETION"]/@waits)[1]','varchar(255)') as HADR_FILESTREAM_IOMGR_IOCOMPLETION_ms
, T.C.value('(data/value/queryProcessing/topWaits/nonPreemptive/byDuration/wait[@waitType="TRACEWRITE"]/@waits)[1]','varchar(255)') as TRACEWRITE_ms
, T.C.value('(data/value/queryProcessing/topWaits/nonPreemptive/byDuration/wait[@waitType="ASYNC_NETWORK_IO"]/@waits)[1]','varchar(255)') as ASYNC_NETWORK_IO_ms
, T.C.value('(data/value/queryProcessing/topWaits/nonPreemptive/byDuration/wait[@waitType="WRITELOG"]/@waits)[1]','varchar(255)') as WRITELOG_ms
, T.C.value('(data/value/queryProcessing/topWaits/nonPreemptive/byDuration/wait[@waitType="PAGEIOLATCH_SH"]/@waits)[1]','varchar(255)') as PAGEIOLATCH_SH_ms
, T.C.value('(data/value/queryProcessing/topWaits/nonPreemptive/byDuration/wait[@waitType="OLEDB"]/@waits)[1]','varchar(255)') as OLEDB_ms
, T.C.value('(data/value/queryProcessing/topWaits/nonPreemptive/byDuration/wait[@waitType="PAGEIOLATCH_EX"]/@waits)[1]','varchar(255)') as PAGEIOLATCH_EX_ms
, T.C.value('(data/value/queryProcessing/topWaits/nonPreemptive/byDuration/wait[@waitType="LCK_M_SCH_S"]/@waits)[1]','varchar(255)') as LCK_M_SCH_S_ms


/*
            <byDuration>
              <wait waitType="SQLTRACE_WAIT_ENTRIES" waits="2559734" averageWaitTime="5898" maxWaitTime="304986" />
              <wait waitType="CLR_AUTO_EVENT" waits="42" averageWaitTime="148452579" maxWaitTime="1161135755" />
              <wait waitType="HADR_FILESTREAM_IOMGR_IOCOMPLETION" waits="6283026" averageWaitTime="500" maxWaitTime="38514" />
              <wait waitType="TRACEWRITE" waits="1656275" averageWaitTime="1892" maxWaitTime="15678" />
              <wait waitType="ASYNC_NETWORK_IO" waits="170646385" averageWaitTime="4" maxWaitTime="34341" />
              <wait waitType="WRITELOG" waits="322871416" averageWaitTime="1" maxWaitTime="166927" />
              <wait waitType="PAGEIOLATCH_SH" waits="204943649" averageWaitTime="1" maxWaitTime="157813" />
              <wait waitType="OLEDB" waits="996781932" averageWaitTime="0" maxWaitTime="158157" />
              <wait waitType="PAGEIOLATCH_EX" waits="32693084" averageWaitTime="1" maxWaitTime="121494" />
              <wait waitType="LCK_M_SCH_S" waits="10554" averageWaitTime="4387" maxWaitTime="94983" />
            </byDuration>
          </nonPreemptive>
          <preemptive>
            <byCount>
              <wait waitType="PREEMPTIVE_OS_AUTHENTICATIONOPS" waits="1935312" averageWaitTime="0" maxWaitTime="206" />
              <wait waitType="PREEMPTIVE_OS_CRYPTOPS" waits="1391671" averageWaitTime="0" maxWaitTime="143" />
              <wait waitType="PREEMPTIVE_OS_QUERYREGISTRY" waits="912266" averageWaitTime="0" maxWaitTime="188" />
              <wait waitType="PREEMPTIVE_OS_WRITEFILE" waits="817138" averageWaitTime="30" maxWaitTime="145310" />
              <wait waitType="PREEMPTIVE_OS_GETPROCADDRESS" waits="637470" averageWaitTime="0" maxWaitTime="15" />
              <wait waitType="PREEMPTIVE_XE_CALLBACKEXECUTE" waits="629446" averageWaitTime="0" maxWaitTime="9" />
              <wait waitType="PREEMPTIVE_OS_AUTHORIZATIONOPS" waits="560759" averageWaitTime="0" maxWaitTime="59" />
              <wait waitType="PREEMPTIVE_OS_CLOSEHANDLE" waits="494332" averageWaitTime="1" maxWaitTime="124284" />
              <wait waitType="PREEMPTIVE_OS_CRYPTACQUIRECONTEXT" waits="464831" averageWaitTime="2" maxWaitTime="129" />
              <wait waitType="PREEMPTIVE_OS_REVERTTOSELF" waits="96931" averageWaitTime="0" maxWaitTime="36" />
            </byCount>
            <byDuration>
              <wait waitType="PREEMPTIVE_OS_WRITEFILE" waits="817138" averageWaitTime="30" maxWaitTime="145310" />
              <wait waitType="PREEMPTIVE_OS_CRYPTACQUIRECONTEXT" waits="464831" averageWaitTime="2" maxWaitTime="129" />
              <wait waitType="PREEMPTIVE_OS_CLOSEHANDLE" waits="494332" averageWaitTime="1" maxWaitTime="124284" />
              <wait waitType="PREEMPTIVE_OS_AUTHENTICATIONOPS" waits="1935312" averageWaitTime="0" maxWaitTime="206" />
              <wait waitType="PREEMPTIVE_OS_QUERYREGISTRY" waits="912266" averageWaitTime="0" maxWaitTime="188" />
              <wait waitType="PREEMPTIVE_OS_FILEOPS" waits="18032" averageWaitTime="16" maxWaitTime="3488" />
              <wait waitType="PREEMPTIVE_OS_REPORTEVENT" waits="2032" averageWaitTime="135" maxWaitTime="102489" />
              <wait waitType="PREEMPTIVE_OS_CRYPTOPS" waits="1391671" averageWaitTime="0" maxWaitTime="143" />
              <wait waitType="PREEMPTIVE_OS_FLUSHFILEBUFFERS" waits="1924" averageWaitTime="60" maxWaitTime="3880" />
              <wait waitType="PREEMPTIVE_OS_CREATEFILE" waits="3048" averageWaitTime="26" maxWaitTime="6397" />
            </byDuration>
          </preemptive>
        </topWaits>
*/


FROM #SystemHealthSessionData a
CROSS APPLY a.XMLDATA.nodes('/RingBufferTarget/event') as T(C)
--cross apply T.C.nodes('/event/data') as E(D)
where
T.C.query('.').value('(/event/@name)[1]', 'varchar(255)') in ('sp_server_diagnostics_component_result')
and C.value('(data/text)[1]','varchar(255)')= 'QUERY_PROCESSING'


Thursday, October 8, 2020

UNION ALL returns "conversion... overflowed an int column"

I had a stored procedure throwing the following error:

The conversion of the varchar value '9999999999' overflowed an int column.  


There are two queries joined with UNION ALL, so I split them up to see which one was throwing the error.

Neither.  Of course.  : - \

If you break the query apart, both halves work.  


Okay, so let's figure this out:

exec sp_describe_first_result_set @tsql = N'myqueryhere'

exec sp_describe_first_result_set @tsql = N'myotherqueryhere'


Oh, hey look, varchar in one, int in the other.


Cause? My own stupidity from 7 years ago:

case when fielda is null or fielda = 0 then 0
     else replace(convert(varchar(14),fielda),'.','')
end


On one set of data, there are fields that have values - specifically the 9999999999; the second set returns nothing.

So SQL Server converts everything to the higher-precedence datatype - INT - and 9,999,999,999 is too large for an INT, so it fails.
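
A minimal repro - with UNION ALL, INT outranks varchar in datatype precedence, so the varchar branch gets converted and overflows:

-- Fails: "The conversion of the varchar value '9999999999' overflowed an int column."
SELECT '9999999999' AS fielda
UNION ALL
SELECT 0;

-- Works: force both branches to the same (string) type.
SELECT '9999999999' AS fielda
UNION ALL
SELECT CONVERT(varchar(14), 0);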

Tuesday, June 9, 2020

[AWS] Athena aka Hive/Presto - renaming/mapping a JSON attribute named TIMESTAMP to be called TS

(More Athena stuff coming, probably)

Since I couldn't get Amazon's documentation example with ColumnToJsonKeyMappings to work... (fortunately, the Hive documentation is better, and Athena uses Hive for DDL), here's what did.

My JSON looks like this (which I'm not formatting, because Athena demands every row be on one line, with no fancy "line feeds" or "carriage returns")

{"event":"eventififer","globalid":"3fd6dce3-6650-4e3f-8d8d-a46d1baaca02","timeStamp":"2020-06-08T19:29:31.114Z"}

In this case, "timestamp" is a reserved word, so you have to query it like:
and "timestamp" >= '2020-06-08T18:30:55Z' 

So, in order to rename the field, you could either create a view on top of it, or rename it during the creation.

CREATE EXTERNAL TABLE myawsdatacatalog.mytablename(
event string,
globalid string,
ts string)
partitioned by (dt string)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
with serdeproperties ("mapping.ts" = "timestamp")
LOCATION 's3://mybucket/my/folder/path/';

Yes, ts is a string, because the value has to meet their particular format EXACTLY. If I try TIMESTAMP for the data type of ts, it creates fine! But when I try to query it, I get the following.

HIVE_BAD_DATA: Error parsing field value '2020-06-04T20:01:47.829Z' for field 3: Timestamp format must be yyyy-mm-dd hh:mm:ss[.fffffffff]

If you need a workaround, I'd say to create it as string, then toss a view on top with: 
select from_iso8601_timestamp(ts) as ts
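
Something like this, assuming the table from above (the view name is made up):

-- from_iso8601_timestamp is Presto/Athena SQL, so this is a
-- query-side view, not Hive DDL.
CREATE OR REPLACE VIEW myawsdatacatalog.mytablename_v AS
SELECT event,
       globalid,
       from_iso8601_timestamp(ts) AS ts,
       dt
FROM myawsdatacatalog.mytablename;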

Thursday, March 12, 2020

[Baking] Olga's Prune Cake

Interesting! I googled this (because my copy of the recipe is typed up, which means Baba came across it and liked it, and so kept making it) and found these rather-similar recipes.  My copy is at least 35 years old, so I have no idea who gets the credit.  Except me, when I make it.  I get so much credit.  (I do dig the bit about soft-ball stage - we note the color change and do it then.)


First of all, I grew up thinking this was kinda gross, because who wants to eat baby food? Once I became a parent, I learned that baby food has no extra ingredients, making it ideal (and removing the food processor requirement from the recipe). 

Second of all - my word this is simple and delicious. We made it with a spoon and like 2 bowls in about 10 minutes. The icing takes a little longer and requires a large (5-quart+) bowl to make, because of the crazy reaction when boiling buttermilk + baking soda.  Remember, kids, heat makes reactions stronger!

1.5 cups sugar
1 cup salad oil (preference is Mazola corn oil, but Wesson will also work)
1 tsp salt
2 cups flour
1 tsp soda
1 tsp nutmeg
1/2 cup buttermilk
1 tsp cinnamon
3 eggs (jumbo)
1 cup prunes (we use 2 of the small jars of beech-nut)
1/2 cup nuts (we use pecan, in pieces)
1 tsp vanilla (preferred Mexican vanilla)

Combine sugar, oil and eggs in large bowl and mix well (I mix until the color starts to change). Sift dry ingredients (so measure out 2 cups of flour into the sifter, pour the other dry ingredients on top, and then sift it all). Add prunes, buttermilk and vanilla. (This definitely thins out the batter and makes the whisk you used on the first part usable again.) Stir in nuts. Bake at 350F for 45 minutes.

Topping/icing/frosting/whatever: 
1 cup sugar
1/2 tsp baking soda
1/2 cup buttermilk
1/2 cup butter or margarine (1 stick)
Put all in 4+ quart pot. Mix everything in. Boil for 5 minutes. At about 5 minutes the color changes to be more yellow, and at that point, we pour it over.

Poke a bunch of holes in cake, and, while this and the cake are still hot, pour this over the cake. 


The cake is together...


Making the foamy frosting...



After pouring the icing over...


And after it's absorbed.

Thursday, February 6, 2020

[Powershell] Getting the full text out of a field in SQL when SSMS won't work

We have some code that is basically table-driven SSIS. The advantage is: no digging through SSIS! The disadvantage is that if someone decides to put in code instead of a stored procedure, it's hard to get it out of SSMS.

I wish I knew a better way to handle this, but this works fine.

#This uses the wondrous invoke-sqlcmd2
. c:\powershell_scripts\invoke-sqlcmd2.ps1

$query = @"
SELECT source_sql FROM etl_me.dbo.data_flow_me_config where id = 1
"@

$fullquery = invoke-sqlcmd2 -ServerInstance mem-pr-cde-01 -Query $query


First, to get the data out, I tried:
$fullquery.source_sql

Then I realized I could do this, if I only had one column:
$fullquery[0]

But that can fall down.  This one is slightly better.  Add more columns if you need them - if they're nonexistent, it still works.

$fullquery|%{$_[0] #, $_[1], $_[2]
"-------------------"}

Tuesday, December 10, 2019

MS Teams - posting results of a SQL query as a table via Powershell.

(update 2019/12/10 13:33 - fixing the tables; the formatting I included got screwed up because Blogger decided to render it)

TL;DR - below is code that will post the results of a SQL query, as a table, in a channel within MS Teams.

Special thanks to http://mikeconjoice.com/2018/05/25/send-a-notification-to-microsoft-teams-via-powershell/, who had the original post/code I used.

First of all - MS, you should be embarrassed. All the documentation says to use Office 365 Connector Cards/Adaptive Cards, but it doesn't work for Teams as of December 2019 ("Supported in Teams, not in Bot Framework."), and there's no sense that it'll ever happen, given there's a Connect/User Voice that's been open a year with no updates.

A) Here's some documentation that doesn't actually work for what we're doing.
https://docs.microsoft.com/en-us/microsoftteams/platform/webhooks-and-connectors/how-to/connectors-using
https://techcommunity.microsoft.com/t5/Microsoft-Teams/Microsoft-Teams-Incoming-Webhook-Message-Formatting/m-p/31984
https://docs.microsoft.com/en-us/outlook/actionable-messages/send-via-connectors
https://messagecardplayground.azurewebsites.net/
https://stackoverflow.com/questions/58598592/microsoft-teams-escaping-underscores-in-text-posting-to-webhook

* Note there's a max of 20711 characters in the JSON, and 10 sections in the JSON. (Somewhere else on StackOverflow it says 28k is the max size for an image.)
* Note there's also a limit how often you can post, and it's a "back-off" algorithm, IIRC.

So.... you want to post to a Teams channel automagically.  Should be simple, and it is!  Alas, it means you have to ignore most of the documentation.  Let's do this!

1) Here's how to hook it up with your channel.  Note that when I created a brand new "Team", it took about 5-10 minutes before I was able to add the webhook connector - prior to that, I got a "channel does not exist or has been deleted".
https://docs.microsoft.com/en-us/outlook/actionable-messages/send-via-connectors
(scroll down)

2) Getting a basic post working - this works and will post text.  Note that you can split lines by using "\r\n" in the text, because it sort of supports formatting (see #3)

$webhook_url = "https://outlook.office.com/webhook/  ......" #URL when you added the connector

#Single-quoted here-string, so Powershell doesn't try to expand $schema as a variable.
$json = @'
{
  "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
  "text": "Hi Mom",
  "version": "1.0",
  "type": "AdaptiveCard"
}
'@

Invoke-RestMethod -Method post -ContentType 'Application/Json' -Body $json -Uri $webhook_url


3) Adding a table:
Basic HTML is supported.  And I mean BASIC. It supports less formatting than Outlook (I had a border and had to remove it).


So the JSON would look like:
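
Something like this - a minimal hand-rebuilt sketch, since Blogger mangled my original sample; the table contents are made up:

$json = @'
{
  "text": "<h2>My Results</h2><table><tr><th>name</th><th>value</th></tr><tr><td>rowcount</td><td>42</td></tr></table>"
}
'@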




4) Putting it all together, we can do something like this, which will query SQL Server using invoke-sqlcmd2, create the proper JSON, and post it to our channel (note the below SQL is incomplete):
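
Here's the shape of it (a reconstructed sketch - invoke-sqlcmd2 per the earlier post, with a placeholder server name and query):

. c:\powershell_scripts\invoke-sqlcmd2.ps1

$webhook_url = "https://outlook.office.com/webhook/......" #URL when you added the connector

$results = invoke-sqlcmd2 -ServerInstance myserver -Query "select top 5 name, create_date from sys.databases"

#Build a basic HTML table by hand, since Teams only honors very simple markup.
$table = "<table><tr><th>name</th><th>create_date</th></tr>"
foreach ($row in $results) {
    $table += "<tr><td>$($row.name)</td><td>$($row.create_date)</td></tr>"
}
$table += "</table>"

#Wrap it in the JSON payload and post it.
$json = @{ text = $table } | ConvertTo-Json

Invoke-RestMethod -Method post -ContentType 'Application/Json' -Body $json -Uri $webhook_url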



5) And here's what the powershell script returns, if successful:

1

and in Teams...


If you don't do it right, you'll get errors like:
Invoke-RestMethod : Summary or Text is required.
or the typical "you didn't do it perfectly" error:
Invoke-RestMethod : Bad payload received by generic incoming webhook.

Wednesday, November 13, 2019

[Powershell] loading a module if the SQL Server Agent can't do so automatically

I'm trying to use the SQL Server Agent to run a Powershell script that calls AWS.

But it failed.  

The term 'get-awscredentials' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.

Which is odd because if I start a brand-new ISE on the box, it works.  Grr.

Okay, I must've forgotten the module.  But again, I don't need it with the ISE...

import-module awspowershell

But I got a different error:

import-module : The specified module 'awspowershell' was not loaded because no valid module file was found in any module directory.

Okay, so let's add it.

Where the heck is the module, anyways?
From the ISE, after loading the module by hand, run:

(Get-Module -ListAvailable awspowershell*).path


Then, now that we know that path:

import-module -name "C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\awspowershell"
 
(note awspowershell is in there twice - it's looking for the... PSD1, I think?)
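
So the Agent job step ends up looking like this (Get-Command is just there to prove the cmdlet now resolves):

#Load the module by full path, since the Agent's Powershell session
#doesn't search the same module directories the ISE does.
import-module -name "C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\awspowershell"

#Now the AWS cmdlets resolve:
get-command get-awscredentials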