Friday, December 4, 2009

Replication and DDL Triggers - DO NOT MIX

So, we had started rolling out DDL Triggers, and then today replication broke.

How? A weird ARITHABORT error when trying to add a table via the GUI. So I disabled the trigger and added the table - at which point replication itself started throwing this error:

Target string size is too small to represent the XML instance (Source: MSSQLServer, Error number: 6354)
Get help: http://help/6354


Well, it turns out I have a DDL trigger that captures XML event data on the TARGET database, and it can't deal with the large commands involved in replicated transactions.

How to diagnose and find the exact commands causing problems?


use [published_database] --the database containing the published article, not the table
go
sp_helparticle @publication = N'publication_name', @article = N'article_name'
go
use distribution
go
sp_browsereplcmds @article_id = 77 --where 77 is the article_id returned above

Wednesday, November 18, 2009

DDL Triggers

We're slowly starting to roll this out, based on the code below. A very well-written article; our concern is performance.

http://www.sql-server-performance.com/articles/audit/ddl_triggers_p1.aspx


--------------------
UPDATE:
Below is the code we originally rolled out, then rolled back due to XML errors with replication (see the other posts tagged DDL Triggers).

With my luck it was something stupid in my code, but I haven't gone back and looked - I'm using Event Notifications now.



--1.1 version MDB 20091119.  Removed the XML field as that's a lot of data being held for no reason.
/*
use dba_repo
If Object_ID('dba_repo.dbo.DDL_Event_Log') IS NOT NULL
DROP TABLE dbo.DDL_Event_Log
CREATE TABLE dbo.DDL_Event_Log(

ID int IDENTITY(1,1) NOT NULL,
EventTime datetime NULL,
EventType varchar(15) NULL,
LoginName VARCHAR(50),
ServerName varchar(25) NULL,
DatabaseName varchar(25) NULL,
ObjectType varchar(25) NULL,
ObjectName varchar(60) NULL,
UserName varchar(15) NULL,
CommandText varchar(max) NULL
--,Entire_Event_Data XML
)
go
*/
CREATE TRIGGER [ddltrg_Audit_Log] ON DATABASE -- Create Database DDL Trigger
FOR CREATE_TABLE, DROP_TABLE, ALTER_TABLE,
CREATE_INDEX, DROP_INDEX, ALTER_INDEX,
CREATE_VIEW, ALTER_VIEW, DROP_VIEW,
CREATE_SCHEMA, ALTER_SCHEMA, DROP_SCHEMA,
CREATE_FUNCTION, ALTER_FUNCTION, DROP_FUNCTION,
CREATE_PROCEDURE, ALTER_PROCEDURE, DROP_PROCEDURE,
CREATE_TRIGGER, ALTER_TRIGGER, DROP_TRIGGER,
CREATE_USER, ALTER_USER, DROP_USER
/*
CREATE TRIGGER ddltrg_Server_Audit_Log ON ALL SERVER -- Create Database DDL Trigger
FOR
CREATE_DATABASE, ALTER_DATABASE, DROP_DATABASE
*/
AS
--http://www.sql-server-performance.com/articles/audit/ddl_triggers_p1.aspx
--http://searchsqlserver.techtarget.com/tip/0,289483,sid87_gci1346274,00.html for event types
--See http://msdn.microsoft.com/en-us/library/ms189871%28SQL.90%29.aspx for event types
SET NOCOUNT ON
If Object_ID('dba_repo.dbo.DDL_Event_Log') IS NOT NULL
BEGIN
DECLARE @xmlEventData XML
-- Capture the event data that is created
SET @xmlEventData = eventdata()
-- Insert information to a Event_Log table
INSERT INTO dba_repo.dbo.DDL_Event_Log
(
EventTime,
EventType,
LoginName,
ServerName,
DatabaseName,
ObjectType,
ObjectName,
UserName,
CommandText
-- , Entire_Event_Data
)

SELECT REPLACE(CONVERT(VARCHAR(50), @xmlEventData.query('data(/EVENT_INSTANCE/PostTime)')),'T', ' '),
CONVERT(VARCHAR(15), @xmlEventData.query('data(/EVENT_INSTANCE/EventType)')),
CONVERT(VARCHAR(50), @xmlEventData.query('data(/EVENT_INSTANCE/LoginName)')),
CONVERT(VARCHAR(25), @xmlEventData.query('data(/EVENT_INSTANCE/ServerName)')),
CONVERT(VARCHAR(25), @xmlEventData.query('data(/EVENT_INSTANCE/DatabaseName)')),
CONVERT(VARCHAR(25), @xmlEventData.query('data(/EVENT_INSTANCE/ObjectType)')),
CONVERT(VARCHAR(60), @xmlEventData.query('data(/EVENT_INSTANCE/ObjectName)')),
CONVERT(VARCHAR(15), @xmlEventData.query('data(/EVENT_INSTANCE/UserName)')),
CONVERT(VARCHAR(MAX), @xmlEventData.query('data(/EVENT_INSTANCE/TSQLCommand/CommandText)'))
-- , @xmlEventData
END
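
For reference, a minimal sketch of the Event Notifications route we moved to - the queue, service, and notification names here are made up, and the database needs Service Broker enabled:

USE dba_repo
GO
--prerequisite: ALTER DATABASE dba_repo SET ENABLE_BROKER
CREATE QUEUE dbo.DDLAuditQueue
GO
CREATE SERVICE DDLAuditService ON QUEUE dbo.DDLAuditQueue
([http://schemas.microsoft.com/SQL/Notifications/PostEventNotification])
GO
--DDL_DATABASE_LEVEL_EVENTS is an event group covering the events the trigger listed (and more)
CREATE EVENT NOTIFICATION ddl_audit_notification
ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
TO SERVICE 'DDLAuditService', 'current database'
GO

You then RECEIVE the XML messages off the queue on your own schedule, so the logging happens outside the DDL transaction - which is presumably why it doesn't trip up replication the way the trigger did.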


Wednesday, October 28, 2009

Piecemeal (aka Partial) filegroup restores from litespeed (sorta)

Okay, I wanted to write this down while I'm thinking about it.

I use filegroups and do filegroup backups. Extensively. Gives you more control and power, since you can back up each one individually, put it in its own location, give it its own schedule, etc, etc. Litespeed helps with this, obviously, since it'll compress each file.

The downside is that restoring with Litespeed is an all-or-nothing ordeal. You can't restore just the primary, or just one of the filegroups, or just a table (no, object-level restore doesn't work on filegroup backups). Fortunately, SQL 2005 has a new "PARTIAL" option that allows you to restore just the primary filegroup, and then restore others in addition, so I'm using that.

I've already requested object-level restore from filegroup backups as a Litespeed feature, along with PARTIAL support, but here's what I had to do to restore some tables from my filegroup backup:

0. Take filegroup backups. Take a TLOG backup if needed.

1. Decompress the files I need. Litespeed _does_ offer an "extractor" utility. This will turn your Litespeed BKP into an uncompressed SQL backup. One note: it creates 7 files, and the total will be the same as the uncompressed backup. So make sure you have room. I barely did - thank god for filegroups (or maybe not, since the whole purpose of this post is working around it). So, extract the primary and whichever filegroup you need.
Here's the command to extract a file:
extractor.exe -Fc:\temp\Northwind.bak -Ec:\temp\NorthwindNative.bak
On my system it converts to native at about 0.75 GB per minute (uncompressed size). It's extremely disk-bound. You need to extract the log file too, if you're compressing it (you might not be! Check and see).


2. Restore the primary filegroup WITH PARTIAL. Make sure to keep your Primary filegroup small. Mine isn't - I need to fix that. Here's the command I used:
RESTORE DATABASE main
FILE = 'main_system_01'
FROM DISK = 'K:\primary.bak0',
DISK = 'K:\primary.bak1',
DISK = 'K:\primary.bak2',
DISK = 'K:\primary.bak3',
DISK = 'K:\primary.bak4',
DISK = 'K:\primary.bak5',
DISK = 'K:\primary.bak6'
WITH partial, move 'main_system_01' TO 'E:\MSSQL_Data\main.mdf' ,
move 'main_log_01' to 'F:\MSSQL_Log\main_log.ldf', --without this, it will fail
FILE = 1, norecovery, nounload, stats = 10


3. Wait a couple hours for the primary to restore. (See? This is why you have a small primary filegroup.)


4. Restore the additional filegroup. You do _not_ use the PARTIAL keyword here. Why? Don't know - using PARTIAL caused it to fail, though my lab notes say to use it. My bet is that you don't need PARTIAL if you're restoring the only filegroup within a backup.
RESTORE DATABASE Main
FILE = 'Main_Secondary_01'
FROM DISK = 'K:\FG_Main_Secondary.bak0',
DISK = 'K:\FG_Main_Secondary.bak1',
DISK = 'K:\FG_Main_Secondary.bak2',
DISK = 'K:\FG_Main_Secondary.bak3',
DISK = 'K:\FG_Main_Secondary.bak4',
DISK = 'K:\FG_Main_Secondary.bak5',
DISK = 'K:\FG_Main_Secondary.bak6'
WITH --partial,
move 'Main_Secondary_01' TO 'E:\MSSQL_Data\Secondary.ndf' ,
FILE = 1, norecovery, nounload, stats = 10


5. Restore the log. I'd say to use STOPAT so you don't have to keep restoring TLOGs, but I can't find anywhere how to get that to work with Litespeed's restore - I tried adding it to the @with, but then it just failed. Odd. Also, if I extract and restore a log, I get error messages about having to roll the LSN forward to a certain point, whereas I don't get that when running the below command.

EXEC master.dbo.xp_restore_log
@database = 'Main'
, @filename = '\\myserver\SQL_Backups\Main\Main_tlog_200802041030.TRN'
, @filenumber = 1
,@WITH ='RECOVERY'
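
For what it's worth, the native (non-Litespeed) equivalent I was trying to reproduce looks like this - the path and time are placeholders:

RESTORE LOG Main
FROM DISK = 'K:\Main_tlog_200802041030.TRN'
WITH STOPAT = '2008-02-04 10:15', NORECOVERY, STATS = 10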


6. Keep running restores until you get current. The way I did it, I kept running TLOG restores until I got an error message telling me the database is offline. It does this once you hit the last log file prior to the last database backup finishing.

7. Bring database ONLINE. I do it via the GUI.

8. Bring database from SINGLE-USER to MULTI-USER.

[Tuning] Getting file disk usage

Short and simple. Uses the sys.dm_io_virtual_file_stats DMV to tell you how much IO each file is doing. What you think may be the case is not necessarily the case. And remember that backups count. I'd recommend taking 2+ samples during the day and comparing - that'll help remove the backups from the end totals (see the sketch after the query). Sample_MS is the number of milliseconds the computer has been up.

SELECT 
  master_files.NAME, 
  master_files.type_desc,
  master_files.physical_name, 
  vfs.sample_ms,
  vfs.num_of_bytes_written, 
  num_of_bytes_read, 
  size_on_disk_bytes/1024/1024/1024 AS Size_in_GB
FROM sys.dm_io_virtual_file_stats(null, null) vfs
INNER JOIN MASTER.sys.master_files 
 ON vfs.[file_id] = master_files.[file_id] 
 AND vfs.database_id = master_files.database_id
ORDER BY (num_of_bytes_read + num_of_bytes_written) desc
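
And here's a rough sketch of the two-sample comparison mentioned above - snapshot the DMV into a temp table, wait out the interval (one hour here, purely arbitrary), then diff:

SELECT database_id, [file_id], num_of_bytes_read, num_of_bytes_written
INTO #io_snap
FROM sys.dm_io_virtual_file_stats(NULL, NULL)

WAITFOR DELAY '01:00:00' --pick your own interval

SELECT vfs.database_id, vfs.[file_id],
 vfs.num_of_bytes_read - s.num_of_bytes_read AS interval_bytes_read,
 vfs.num_of_bytes_written - s.num_of_bytes_written AS interval_bytes_written
FROM sys.dm_io_virtual_file_stats(NULL, NULL) vfs
INNER JOIN #io_snap s
 ON s.database_id = vfs.database_id
 AND s.[file_id] = vfs.[file_id]
ORDER BY (vfs.num_of_bytes_read - s.num_of_bytes_read)
 + (vfs.num_of_bytes_written - s.num_of_bytes_written) DESC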

Thursday, August 20, 2009

[Compression] Seeing how much savings you'll get

If you meet all sorts of criteria (SQL Server 2008, Enterprise Edition) then you can use row or page level compression on your tables. But, you ask: will it make a difference in space used?

Fortunately, and a bit surprisingly, Microsoft came up with a way to find out:
sp_estimate_data_compression_savings

Sample usage:

sp_estimate_data_compression_savings
@schema_name = 'dbo'
, @object_name = '20090820__abc'
, @index_id = NULL --NULL does all.
, @partition_number = null --if you use partitioned tables
, @data_compression = 'PAGE' --can also use ROW or NONE
go


It looks like it takes roughly 40 MB of data, copies it to TEMPDB, and compresses it there. It then returns the results, comparing the estimated size against the table's current compression.

For us, mixed results. One set of EDI data, which uses certain characters to split out values, gets roughly 25% compression. A different set of EDI data that uses XML (with huge swathes of repeating data) gets a whopping 1% savings.

I love the idea. I want to use it everywhere - since most systems are IO bound (not CPU bound) it seems a home run. But the requirement of Enterprise Edition lessens its usefulness by a _lot_.

Monday, July 27, 2009

[Jobs] Search job history for a particular date range

Pieces cribbed off several people. Enjoy.


select job_name, run_datetime, run_duration from (
select job_name, run_datetime,
SUBSTRING(run_duration, 1, 2) + ':' + SUBSTRING(run_duration, 3, 2) + ':' +
SUBSTRING(run_duration, 5, 2) AS run_duration
from
(
select DISTINCT
j.name as job_name,
run_datetime = CONVERT(DATETIME, RTRIM(run_date)) +
(run_time * 9 + run_time % 10000 * 6 + run_time % 100 * 10) / 216e4,
run_duration = RIGHT('000000' + CONVERT(varchar(6), run_duration), 6)
from msdb..sysjobhistory h
inner join msdb..sysjobs j
on h.job_id = j.job_id
) t
) t
WHERE run_datetime between '2009/07/24 01:00' AND '2009/07/24 01:59'
order by job_name, run_datetime

[Servers] No linked server? Need inserts? No problem!

Linked servers can be handy. But also occasionally annoying, and not necessarily where you need them. Fortunately, OPENROWSET to the rescue. Yes, you can use it to do an ad hoc query against another server. But you can also use it to do an insert into another server.

Code:
INSERT INTO OPENROWSET('SQLNCLI', 'Server=yourserversname;Trusted_Connection=yes',
 'SELECT * FROM northwind.dbo.targettable') SELECT FieldA, FieldB, FieldC
FROM sourcetable 
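
One caveat: ad hoc OPENROWSET queries are disabled by default on 2005 and later, so you may need to flip the switch first (requires sysadmin):

EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'Ad Hoc Distributed Queries', 1
RECONFIGURE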


2016/05/11 And here's the generic OPENROWSET syntax you have to use to get SQL authentication. Posting it here since I just blew 30 minutes trying to get it to work; despite what the documentation says, this is what works (at least on SQL 2012 SP3 querying SQL 2008 SP4):

SELECT * FROM 
OPENROWSET('SQLNCLI', 'Server=myserver;UID=mylogin;PWD=mypassword;',
 'select @@version') 

Monday, June 29, 2009

[Tip] getting local context from sp_ in master

In SQL 2005, if you create an SP in master that queries the system tables (or the INFORMATION_SCHEMA views), it'll return details from the master database - regardless of which database you call it from. In SQL 2000, it returned results for the calling database.

So, an unsupported workaround, pointed out to me by Erland Sommarskog, SQL MVP & SQL God:
EXEC sp_MS_marksystemobject [your_sp_name]
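
Here's the whole pattern as a sketch - the proc and database names are made up:

USE master
GO
CREATE PROCEDURE dbo.sp_tablecount
AS
SELECT COUNT(*) AS table_count FROM INFORMATION_SCHEMA.TABLES
GO
EXEC sp_MS_marksystemobject 'sp_tablecount'
GO
USE Northwind
GO
EXEC sp_tablecount --now counts Northwind's tables, not master's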

Friday, June 12, 2009

[Cruft] New DMV in 2008 - SP details

One of the things I've been hoping/waiting/praying for in SQL 2008 is the ability to see when an SP was last run. In 2000 and 2005 you can do it, but only while the plan is still in the cache - which means things that only run sporadically won't show up.

Imagine my happiness:
SELECT * FROM sys.dm_exec_procedure_stats


Extra credit: join to sys.procedures to get the object name.
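
Something like this (a sketch; the WHERE is needed because the DMV is server-wide while sys.procedures is per-database):

SELECT p.name, ps.execution_count, ps.last_execution_time
FROM sys.dm_exec_procedure_stats ps
INNER JOIN sys.procedures p
 ON p.[object_id] = ps.[object_id]
WHERE ps.database_id = DB_ID()
ORDER BY ps.last_execution_time DESC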

So now you can see what has run since the server was started. And since your server stays up for several months at a time, anything that never shows up makes a good "hit list" of procs that can be scripted out and removed.

I'll update this post when we get a good working version.
(Thanks to Hammer for his help)

Thursday, June 11, 2009

[Cruft] Fixing dependencies

EXEC sys.sp_refreshsqlmodule 'dbo.MyProcFnOrView'

That will update your dependencies table, allowing you to use sp_depends even if the objects weren't created in order. New in 2005 - thankfully someone at MS realized and fixed sp_depends' weakness.

When objects are added, rows are added to the internal dependency table. However, it's very easy to get out of sync - say if you add SP A that calls SP B, but add them in the reverse order. Which _never_ happens. Sure.

So, anyhow. Use that, then use one of the handy pieces of code online that assume that the dependency table always works.


UPDATE 2012:
or, you could always use the versions that 2008+ have (don't bother on 2005; they're not there):

sys.dm_sql_referenced_entities
sys.dm_sql_referencing_entities


sys.sql_expression_dependencies
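
Sample usage - both take an object name plus a class, with 'OBJECT' being the common one (dbo.MyTable is a placeholder):

SELECT referencing_schema_name, referencing_entity_name
FROM sys.dm_sql_referencing_entities('dbo.MyTable', 'OBJECT')

SELECT referenced_schema_name, referenced_entity_name
FROM sys.dm_sql_referenced_entities('dbo.MyProcFnOrView', 'OBJECT')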

Thursday, April 23, 2009

[Code] Powershell - a basic script for all servers

Just acquainting myself with Powershell - I've known about it, but have preferred using Cygwin's shell. However, I wanted to run some code against all servers, and this seemed a good time to try with Powershell.

I found this original script, which would run the same code against all servers in a list. (http://www.quicksqlserver.com/2009/01/powershell-sqlcmd-and-invoke-expression.html)

foreach ($svr in get-content "C:\MyInstances.txt"){
$svr
invoke-expression "SQLCMD -E -S $svr -i createMyuser.sql"

}


So then I adapted it - using SQLCMD /Lc to get a list of servers on the network, printing each server name, then running SQLCMD against each server with the c:\simplemodel.sql script (while you can run SQL inline, PowerShell kept parsing the @@ and choking on it). Not foolproof, mind you, but a good starting point.

foreach ($svr in (sqlcmd /Lc))
{
$svr
invoke-expression "SQLCMD -E -S $svr -i c:\simplemodel.sql"
}

Tuesday, April 21, 2009

[Maint] Quick and Dirty maintenance

Is this suitable most places? No. Is it fast and easy? Yes.

Reindex all tables in a database (2000 & 2005)
exec sp_MSforeachtable "DBCC DBREINDEX ('?')"

or
EXEC sp_MSforeachtable "print '?' DBCC DBREINDEX ('?', ' ', 85)"


Update statistics in a database
For 2000 (since sp_updatestats can break certain things)
EXEC sp_MSforeachtable "update STATISTICS ?"

For 2005:
sp_updatestats

Monday, April 13, 2009

[Cooking] Marathon weekend

So, overall a good weekend. If ever there was a post of mine that qualified as "useless twitter feed", this would be it.

I made the following this Sunday:
Latkes for breakfast - a bit salty when I doubled the recipe. Maybe I subbed Tbsp for Tsp?

Dinner:
Ham (prefab from Sam's), Mixed veggies (prefab steamer bag), Prefab bread (I know, I know - just didn't have time or forethought to do something special), and I made:
Glazed Carrots - worked well, but for some reason I needed to cook them longer than the recipe stated. The glaze wasn't perfect (next time try 1.5 cups of Ginger Ale), but it was passable.
Bag Lady's Favorite Chocolate Pound Cake. Don't make this in a Bundt pan - it'll overflow. I used an Angel food pan, and it worked perfectly. Pulled it out after 85 minutes and it was done and came out moist. I used the Hershey's Perfectly Chocolate Icing (recipe below) and that worked wonderfully with it.

I do need to see if there's some sort of liquid-center bundt-cake recipe, where when it bakes enough it's cake, and where it's not it's pudding-like. That would be yummy.

Here's the recipe - I didn't make the cake, just the icing.
http://www.hersheys.com/recipes/recipes/detail.asp?id=184
"PERFECTLY CHOCOLATE" CHOCOLATE FROSTING

1/2 cup (1 stick) butter or margarine
2/3 cup HERSHEY'S Cocoa
3 cups powdered sugar
1/3 cup milk
1 teaspoon vanilla extract

Melt butter. Stir in cocoa. Alternately add powdered sugar and milk, beating to spreading consistency. Add small amount additional milk, if needed. Stir in vanilla. About 2 cups frosting.

Friday, April 10, 2009

[Backups] Restoring Litespeed archival backups to secondary server

After a year or so, databases on my primary server get moved to a secondary server. I wrote the below script to look in a particular folder for a backup; it then determines where to restore it based on the database name. Hopefully someone else can use this.



DECLARE @backup_directory VARCHAR(500), @Verified VARCHAR(100), @full_backup_name VARCHAR(600),
@database_name sysname, @restore_directory VARCHAR(500),
@logical_name_log VARCHAR(50), @physical_name_log VARCHAR(50),
@logical_name_data VARCHAR(50), @physical_name_data VARCHAR(50)
,@with_data VARCHAR(200), @with_log VARCHAR(200)


---------------------------------------------------------------------
--put the name of the folder with the backup you want restored here--
---------------------------------------------------------------------
SET @backup_directory = '\\servername\share\databasefolderbackup'


IF RIGHT(@backup_directory, 1) <> '\'
SET @backup_directory = @backup_directory + '\'

IF OBJECT_ID('tempdb..#Listing') IS NOT NULL
DROP TABLE #Listing
create table #Listing
(resultant nvarchar (255))

--Find the backups in the given folder
declare @dirlist varchar(500)
select @dirlist = 'exec master..xp_cmdshell ''dir /b "'+ @backup_directory + '*"'''
insert #Listing exec (@dirlist)

--get the most recent
select @Verified = MAX(resultant) from #listing WHERE resultant LIKE '%.bak'
DROP TABLE #Listing

--get a list of all logical files within the backup so we can restore with the right data
SELECT @full_backup_name = @backup_directory + @Verified

if object_id('tempdb..#database_details') is not null
DROP TABLE #database_details

CREATE TABLE #database_details
(
LogicalName sysname,
PhysicalName varchar(500),
[TYPE] VARCHAR(2),
FileGroupName varchar(50),
[Size] VARCHAR(50),
[MaxSize] VARCHAR(50))
INSERT INTO #database_details
EXEC MASTER..xp_restore_filelistonly @filename = @full_backup_name

--get the names of everything.
SELECT @logical_name_data = LogicalName, @physical_name_data = PhysicalName
FROM #database_details WHERE [TYPE] = 'D'
SELECT @logical_name_log = LogicalName, @physical_name_log = PhysicalName
FROM #database_details WHERE [TYPE] = 'L'

SELECT @database_name = @logical_name_data

--set up the folders where the restore will automatically go.
SELECT @physical_name_data = CASE
WHEN @logical_name_data LIKE '2009%' THEN 'L:\2009\'
WHEN @logical_name_data LIKE '2008%' THEN 'M:\2008\'
ELSE 'not known'
END
+ right(@physical_name_data, CHARINDEX('\', reverse(@physical_name_data))-1)

SELECT @physical_name_log = CASE
WHEN @logical_name_data LIKE '2009%' THEN 'L:\2009_Logs\'
WHEN @logical_name_data LIKE '2008%' THEN 'M:\2008_Logs\'
ELSE 'not known'
END
+ right(@physical_name_log, CHARINDEX('\', reverse(@physical_name_log))-1)

--need to set these separately since you can't concatenate strings in the parameter list
SELECT @with_data = 'MOVE "' + @logical_name_data + '" TO "' + @physical_name_data + '"'
SELECT @with_log = 'MOVE "' + @logical_name_log + '" TO "' + @physical_name_log + '"'

--Now do a restore with MOVE.
exec master..xp_restore_database @database=@database_name
, @filename= @full_backup_name
, @with = @with_data
, @with = @with_log

[Objects] dropping objects via code

The following code works in both SQL 2000 and SQL 2005.

Tables

if object_id('tempdb..#database_details') is not null
DROP TABLE #database_details


Stored Procedures

IF EXISTS
(
SELECT * FROM dbo.sysobjects
WHERE id = OBJECT_ID(N'[dbo].[the_procedure_name]')
AND OBJECTPROPERTY(id, N'IsProcedure') = 1
)
DROP PROCEDURE the_procedure_name


Alternatively, the following code will CREATE a dummy SP if it doesn't exist, then ALTER it. This way you only ever CREATE, never DROP, which can come in handy in certain circumstances since it preserves permissions. Note that this uses INFORMATION_SCHEMA, which is more portable than sysobjects.


IF NOT EXISTS
(
SELECT * FROM INFORMATION_SCHEMA.ROUTINES
WHERE ROUTINE_NAME = 'the_routine_name'
and routine_type = 'PROCEDURE' --could also be 'FUNCTION' for a function
)
EXEC ('CREATE PROC dbo.the_procedure_name AS SELECT 1')
GO
ALTER PROCEDURE dbo.the_procedure_name
AS
SELECT *
FROM myTable

Wednesday, April 8, 2009

[Backups] Determine your database growth via backup history

One thing about backups in SQL Server is that the history is kept forever, unless you clean it up using a maintenance plan or one of the not-very-well-documented SPs (msdb.dbo.sp_delete_backuphistory, for one).

But we can have it work for us.
Case in point - database growth estimations. This is a basic view that will show what your database growth has been like for the past 60 days. Turn it into a chart with reporting services, and you can see what's growing, at what rate, and what you need to be concerned with.

Yes, this is pretty basic code, but I hadn't seen anybody do this before.
And obviously, if you're cleaning up your backup history this won't necessarily do much.


CREATE VIEW [dbo].[backup_history]
as
SELECT
server_name,
DATABASE_name,
-- catalog_family_number, --not sure what this does; unclear in BOL
backup_size/1000000 AS backup_size, --bytes to (rough) MB
CONVERT(CHAR(12),backup_start_date,101) AS backup_date--,
-- in case you want to look at a particular type of backup
-- CASE [type]
-- WHEN 'D' then 'Database'
-- WHEN 'I' then 'Differential database'
-- WHEN 'L' then 'Log'
-- WHEN 'F' then 'File or filegroup'
-- WHEN 'G' then 'Differential file'
-- WHEN 'P' then 'Partial'
-- WHEN 'Q' then 'Differential partial'
-- END AS Backup_Type,
-- [NAME],
-- [description]
FROM msdb.dbo.backupset
WHERE [TYPE] IN ('D','F') --full and file/filegroup backups, though tlogs could be interesting
AND server_name = @@SERVERNAME
AND database_name NOT IN ('msdb', 'MASTER', 'model')
AND backup_start_date > GETDATE()-60
--ORDER BY SERVER_name, DATABASE_name, backup_start_date, catalog_family_number
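
And a sample query against the view - good raw material for that Reporting Services chart. Converting backup_date back to datetime keeps the sort chronological:

SELECT DATABASE_name, backup_date, MAX(backup_size) AS backup_size
FROM dbo.backup_history
GROUP BY DATABASE_name, backup_date
ORDER BY DATABASE_name, CONVERT(datetime, backup_date)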

Wednesday, March 4, 2009

[Entree] Gordon Ramsay Cookalong Lasagna, Take 2

I've been following Gordon Ramsay's "Cookalong Live", albeit a bit delayed. However, I have to share and show off my attempt at his Lasagna Al Forno. The first time I did full mise en place, and it took about 75 minutes to put together. The second time it took about 45 minutes, and it is utterly fantastic. Only takes a few ingredients that you might not have, nothing really too special (except for the pre-cooked lasagna sheets), and comes together VERY easily.

Highly recommended. There are even videos on how to make it.
Two hints:
1) Tomato paste, not puree - the Brits apparently name things differently.
2) Mince - ground beef, about 2/3 to 3/4 of a pound. I use 85/15 ground beef and it comes out wonderfully.

http://www.channel4.com/food/recipes/chefs/gordon-ramsay/gordon-s-lasagne-recipe_p_1.html

And since they can't be arsed to keep the recipe in one place (or even searchable on their own site...)

2 tbsp olive oil
½ large onion, peeled
1 large carrot, peeled
2 cloves garlic, peeled
2 pinches dried oregano
300g minced beef
1 tbsp tomato purée
1 tbsp Worcestershire sauce
1 bay leaf
30ml (2 tbsp) red wine
1 x 400g tin chopped tomatoes
50ml milk
Salt and freshly ground black pepper
For the sauce
25g butter
25g flour
300ml milk
Pinch of ground nutmeg
60g Cheddar cheese, grated
30g Parmesan cheese, grated
6 sheets of 'non-cook' lasagne sheets
For the salad
1 tsp Dijon mustard
1 tbsp white wine vinegar
2-3 tbsp olive oil
Salt and pepper
1 x round lettuce head, rinsed and dried
METHOD

How to make Gordon's classic lasagne al forno
1. Pre-heat the oven to 220C/ gas mark 7.

2. Heat the olive oil in a hot pan. Grate the onion and carrot and crush the garlic before frying together. Season with the bay leaf, a pinch of oregano, Worcestershire sauce and a little salt and pepper. Allow the onion to soften before making a well in the centre of the pan. Place the mince in the middle of the pan and stir to break it up. Add the tomato puree and allow to cook out for 30 seconds. Continue until all the meat has browned nicely. Add the wine and cook off the alcohol before adding the tomatoes. Leave to simmer for a further 2-3 minutes. Finally add the milk, turn off the heat and set aside (watch Gordon's video on how to prepare bolognaise).

3. To make the cheese sauce, first melt the butter in a saucepan. Add the flour and using a wooden spoon, stir to form a paste. Over a gentle heat add a third of the milk, whisking to prevent any lumps forming. Add the rest of the milk a third at a time, whisking as you go. Season with salt and pepper and a pinch of ground nutmeg. Allow the sauce to cook out for another minute before adding the Cheddar cheese. Stir and remove from the heat (watch Gordon's video on how to make the best white sauce).

4. Spoon half of the meat sauce into the bottom of the baking dish and place pasta sheets on top (break the sheets if necessary to avoid any overlapping). Next, pour in just under half of the cheese sauce, and spread evenly using a spatula before spooning the remaining meat on top. Add the final layer of pasta and use the spatula to pour over the remaining cheese sauce.

5. Finish with the grated Parmesan and sprinkle with another pinch of oregano. Add a light seasoning of salt and pepper before cleaning the edges of the dish and placing in the oven to bake for 20-25 minutes, or until golden brown.

6. In the bottom of your salad bowl use a fork to whisk together the mustard, vinegar and olive oil. Season with salt and pepper. Carefully open the head of lettuce, season inside with salt and pepper. Upend the lettuce headfirst into the salad bowl. Holding the root, wipe the leaves around the bowl to coat in the vinaigrette. Twist the root and pull it out. Turn the dressed head of lettuce onto a large plate and gently open out.

7. Portion out the lasagne and serve alongside the salad.

Monday, March 2, 2009

[Russian] Olga's Haluski (halushki/helushki)

No baking on this one, but a really good Russian version of dumplings. This recipe includes all the stuff we serve it with; potatoes, kielbasa, onions. It's very simple, tastes really good, is very filling, and cheap. Russian poverty food, with some meat added.

Ingredients:
  • Haluski

    • 4 Cups AP flour
    • 1.25 cup warm water
    • 1 teaspoon salt
    • 1 tablespoon oil

  • 3 medium potatoes
  • Vinegar (serve with food)
  • 1-2 pounds of kielbasa/sausage
  • 3 medium onions
    • 1 stick of butter
Onions:
First, we need to cook the onions down, since this takes longer than the other steps. Peel and slice 3 medium onions into rings. Put them in a pan, add one stick of butter, cover, and cook on medium until translucent. Once this occurs, turn it to medium-high (not high, but close to it) and let them fry and caramelize. You want to fry the onions. Once you get them nice and brown, turn to low. Yes, there will be an oil slick of butter; that's fine.

Meat:
In another pan, cook kielbasa or a sausage. Cut into whatever size you desire (4-5 inch-long pieces work best). Cook and brown to taste, and once cooked set on low. This step can be done at the same time as the boiling, but doing it ahead of time will make it easier if you're not a whiz in the kitchen.


Haluski:
To make the Haluski itself, mix all 4 ingredients together and knead on a floured board. When I did it, it initially was very dry, but as it was kneaded it came together and absorbed all the flour. The consistency when done was not sticky, and the texture was almost plasticine. Now, take a hunk of the dough and roll it with your hands into what looks like a snake - a round, long strip of dough about 1 inch thick. Once you have that, take a butter knife or other non-sharp knife and cut off slices about 1/4-1/2 inch thick. They will deform as you cut; that's fine.

Potatoes:
Peel 2-3 medium potatoes. Add to a pot of water with about 2 tbsp salt, and boil until mostly done. At this point, add the haluski all at once (yes, this'll drop the temperature) and boil approximately 4-5 minutes. The haluski will float to the top when done; you can either fish out the potatoes and haluski, or drain.

Serving:
Get a large serving bowl. Pour the haluski and potatoes into it, pour the onions & butter over them, then take the sausage and place it on top.

Eating:
In our family, we use a serving spoon to grab mass quantities of potatoes/haluski, and a piece or two of kielbasa. Pour a splash of vinegar over the potatoes/onions/haluski - even my wife the Southerner agrees on that. Eat.

[Index] Find duplicate indexes

I've been working on indexes lately, if it's not been obvious. I went looking for code to detect identical indexes, and this one seems the best.

http://sqlblog.com/blogs/paul_nielsen/archive/2008/06/25/find-duplicate-indexes.aspx

Here's the top one, which looks for identical indexes.

-- exact duplicates
with indexcols as
(
select object_id as id, index_id as indid, name,
(select case keyno when 0 then NULL else colid end as [data()]
from sys.sysindexkeys as k
where k.id = i.object_id
and k.indid = i.index_id
order by keyno, colid
for xml path('')) as cols,
(select case keyno when 0 then colid else NULL end as [data()]
from sys.sysindexkeys as k
where k.id = i.object_id
and k.indid = i.index_id
order by colid
for xml path('')) as inc
from sys.indexes as i
)
select
object_schema_name(c1.id) + '.' + object_name(c1.id) as 'table',
c1.name as 'index',
c2.name as 'exactduplicate'
from indexcols as c1
join indexcols as c2
on c1.id = c2.id
and c1.indid < c2.indid
and c1.cols = c2.cols
and c1.inc = c2.inc;

Thursday, February 12, 2009

[Replication] List your subscriptions, including articles, from the subscriber machine

Sometimes you need to know what articles you've received via a subscription. From Computer B (which is a subscriber), you can run this against Computer A (the publisher or distributor) and get a list of articles (tables) that are being replicated. A rare thing to need, but durn handy. You will need a linked server to Computer A on the machine where you run this, and you need to run it against each replicated database.

(Remember that you need to copy the whole line, even though you can't see all of it.)
DECLARE @sqlstatement VARCHAR(1000)
SELECT @sqlstatement = 'SELECT identity(int, 1,1) as ID,
* INTO ##Publications FROM OPENQUERY([computera],
''SET FMTONLY OFF {call replicated_db..sp_helpsubscription}'')'

EXEC(@sqlstatement)
SELECT * FROM ##publications where subscriber = @@servername
DROP TABLE ##publications

Wednesday, February 11, 2009

[Query] Case-sensitive query on a case-insensitive server

SQL Collations play an important role in queries - if you're lucky enough to have a case-insensitive server (which many are; it's the default), then you are saved from having to make sure EveryThing Is Capitalized just Right.

However, there are times when you need it.
http://www.mssqltips.com/tip.asp?tip=1032

SELECT *
FROM dbo.CaseSensitiveTest
WHERE Value1 LIKE '%Test%' COLLATE
SQL_Latin1_General_CP1_CS_AS
GO


That will give you only the values that are capitalized like "Test" is.

Tuesday, February 10, 2009

[SQL] BULK INSERT and BCP -n

I came across this while rereading the man pages for BCP and BULK INSERT:


http://msdn.microsoft.com/en-us/library/ms188365.aspx

DATAFILETYPE value: 'native'

All data represented in: Native (database) data types. Create the native data file by bulk importing data from SQL Server using the bcp utility.


I like BULK INSERT more than BCP, but you need BCP to export data. I was unaware that you could BCP a file out in Native format, and use BULK INSERT to import it. Doh!
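
Here's the round trip as a sketch - server, table, and path are placeholders, and the export line assumes xp_cmdshell is enabled (a plain command prompt works just as well):

--export in native format (-n); -T is a trusted connection
EXEC master..xp_cmdshell 'bcp Northwind.dbo.Orders out c:\temp\orders.dat -n -T -S myserver'

--re-import, telling BULK INSERT the file is native
BULK INSERT Northwind.dbo.Orders_staging
FROM 'c:\temp\orders.dat'
WITH (DATAFILETYPE = 'native')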

Wednesday, February 4, 2009

Latkes!

This picture does not do them justice. They were awesome. Took about 20 minutes to make, another 20 to cook. Came out really well, considering I've never made latkes (potato pancakes) before. Only change I made was to add more oil in the pan - 2 tbsp of oil isn't enough. And since I hadn't made them before, I decided to skip the ricer - layer paper towels and potatoes in a bowl, then push down with all your might and get as much water out as possible.

The below link is safe, but the page with the recipe does have some... interesting pictures.

http://www.boingboing.net/2008/12/29/susie-bright-the-lea.html

Why is this baking? Easy - store them in a warm oven while you make them. And this way you're not worrying about the others getting cold.

[Sweets] Nutella Biscotti

And now for something actually baked.

Recipe from here: http://bakingbites.com/2005/06/nutella-biscotti/ Note that my picture doesn't look as good as their picture. But I will say, it came out quite well following the recipe as written.

Thursday, January 29, 2009

[Profiler] Found: "Error: 208, Severity: 16, State: 0"

So, while running some code, trying to figure out why it's failing, I see this:

Error: 208, Severity: 16, State: 0


What is it, you ask? Some sort of weird bug, maybe, involving temp tables.

Run this:

exec ('exec master..xp_cmdshell ''dir c:\''')

No problems.

Run this:

create table #Listing (results varchar (255))
insert #Listing (results)
exec ('exec master..xp_cmdshell ''dir c:\''')


Bingo.

Why is this the case? I wish I knew.

Friday, January 23, 2009

[Tuning] Getting rid of Dynamic SQL

We've all done it. There are times when you just need to write something quick and dirty that deals with a simple problem. And there doesn't seem to be an easy way to do it in proper SQL, so you write some dynamic SQL instead.


CREATE PROCEDURE cases_select_dynamic
@sortBy varchar(100),
@status Int
AS
BEGIN
IF len(@sortBy)>0
BEGIN
DECLARE @sql varchar(max)
SET @sql='SELECT * FROM cases WHERE
status = ''' + CONVERT(VARCHAR(100),@status) + '''
order by ' + @sortBy
EXEC (@sql)
END
ELSE
BEGIN
SELECT * FROM cases WHERE
status = @status
END
END


Fortunately, there are people out there who don't particularly like dynamic SQL - and fortunately for us, they've found ways around it. Instead of the above, do something like:

CREATE PROCEDURE cases_select_nondynamic
@sortBy varchar(100),
@status Int
as
BEGIN
SELECT *
FROM cases
WHERE status = @status
order by CASE @sortBy WHEN 'finished' THEN finished ELSE NULL END,
CASE @sortBy WHEN 'name' THEN name ELSE NULL END,
CASE @sortBy WHEN 'lead' THEN lead ELSE NULL END
END


Which does the same thing. A little more work, but you gain several benefits.
  • Dynamic SQL runs under the user's context, so they need permissions to the underlying tables. Nondynamic means you just grant normal SP rights.
  • Cached plans. 2000/2005 are good about reusing the cached plan - if the query is IDENTICAL. Unless people are always running the exact same query, parameters and all, the dynamic version won't get reused. The nondynamic version gets cached once and reused for all parameters.
  • Faster. In my case, running the exact same parameters through both versions resulted in the dynamic version having an 89% query cost (relative to the batch), while the nondynamic version was a mere 11%.
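
One more note: if you truly can't avoid dynamic SQL (say, the WHERE clause itself varies), parameterize it with sp_executesql so the plan still gets cached and reused. A sketch against the same table - you'd still handle ORDER BY with the CASE trick above, never by concatenating it in:

CREATE PROCEDURE cases_select_parameterized
@status int
AS
BEGIN
DECLARE @sql nvarchar(max)
SET @sql = N'SELECT * FROM cases WHERE status = @status'
--the parameter stays a parameter, so one cached plan serves every value
EXEC sp_executesql @sql, N'@status int', @status = @status
END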


So go out there and convert 1 SP today to be nondynamic. Read Erland's page, he's got a lot of examples and a lot of thought in there.

http://www.sommarskog.se/dynamic_sql.html . Erland is a SQL god.

Thursday, January 15, 2009

[SQL] A very easy way to convert tinyint to hex


SELECT CONVERT(VARBINARY(1), 254)
0xFE


If you need more leading zeroes, raise the size of the Varbinary.


SELECT CONVERT(VARBINARY(8), 254)
0x000000FE

Monday, January 12, 2009

[SQL] Running code on most databases on a server

Everybody should already be using sp_MSforeachDB to do things in all databases.

This is a really cheap way of only doing it to some databases; just add each database name to the NOT IN list, wrapped in doubled single-quotes (since the whole thing runs inside sp_MSforeachdb's string).


sp_MSforeachdb 'if ''?'' NOT IN (''tempdb'')
begin
use ?
exec sp_updatestats
end'

Thursday, January 8, 2009

[Notification] and a version to watch a SPID


WHILE (SELECT COUNT(*)
FROM sysprocesses WHERE spid = 230 AND cmd IN ('ALTER table','CREATE index')
)>=1
BEGIN
PRINT 'waiting'
WAITFOR DELAY '00:01:00'
END

EXEC msdb.dbo.sp_send_dbmail
@profile_name = 'DBA',
@recipients = 'dba@thebakingdba.com',
@subject = 'index creation done'

Wednesday, January 7, 2009

[Jobs] Code to email when job finishes

If you have to run a job outside its normal time, and didn't remember to change the notification before you ran it, fret not. By hitting the system tables, you can have it notify you when the job finishes.



WHILE (SELECT COUNT(*)
FROM msdb.dbo.sysjobhistory h
INNER JOIN msdb.dbo.sysjobs j
ON h.job_id = j.job_id
INNER JOIN msdb.dbo.sysjobsteps s
ON j.job_id = s.job_id
AND h.step_id = s.step_id
WHERE h.run_date >= '20090107' and --today's date
j.NAME = 'your_job_name' AND
--max step for the job, so you don't get emailed when step 1 of 5 finishes
h.step_id = 3
)=0
BEGIN
PRINT 'waiting'
WAITFOR DELAY '00:01:00'
END

EXEC msdb.dbo.sp_send_dbmail
@profile_name = 'DBAs',
@recipients = 'thebakingdba@yourcompanyname.com',
@query = 'SELECT j.[name],
s.step_name,
h.step_id,
h.step_name,
h.run_date,
h.run_time,
h.sql_severity,
h.SERVER,
h.run_status
FROM msdb.dbo.sysjobhistory h
INNER JOIN msdb.dbo.sysjobs j
ON h.job_id = j.job_id
INNER JOIN msdb.dbo.sysjobsteps s
ON j.job_id = s.job_id
AND h.step_id = s.step_id
WHERE h.run_date >= ''20090107'' and
j.NAME = ''your_job_name'' AND
h.step_id = 3
' ,
@subject = 'Your Job Finished',
@query_result_separator = ' ',
@query_result_width = 750,
@attach_query_result_as_file = 0 ;
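
One refinement: rather than hard-coding the final step_id (3 above), you can look it up:

SELECT j.name, MAX(s.step_id) AS last_step
FROM msdb.dbo.sysjobs j
INNER JOIN msdb.dbo.sysjobsteps s
 ON j.job_id = s.job_id
WHERE j.name = 'your_job_name'
GROUP BY j.name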