A few years ago I discovered a great utility called “Auto Audit” on CodePlex by Paul Nielsen and John Sigouin. The best thing is that it creates triggers that audit inserts, updates, and deletes on any desired table. It creates a global audit table, along with triggers specific to each of your tables.  It can optionally add these columns to each table: CreatedDate, CreatedBy, ModifiedDate, ModifiedBy, and RowVersion (an incrementing INT).  It includes views to see your audits, and it does add eight of its own tables to your database (all nicely put in the ‘Audit’ schema).  It also logs who is doing DDL commands, so you know, for example, which DBA or which developer added some column to some schema.

It’s like a mini-framework to make sure all your triggers and auditing are done in a consistent manner, and it writes all the tedious trigger code for you. This of course allows you to see who changed what and when, and it standardizes the process. I’ve been at other companies where we had similar functionality, but it was all hand-coded, and rather tedious to create and update the triggers.

When you download it, what you get is just one big .sql file, 6422 lines long.  You run that script in each database in which you want to use Auto Audit.
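If you want to run that file from the command line rather than from SSMS, sqlcmd works; the server name and file path below are placeholders for your own:

```
sqlcmd -S localhost -d NealDemo -i AutoAudit.sql
```

Repeat with a different -d value for each database you want to enable.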


I had a database called NealDemo with one table, dbo.Employee (in the red box below).  Running the Auto Audit script added 8 of its own tables, but at this point it has not yet touched the Employee table.  We have to do the next step for that to happen.


Running the script also creates some stored procedures, views, and, most importantly, the SchemaAuditDDLTrigger DDL trigger.  This will allow you to see, for instance, who modified a table’s structure, because that gets audited too.

Turning On Audit for a Table


So to turn on auditing for a single table, you do this (overriding the schema or other parms as desired):

exec [Audit].[pAutoAudit] @TableName='Employee', @BaseTableDDL=1

Note: If you don’t specify @BaseTableDDL=1, the five audit columns won’t get added to the table.  Below is an example of my Employee table, and it’s fairly obvious which five columns were added.

If you are ready to turn it on everywhere, on all tables, instead use pAutoAuditAll.
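For example, assuming pAutoAuditAll accepts the same @BaseTableDDL parameter as the single-table procedure (I haven’t verified its full signature), enabling it everywhere would look something like:

```sql
-- Enable auditing on every table, adding the five audit columns to each.
-- Assumption: @BaseTableDDL works the same as on pAutoAudit.
exec [Audit].[pAutoAuditAll] @BaseTableDDL=1
```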


It created three triggers on my Employee table.  The update trigger alone is 305 lines of code, and the other two are about 150 lines each.


Viewing the Audits

I ran the following update:
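(The column and key names below are purely illustrative, not the exact statement I ran; any ordinary UPDATE will fire the audit trigger.)

```sql
-- Illustrative example: change one column on one row
UPDATE dbo.Employee
   SET LastName = 'Smith'
 WHERE EmployeeID = 1;
```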


Rather than creating one audit table for each table, Auto Audit uses one consolidated audit table:
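To pull one table’s history out of that consolidated store, you can query the vAuditAll view mentioned later in this post. The TableName and AuditDate column names here are assumptions about its schema, so adjust to what you see in your database:

```sql
-- Audit history for the Employee table, newest first
-- (column names TableName and AuditDate are assumptions)
SELECT *
  FROM [Audit].[vAuditAll]
 WHERE TableName = 'Employee'
 ORDER BY AuditDate DESC;
```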



However, it does create a view for each table.  In addition, it creates a view called Employee_Deleted to show any rows deleted from my Employee table.



Table Function

There’s also a table function that can be used on a single key.  Below I show the difference between querying vAuditAll and using the function against that same key:



Auditing DDL Changes

Now look what happens when you “Alter” a table.  There is a Database Trigger called “SchemaAuditDDLTrigger”. It catches the changes and logs them, as well as rebuilds the triggers on the Employee table.


To view who changed the table or anything in the schema using DDL, check the Audit.SchemaAudit table:
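A simple query like the following shows the most recent schema changes first (the AuditDate column name is my assumption; use whatever date column the table actually has):

```sql
-- Who changed what in the schema, newest change first
SELECT *
  FROM [Audit].[SchemaAudit]
 ORDER BY AuditDate DESC;
```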




I’m so impressed with this utility.  If you have no auditing at all on your database, you can turn this on in just a matter of minutes.  Note however, that it could impact your performance, so be sure to benchmark if you are pushing the boundaries on performance already.



Powershell can be useful for parsing or harvesting data from the web by means of “Invoke-WebRequest”. Among other things, it can return a collection of links. The code below loads a page from Wikipedia and loops through the collection of links, looking for a certain pattern to find all the cities in a state. It then writes the “city, state” pair to a file.

Calling “Invoke-WebRequest” returns a Microsoft.PowerShell.Commands.HtmlWebResponseObject (the variable $site in my sample code below), which is part of the Microsoft.PowerShell.Commands.Utility assembly. Useful members include AllElements, Forms, Headers, Images, InputFields, Links, ParsedHTML, RawContent, and StatusCode (see the complete list here: HTMLWebResponseObject Members).

$state = "Texas" 
#example: http://en.wikipedia.org/wiki/List_of_cities_in_Texas 
$url = "http://en.wikipedia.org/wiki/List_of_cities_in_$state" 
$harvestFile = "c:\TexasCities.txt" 
$site = Invoke-WebRequest -Uri $url 
#$elements = $site.AllElements | where {$_.id -eq "100 Largest Cities in Texas by Population"} 
#Write-Host "Matches = $($elements.length)"
foreach ($link in $site.Links) 
{
    if ($link.href.StartsWith("/wiki/") -and $link.href.EndsWith("_$state") -and $link.title.EndsWith(", $state") )
    {
        write-host "$($link.innerText) $($link.href)" 
        $outrow = "$($link.innerText), $state" 
        add-content $harvestFile $outrow   #write name of city to output file 
    }
    else
    {
        write-host "Other $($link.innerText) $($link.href)" 
    }
    # this is our signal to stop processing; the cities repeat now in descending order of population 
    if ($link.href.StartsWith("#cite_note-2"))
    {
        Write-Host "Stopping because found #cite_note-2" 
        break
    }
}

The only trick above is knowing when to stop parsing. Of course, with any parser, if the web page changes, the parser might break and need updating. I’m using an anchor tag that marks the beginning of a list of the 100 largest cities in the state, sorted by descending population.

Example Data Harvested

Abbott, Texas
Abernathy, Texas
Abilene, Texas
Ackerly, Texas
Addison, Texas
Adrian, Texas
Agua Dulce, Texas
Alamo, Texas
Alamo Heights, Texas

Today’s Powershell example is something that I know I will re-use, so I wanted to put it in a place where I could always find it, i.e. on my blog.

The main purpose of this example is to show how to route output from a cmdlet to a formatted or “pretty” output, such as a CSV file that can be opened by a spreadsheet such as Excel, a Powershell grid, or an HTML file that can be opened by your favorite browser. If you write the HTML to a string variable, instead of a file, it can also be used as the body of an email.
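As a quick sketch of that email scenario: pipe ConvertTo-Html through Out-String to get one string, then hand it to Send-MailMessage. The SMTP server and addresses below are placeholders, not real values:

```powershell
# Build the HTML report in a string instead of a file
$body = Get-Process |
    Select-Object Name, Id, WS |
    ConvertTo-Html |
    Out-String

# Placeholder server and addresses - substitute your own
Send-MailMessage -SmtpServer "smtp.example.com" `
    -From "reports@example.com" -To "me@example.com" `
    -Subject "Process Report" -Body $body -BodyAsHtml
```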

The second purpose is to put in one place all the common options for getting files from some disk/directory structure (using Get-ChildItem). Hopefully, I have taken information from 5 or 6 sources and condensed it down to one place, with comments on what you can tweak and change as needed.

#simple example: take a cmdlet's output and route it to CSV (keep -Append to append, or remove it to replace) 
Get-Process firefox | Export-CSV -Append -Path c:\test.csv -NoTypeInformation 


Get-ChildItem -Recurse c:\Demo -File |  ## optionally add -Include *.txt,*.bak to match any combined set of file masks 
# where {!$_.PsIsContainer} | ## the old way: PsIsContainer true means it is a folder; 3.0 has the -Directory and -File parms 
# where Length -gt 50 |  ## use to filter by other properties (note: Length is the file size in bytes) 
ForEach-Object {$_ | Add-Member -Name "Owner" -MemberType NoteProperty -Value (Get-Acl $_.FullName).Owner -PassThru} | 
Select-Object FullName, Length, CreationTime, LastAccessTime, LastWriteTime, IsReadOnly, Attributes, Directory, Extension, Owner | 
Sort-Object FullName | 
#choose which output type you want - only one of the following 
Export-CSV -Path c:\demo.csv -Force -NoTypeInformation 
#ConvertTo-HTML -CssUri "$($PSScriptRoot)\htmlStyle2.css" | Out-File C:\Demo.html 


Creating an HTML File Formatted with a CSS Stylesheet

The following effect is possible (Viewed in IE Browser):

When writing to an HTML file, a CSS stylesheet can pretty things up. Here is the example used above. I found it here: JohnSardin sample css for tables.
Here’s his download link. You can download his code, take out everything inside the HTML style tags, and save it as htmlStyle2.css. The -CssUri parameter (in my code above) is on the ConvertTo-HTML cmdlet. I use $($PSScriptRoot) so that it can find the file in the directory where the .ps1 file is running (rather than needing to fully qualify the file with a hard-coded directory name).


Note that the ConvertTo-HTML cmdlet does not include THEAD and TBODY tags in its output. So if you find a stylesheet that uses them, you may have to tweak it a little (or use RegEx to insert them into the file in the proper places). I experimented with some stylesheets from this site: 40 Free Beautiful CSS CSS3 Table Templates, but I was not having luck getting them to work.

CSV Viewed in Excel


Here is the CSV data in raw format. CSV of course stands for Comma Separated Value.

"C:\Demo\Demo1.txt","12","2/2/2015 10:25:50 AM","2/2/2015 10:25:50 AM","2/2/2015 10:25:59 AM","False","Archive","C:\Demo",".txt","abc\neal.walters"
"C:\Demo\Notes.bak","12","2/2/2015 10:48:09 AM","2/2/2015 10:48:09 AM","2/2/2015 10:25:59 AM","False","Archive","C:\Demo",".bak","abc\neal.walters"
"C:\Demo\Notes.txt","12","2/2/2015 10:26:03 AM","2/2/2015 10:26:03 AM","2/2/2015 10:25:59 AM","False","Archive","C:\Demo",".txt","abc\neal.walters"
"C:\Demo\ReleaseNotes.txt","110","2/2/2015 10:26:18 AM","2/2/2015 10:26:18 AM","2/2/2015 10:26:41 AM","False","Archive","C:\Demo",".txt","abc\neal.walters"
"C:\Demo\SubDir\Demo2.txt","12","2/2/2015 10:44:18 AM","2/2/2015 10:44:18 AM","2/2/2015 10:25:59 AM","False","Archive","C:\Demo\SubDir",".txt","abc\neal.walters"

Powershell’s Built-in GridView

If you don’t need to export the data to another program, then Out-GridView might be all you need. It also has the ability to sort and filter data. Filtering is available when you click the “Add Criteria” button.
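Reusing the file listing from earlier in this post, the grid is just one more pipe at the end:

```powershell
# Same file listing as above, but shown in the interactive grid window
Get-ChildItem -Recurse c:\Demo -File |
    Select-Object FullName, Length, LastWriteTime, Extension |
    Out-GridView -Title "Files under c:\Demo"
```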


Here’s an example of using the add criteria Filter (results not shown)


And here is an example of just typing a text filter on the top row:


This allows you to interactively “play” with your data. Click any column title to sort. For a really fast filter, just type in the box at the top that says, in light gray, “Filter“. This searches the entire row for the raw text you type in. So in my case, I could type “.bak” or “SubDir” to quickly limit the output to certain rows.