Here are some PowerShell samples to get you going quickly with the Event Log.

I’ve seen Event Logs that are full of noisy, bothersome INFO-level messages. On the system I’m working with, these “noise” messages show up about every 10 seconds, so scrolling through the Event Viewer looking for errors can take some time. Obviously I’m working with other people to stop those messages, but until then… suppose you want to filter on just the errors for the last 60 minutes?

Use the first command below to get all the errors.

The second shows you how to get all warnings and then further filters by a word in the message.

cls 
$minutes=60 
Get-Eventlog -log application -after ((get-date).addMinutes($minutes*-1)) -EntryType Error 

#Get-Eventlog  -log application -after ((get-date).addMinutes($minutes*-1)) -EntryType Warning  | Where-Object {$_.Message -like '*Mercury*'} 

#Write-Host "---------------------------"
#$el = Get-EventLog -log application -index 1096900 
#write-host $el.Message

The Message will get cut off, but you get a list of the dates and times of the errors or warnings. You can then use the “Index” value to go back and pull the full text of a specific item.

#Write-Host "---------------------------"
$el = Get-EventLog -log application -index 1096900 
write-host $el.Message

See the related blog post: PowerShell EventLog Automated Email.

You can also loop through EventLog data and selectively write out whichever fields you want.

cls 
$minutes=90
#Get-Eventlog -log application -after ((get-date).addMinutes($minutes*-1)) -EntryType Warning 

$logs = Get-Eventlog  -log application -after ((get-date).addMinutes($minutes*-1)) -EntryType Warning  | Where-Object {$_.Message -like '*Mercury*'} 

foreach ($log in $logs) 
{
    Write-Host "---------------------------"
    Write-Host "Index=$($log.Index)" 
    Write-Host "DateTim: $($log.TimeGenerated) " 
    write-host $log.Message

}
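
If you would rather not format the output by hand, here is a sketch of the same filter done as one pipeline, selecting just the fields you care about (the word "Mercury" and the CSV path are only examples):

$minutes = 90 
Get-EventLog -LogName Application -After ((Get-Date).AddMinutes($minutes * -1)) -EntryType Warning |
    Where-Object { $_.Message -like '*Mercury*' } |
    Select-Object Index, TimeGenerated, Message |
    Out-GridView -Title 'Filtered Warnings'
    # or replace the Out-GridView line with: Export-Csv C:\Temp\FilteredWarnings.csv -NoTypeInformation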

What I did to cause the error

I dropped an XML file that went through a map and was picked up by a Send Port with a CSV (flat file) pipeline.
It worked okay when I used the PassThru pipeline.

Suspend Error

There was a failure executing the send pipeline: “DL.MGLocationData.Pipelines.MyAppNameLocationDataCSV_Send, DL.MGLocationData.Pipelines, Version=1.0.0.0, Culture=neutral, PublicKeyToken=59a32f7375d7234a”
Source: “Flat file assembler” Send Port: “sp_MyAppNameLocationData_CSV” URI: “e:\Integration\Outbound\MyAppNameLocationData\MyAppName_%datetime%.csv” Reason: Object reference not set to an instance of an object

Solution

Turn on output map validation for your map.

Check the following:
1) You are mapping something to every single CSV column
2) You don’t have column names misspelled (spelled one way in the map and another way in the schema)

With BizTalk, we often archive files in a disk directory, where each Vendor has its own archive.

Simple Example

I found the simple example below, then expanded upon it:

Get-ChildItem E:\Archive | Measure-Object -Property Length -sum

Source: http://woshub.com/powershell-get-folder-sizes/
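
Note that the one-liner above only measures files sitting directly in E:\Archive, and the Sum comes back in raw bytes. Here is a slightly expanded sketch that recurses into the subfolders and converts to MB (same example path):

$measure = Get-ChildItem E:\Archive -Recurse -File | Measure-Object -Property Length -Sum
"{0} files, {1:N2} MB" -f $measure.Count, ($measure.Sum / 1MB)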

Fancy Example

This starts at the Archive directory and then basically runs the command above for each subdirectory. One trick was pulling the values out of Measure-Object; the second trick was how to present the data. Just using Write-Host statements looked very ugly.

To present the data, I wanted to use Out-GridView. To do that, I loaded a hashtable for each directory into a list, then passed that list to Out-GridView. I haven’t got all the numbers right-justified yet, perhaps soon (there is a Format-Table sketch after the script below).

$startDir = "E:\Archive"    # parent directory containing the per-vendor archive folders
$topDirectories = Get-ChildItem $startDir -Directory  
$countDir = 0 
$List = New-Object System.Collections.ArrayList

foreach ($dir in $topDirectories) 
{
   $countDir = $countDir + 1
   #if ($countDir -gt 1) {exit}
   #Write-Host "Dir: $($dir.FullName)"
   $measure = Get-ChildItem $dir.FullName -Recurse | Measure-Object -Property Length -sum
   Write-Host $dir.BaseName $measure.Count $measure.Sum 

   ## Empty directory seems to have null values 
   if ($measure.Sum -eq $null) 
   {
       $SizeMB = 0 
   }
   else 
   {
       $SizeMB = $measure.Sum / 1024 / 1024 
   }
   

    $Hash = [ordered]@{
         Directory=  $dir.Fullname 
         Count = $measure.Count 
         ##Size = $size.ToString().PadLeft(15,' ' ) #   $a.PadLeft(15,[char]4)
         SizeMB = [math]::Round($SizeMB *100) / 100   #round to two decimal places 
         }

    [void]$List.Add(( [pscustomobject]$Hash  ))

}

$List | Out-GridView -Title 'Drive Space' 
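
As a partial answer to the right-justify problem: if you send the same list to the console instead of Out-GridView, Format-Table accepts an Alignment key in a calculated property, so the numeric columns can be right-aligned (a sketch using the $List built above; Out-GridView controls its own alignment):

# console alternative with right-aligned numeric columns
$List | Format-Table -AutoSize -Property Directory,
    @{ Name = 'Count';  Expression = { $_.Count };  Alignment = 'Right' },
    @{ Name = 'SizeMB'; Expression = { $_.SizeMB }; Alignment = 'Right' }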

Sample Output

Scenario

Let’s say you have a directory with 100 files, and you want to slowly drop those files into BizTalk (via a file adapter Receive Location). Rather than copying all the files at once, you would like to wait 2, 5, or 10 seconds between each file drop. Maybe you also want to make the tracked messages easier to read by essentially making the whole process single-threaded.

What I call “file dripping” is really a form of throttling. Maybe your BizTalk system is stressed, and you don’t want to throw in thousands of messages at once. BizTalk has to store all those messages in a database “queue” on the SQL server, and it then takes longer for BizTalk to “walk the chain”, i.e. to keep searching through those messages when doing the various queries involved in administration and processing. By dropping the files in slowly, the BizTalk message queue depth never gets very deep.

Furthermore, you might want the files to be processed in date/time order.

PowerShell Solution

We will use Copy-Item (or Move-Item), selected by a parameter. To wait x seconds between files, we use the PowerShell Start-Sleep cmdlet. To get the files sorted, we pipe Get-ChildItem to “Sort-Object LastWriteTime -Ascending”.

cls 
$fromDirName = "e:\Integration\Inbound\AppName\SubDir\"
$toDirName = "e:\Integration\Inbound\AppName\FileDropFolder\"
$fileMask = "*.*"
$waitSeconds = 5
$moveOrCopy = "Copy"  


$files = Get-ChildItem $fromDirName -Filter $fileMask -File  | Sort-Object LastWriteTime -Ascending 
$countMatchingFiles = $files.Count 
Write-Host "Count Matching Files=$countMatchingFiles fileMask=$fileMask" 

foreach ($file in $files) 
{
    if ($moveOrCopy -eq "Copy") 
      {
        Copy-Item $file.FullName -Destination $toDirName
        Write-Host "Copied $($file.BaseName)" 
      }
    else 
      {
        Move-Item $file.FullName -Destination $toDirName
        Write-Host "Moved $($file.BaseName)" 
      }
    Write-Host "Waiting $waitSeconds seconds" 
    Start-Sleep -Seconds $waitSeconds 
}

In the system I’m working with, the dates in DB2 are defined as type “DECIMAL”, and they are stored as offsets from 19000000.
Note that it’s not the number of days since some 1900 epoch, but literally the date as a number (YYYYMMDD minus 19000000).

T-SQL converting numeric DB2 date to something more readable

Thus, when using a T-SQL query against the linked server, the date can be converted as follows:

SELECT RDABDT + 19000000 as RDABDT2,  * from Linked.x.dbo.table order by RDABDT desc 

RDABDT2 shows as 20200219. No need to convert when you sort, as the larger decimals are already the more recent dates.

C# – Converting to DB2 Date (integer or decimal)

        public static int EDIDateToDBDateInteger(string CCYYMMDD)
        {
            int rt = 0;
            if (CCYYMMDD == null)
            {
                throw new ApplicationException("EDIDatetoDBDateInteger CCYYMMDD argument passed is null");
            }
            if (CCYYMMDD.Length != 8)
            {
                throw new ApplicationException(
                   "EDIDatetoDBDateInteger CCYYMMDD length is not 8, value received=" + CCYYMMDD);
            }
            rt = Convert.ToInt32(CCYYMMDD) - 19000000;
            return rt;
        }

        public int XMLDateToDB2DateInteger(string convertDate)
        {
            string sMonth = DateTime.Parse(convertDate).ToString("MM");
            string sDay = DateTime.Parse(convertDate).ToString("dd");
            string sYear = DateTime.Parse(convertDate).Year.ToString();
            string sdateWorker = sYear + sMonth + sDay;
            int iDateWorker = Int32.Parse(sdateWorker);
            iDateWorker = iDateWorker - 19000000;
            return iDateWorker;
        }

        public int AnyDateToDB2DateInteger(string dateToConvert)
        {
            string yearStr = DateTime.Parse(dateToConvert).ToString("yyyy");
            string monthStr = DateTime.Parse(dateToConvert).ToString("MM");
            string dayStr = DateTime.Parse(dateToConvert).ToString("dd");
            int db2Date = Convert.ToInt32(yearStr + monthStr + dayStr) - 19000000;
            return db2Date;
        }
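
If you just want to sanity-check the arithmetic without compiling anything, the same conversion round-trips easily from a PowerShell prompt (a quick sketch, not part of the BizTalk code):

# DB2 numeric date -> readable date, and back again
$db2Date  = 1200219                                    # value as stored in the DECIMAL column
$readable = [datetime]::ParseExact(($db2Date + 19000000).ToString(), 'yyyyMMdd', $null)
$readable.ToString('yyyy-MM-dd')                       # 2020-02-19
[int]$readable.ToString('yyyyMMdd') - 19000000         # back to 1200219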

What if you knew the big swings and trends in Bitcoin and other cryptos?
Marius Landman has been able to correctly predict over 90% of the major swings in the market. He grew up as a poor orphan in South Africa and went on to serve on an elite South African military team. He then started predicting health and safety trends for major oil companies, and found that he could spot similar trends in the cryptocurrency markets.

A sample from one of Marius Landman’s Cryptocurrency Trend and Forecast Webinars

Why Join the ToNoIt Community

It’s Time to Know It

NOTE: This is not financial advice. This can be one of your tools that help you do your own research.


Register for a free ToNoIt account


FREE SIGNUP to get to know the system (no payment required). After you register, you can explore the site and decide if you want to join the paid membership, which includes updates (usually 2 or 3 per week, including a one-to-two-hour webinar recording each week). The live webinar is now included in the ToNoIt membership! It currently meets at 4:00 pm Central Time on Thursdays. Payments are made only via cryptos.

Here’s an example where I wanted to create a small XML file with a unique list of TerminalIDs from my application.

On the left, which I had to blur out, the Looping functoid is connected to a parent record that repeats, and the other lines are connected to the source of the TerminalID.

The scripting component has the following “Inline C#” code:

public System.Collections.Generic.List<string> duplicateList = new System.Collections.Generic.List<string>();

public bool IsDuplicate( string terminalID )
{
     if( duplicateList.Contains( terminalID) )
        return true;
     duplicateList.Add( terminalID );
     return false;
}

You can see that it creates a collection object, a "Generic.List" of strings. It adds each TerminalID to the list after checking whether it is already there, and returns true or false accordingly.

The Logical Equals functoid tests the output of the above against "false". I think it is there to make sure we send a boolean true/false to the item on the right side of the map. Maybe it could be avoided, but I saw this in someone else's map and left it for now. I made one attempt to remove it and reverse the true/false coming out of the Scripting functoid, but it didn't work.

In case you didn't know, when you set a field on the right side of the map to "false", that item doesn't get mapped out. You basically suppress it.

The output, when I passed two inputs with "ABC" as the terminal ID, is as follows:

<?xml version="1.0"?>
<ns0:UtilListTerminalIDs xmlns:ns0="http://TMW.Integration.Schemas.UtilListTerminalIDs">
  <UtilListTerminalID>
     <TerminalID>ABC</TerminalID>
  </UtilListTerminalID>
</ns0:UtilListTerminalIDs>

Remember that you can right-click the map and click "Validate Map" to see the XSLT behind the map. The path to the XSLT file will be shown in the Output window, and you can Ctrl-Click the .XSL filename to open it.

Later, I passed two messages to the following C# routine, which identifies whether any TerminalID was not found in the target database (after doing a query on it). I could have also done the dup logic in this C# routine. The code uses XPath and the XmlDocument SelectSingleNode method to look up each terminal ID in the results of the query. (The database query results were previously returned as an XML message in the orchestration.)

I didn't want the orchestration to continue, because it would get an error later when it tries to insert a row into another table that requires the TerminalID to be in the target database.

        public static string FindMissingTerminalIDsInTRMSTResponse(
            XmlDocument UtilListTerminalIDs, XmlDocument TRMSResponse)
        {
            // returns a comma-separated list of TerminalIDs that were not found in the TRMS response
            string rt = "";
            string xpathTerminalIDs = "//TerminalID";
            XmlNodeList terminalNodes = UtilListTerminalIDs.SelectNodes(xpathTerminalIDs);
            int countMatches = 0;
            int countNodes = 0;
            int countMisMatches = 0;

            foreach (XmlNode terminalNode in terminalNodes)
            {
                countNodes++;

                // look for a TSATRM element whose text matches this TerminalID
                string xpathVerify = "//*[local-name()='TSATRM'][text()='" + terminalNode.InnerText + "']";
                Console.WriteLine("xpathVerify=" + xpathVerify);

                XmlNode testTerminalExistsNode = TRMSResponse.SelectSingleNode(xpathVerify);

                if (testTerminalExistsNode != null)
                {
                    countMatches++;
                }
                else
                {
                    countMisMatches++;
                    if (countMisMatches > 1)
                    {
                        rt = rt + ", "; // comma-separate after the first one
                    }
                    rt = rt + terminalNode.InnerText;
                }
            }

            return rt;   // empty string means every TerminalID was found
        }

At first, I was going to return a boolean from this method, but then I decided I would suspend the orchestration if the input had missing TerminalIDs, and that the returned message should list all the missing IDs in one string. That string goes into my orchestration Suspend shape.

Here's how to run a SQL command against a Microsoft SQL Server (MSSQL) database from a PowerShell script.

cls 

$datasource = "server=server\instance;database=mydb;trusted_connection=true"

#if not using Integrated Security, 
#you might want to pass user/pass in variables or prompt for them 
#$connectionString = 'User Id=' + $username + ';Password=' + $password + ';' + $datasource
$connectionString = $datasource
write-host $connectionString 
$connection = New-Object System.Data.SQLClient.SQLConnection($connectionString)
$connection.open()

# -- reset so Biztalk ReceiveLocation Polling Process will pick them up 
$updateCommand = "update PollingTable set entryStatus = 'New' where ID in (101, 102)  "

Write-Host $updateCommand 

$Command = New-Object System.Data.SQLClient.SQLCommand 
$Command.Connection = $connection 
$Command.CommandText = $updateCommand 

$rowsAffected = $Command.ExecuteNonQuery() 
Write-Host ("Rows Affected by Update=$rowsAffected") 

$connection.close()  #close the connection when you are done with it 
                     #(good practice, especially if you have more code after this) 
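
The script above only needs ExecuteNonQuery because it runs an UPDATE. If you want rows back from a SELECT, a similar sketch (reusing the $connection from above and the same example PollingTable) fills a DataTable through a SqlDataAdapter:

# run a SELECT and loop through the results (same example table/columns as the UPDATE above)
$selectCommand = "select ID, entryStatus from PollingTable where entryStatus = 'New'"
$adapter = New-Object System.Data.SqlClient.SqlDataAdapter($selectCommand, $connection)
$table   = New-Object System.Data.DataTable
[void]$adapter.Fill($table)     # Fill opens/closes the connection for you if it is closed
foreach ($row in $table.Rows)
{
    Write-Host "ID=$($row.ID) Status=$($row.entryStatus)"
}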

As of BizTalk 2020 and Visual Studio 2019, when you click on the map grid, you get the map properties window, on which you can select various “XSLT Transformation Engines”.

BizTalk 2020 supports XSLT 3.0 via the Saxon transformation engine, which is created by Saxonica. Microsoft apparently chose the open-source “Home Edition” (HE), and the current version is Saxon 9.

Below are the editions of Saxon 9 provided by Saxonica, which I believe is Michael Kay’s company. He’s one of the founding fathers of XSLT, has pushed through new standards, and has written several books on XSLT and XPath.

You can also write a .NET program that follows a defined interface and plugs in other XSLT transformation engines. If you do so, you have to create a small XML file to load the extra options into the window above.

Reference: https://docs.microsoft.com/en-us/biztalk/core/technical-reference/xslt-transform-engine-grid-property

When you create a brand-new map (.btm file), the default option is “Undefined”, which means that BizTalk will continue to run the original XSLT 1.0 transformation.

I’m still not sure what happens if you select “.Net Framework”.

I found myself doing three boring tasks over and over again after each deploy:


  1. Restarting a BizTalk Host Instance
  2. Running a SQL command
  3. Disabling/re-enabling the Receive Location for SQL polling

I could potentially add a build/deploy step at the top, to combine that as well… maybe one of these days soon…

To force the SQL polling to run, I had to run a SQL command to change the value of a certain column in one or more rows.

The polling receive port was set to poll every 300 seconds (i.e. 5 minutes). By disabling it and enabling it again, the polling will happen immediately. No one wants to wait an average of 2.5 minutes for their polling to happen!

I also realized that if I got any of these steps done in the wrong order, I might accidentally be re-running with an old DLL, and definitely didn’t want that to happen.

There are a few parms at the top you can set.
I probably should have made the SQL connection string a parm too.

If I didn’t re-deploy, and just want to poll again, I can set $ynRestartHostInstances to “N”. For example, I might just change the IDs of the items to poll. But even then, I would want to disable/enable the Receive Location to make the polling happen faster.

cls 

$ynRestartHostInstances = "N" 
$ynSetSQLToPoll = "Y" 
$appName = "MySuperApp"
$rcvLocation = "rlSQLPolling"

if ($ynRestartHostInstances -eq "Y") 
{
    cd "Biztalk:\Platform Settings\Host Instances"
    Get-ChildItem | ft -auto   


    $hostName = "Microsoft BizTalk Server BizTalkOrchestrations64 dlBizTalkDev1"
    Write-Host "About to start/stop $hostName"              

    Stop-HostInstance $hostName 
    Write-Host "Host Instance Stopped" 

    Start-HostInstance $hostName 
    Write-Host "Host Instance Started"
}

if ($ynSetSQLToPoll -eq "Y") 
{

    $datasource = "server=server\instance;database=mydb;trusted_connection=true"

    #if not using Integrated Security, you might want to pass user/pass in variables or prompt for them 
    #$connectionString = 'User Id=' + $username + ';Password=' + $password + ';' + $datasource
    $connectionString = $datasource
    write-host $connectionString 
    $connection = New-Object System.Data.SQLClient.SQLConnection($connectionString)
    $connection.open()

    # -- reset so polling will pick them up 
    $updateCommand = "update PollingTable set entryStatus = 'New' where ID in (101, 102)  "

    Write-Host $updateCommand 

    $Command = New-Object System.Data.SQLClient.SQLCommand 
    $Command.Connection = $connection 
    $Command.CommandText = $updateCommand 

    $rowsAffected = $Command.ExecuteNonQuery() 
    Write-Host ("Rows Affected by Update=$rowsAffected") 

    #
    # disable/re-enable ReceiveLocation to kick off polling faster 
    #
    Write-Host "Handling ReceiveLocation next:" 
    #cd "Biztalk:\Applications\$($appName)\Receive Locations" 
    cd "Biztalk:\Applications\$appName\Receive Locations"
    Get-ChildItem | ft -auto   


    Write-Host "About to restart $rcvLocation"              

    Disable-ReceiveLocation $rcvLocation
    Enable-ReceiveLocation  $rcvLocation
}
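
One assumption worth calling out: the “cd BizTalk:\…” lines above rely on the BizTalk PowerShell provider (BizTalkFactory PowerShell Extensions) already being loaded in the session. If it isn’t, something along these lines needs to run first (exact details can vary by BizTalk version and install):

# load the BizTalk PowerShell provider so the BizTalk:\ drive is available
# (snap-in name per the BizTalkFactory PowerShell Provider; adjust for your environment)
if (-not (Get-PSSnapin -Name BizTalkFactory.PowerShell.Extensions -ErrorAction SilentlyContinue))
{
    Add-PSSnapin -Name BizTalkFactory.PowerShell.Extensions
}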