Two Issues with Variable Initial Values

When doing a build of a BizTalk Orchestration, I saw this error:

illegal escape '\I'
illegal escape '\C'
illegal escape '\D'

The issue was that I had an orchestration variable with the following "Initial Value" (in the Properties window): "e:\Integration\Config\DL.TL2000.Shipment.config".
Just like in C#, I changed it to add the @ sign so that the backslash would not be treated as an escape character.
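For example, the corrected "Initial Value" uses a C# verbatim string (the path is the one from the error above):

```
@"e:\Integration\Config\DL.TL2000.Shipment.config"
```

With the @ prefix, sequences like \I, \C, and \D are no longer interpreted as escape characters.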

Similar Common Issue

If you set the initial value to a string such as 'abc' but you don't put quotes around it, you will get these two errors:

1) identifier 'abc' does not exist in 'OrchestrationName'; are you missing an assembly reference?
2) cannot find symbol 'abc' 

Background

BizTalk lets you create a map that uses XSLT instead of the functoids/GUI mapping. To do this, you click on the mapping grid, go to the "Grid Properties" window, and put in a filename for the "Custom XSLT Path". I usually name it the same as the map file, just substituting .xslt for the .btm file suffix.

In your XSLT file

Take some typical functoid map that uses an external C# library. If you look at the generated XSLT (right-click the map and select "Validate Map", then open the generated XSLT file), you will see something like this:

xmlns:ScriptNS0="http://schemas.microsoft.com/BizTalk/2003/ScriptNS0"


Actually there will be one of these for each class that you reference.

The actual call to the C# method may look like this:

   <xsl:value-of select="ScriptNS0:XMLDateToTimeHHMMSS($parmDateTime)"/>

In a normal map, it may be hard to figure out which ScriptNS# correlates to which class/library/.DLL. (Of course you have to reference that .DLL in your References for that project.)

Create your own Custom Extension File

This is where the magic happens! This file ties together your .DLL/classname to the ScriptNS# namespace to be used in your custom XSLT code.

Note: You don't have to use ScriptNS0. In the example below, I used ScriptDateFunctions, since the class name is DateFunctions. It doesn't even have to have the word "Script" in it, but I leave that there to make the purpose more clear.

<ExtensionObjects>  
   <ExtensionObject  
      Namespace="http://schemas.microsoft.com/BizTalk/2003/ScriptDateFunctions"  
      AssemblyName="ABC.Common.Helpers, Version=1.0.0.0, Culture=neutral, PublicKeyToken=a6349ee3fd01c3ec"  
      ClassName="ABC.Common.Helpers.DateFunctions" />  
</ExtensionObjects>  

You can get the AssemblyName, Version, Culture, and PublicKeyToken from the GAC (Global Assembly Cache). Navigate to the folder:
c:\windows\microsoft.net\assembly\GAC_MSIL\ABC.Common.Helpers\
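If you'd rather not dig through the GAC folders, a quick way to get the full assembly name is from PowerShell (the DLL path below is an assumption; point it at your own build output):

```powershell
# Prints the full name, e.g. "ABC.Common.Helpers, Version=1.0.0.0, Culture=neutral, PublicKeyToken=..."
[System.Reflection.AssemblyName]::GetAssemblyName("C:\SomeFolder\ABC.Common.Helpers.dll").FullName
```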

Repeat the “ExtensionObject” for each className. Note that if your .DLL has 4 classes, then you would need 4 ExtensionObjects (if you want to call all methods in all 4 classes).

I usually save this as a file called CustomExtension.xml. You could have one for each map, or one shared across all your maps.

Setup the Custom Extension for your Map

Just as you set the “Custom XSLT Path” in a previous step, click on the map grid, go to the “Properties” window, and paste or select your filename for “Custom Extension XML”.

Back to your XSLT

In the xsl:stylesheet root element, make sure you add your namespace:

xmlns:ScriptDateFunctions="http://schemas.microsoft.com/BizTalk/2003/ScriptDateFunctions"

Make sure the namespace corresponds to the class that contains the method you will call. For example, my “DateFunctions” class has a method called “XMLDateToTimeHHMMSS”.

   <xsl:value-of select="ScriptDateFunctions:XMLDateToTimeHHMMSS($parmDateTime)"/>
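Putting the pieces together, the top of the custom XSLT might look like this. This is a minimal sketch: the output element name (TimeHHMMSS) and the input field (SomeDateField) are made-up placeholders; only the prefix and namespace come from the earlier examples.

```xml
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:ScriptDateFunctions="http://schemas.microsoft.com/BizTalk/2003/ScriptDateFunctions"
    exclude-result-prefixes="ScriptDateFunctions">

  <xsl:template match="/">
    <!-- call the C# extension method via the namespace prefix -->
    <TimeHHMMSS>
      <xsl:value-of select="ScriptDateFunctions:XMLDateToTimeHHMMSS(string(//SomeDateField))"/>
    </TimeHHMMSS>
  </xsl:template>
</xsl:stylesheet>
```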

Test Your Map

Provide a sample instance to test the map: click the .btm file in Solution Explorer, then in the "Properties" window, paste in the name of a file for the "TestMap Input Instance". Now right-click the .btm filename in Solution Explorer, and click "Test Map". As usual, check for any errors in the "Output" window, and Ctrl-Click the file path at the bottom where it says "Output is stored in the following file…"

Possible Errors

Prefix ‘ScriptDateFunctions’ is not defined.
You didn’t include the namespace at the top of your XSLT file.

Summary: We have now used the BizTalk “Custom Extension XML” to allow you to make calls to C# from your custom XSLT maps.

I got this error when doing custom XSLT with a BizTalk map.

It was a case of “brain fog”. The proper keyword is xsl:choose, not xsl:case. Other languages call this a switch or a case. I just temporarily forgot that XSLT calls it a “choose”.

Example of what the xsl:choose looks like (from https://www.w3schools.com/xml/xsl_choose.asp):

<xsl:choose>
  <xsl:when test="expression">
    ... some output ...
  </xsl:when>
  <xsl:otherwise>
    ... some output ....
  </xsl:otherwise>
</xsl:choose>

Error

error btm1044: Input validation error: Non Segment level : [3] Transaction Set Control Number Mismatch

Situation

I was doing a test map on a BizTalk map in Visual Studio.
The data was in XML format, not EDI, so I was surprised it tried to enforce this rule. I would expect that to occur only in a Receive Pipeline with real EDI data.

Data

<ns0:X12_00401_204 xmlns:ns0="http://abc.com/X12/204">
	<ST>
		<ST01>204</ST01>
		<ST02>0007</ST02>
	</ST>

        <!-- middle part omitted --> 	

	<SE>
		<SE01>22</SE01>
		<SE02>0001</SE02>
	</SE>
</ns0:X12_00401_204>

Fix

In the example above, I had to set the SE02 value to the same as the ST02 value. Note also that SE01 should be the number of EDI segments, including the ST and SE segments.
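With the sample data above, the corrected trailer would look like this (SE01 is left at 22 on the assumption that there really are 22 segments in the transaction set):

```xml
<SE>
	<SE01>22</SE01>
	<SE02>0007</SE02>  <!-- must match ST02 -->
</SE>
```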

NOTE: You would presumably get the same error if you drop an EDI file into a Receive Location with an EDI pipeline.

Problem

I started working with OAGIS schemas. The schemas are very complex, and many have over 1000 entities (elements, attributes, sequences, etc…) inside.

Let's say I'm doing a BizTalk map, and I want to see if the schema has a field called "RequestedDeliveryDate". Or maybe I want to find all the dates. You can't do that in the BizTalk schema editor. You can open the .xsd schema file itself in NotePad++ and search, but it's hard to read, and hard to tell where you are in the hierarchy.

Solution

Write your own program to list out all the elements, attributes etc… with the hierarchy.

For now, I’m just using “Write-Host” statements, and when the script runs, I can copy/paste the output window to NotePad++. From there, I can do normal search commands, or even RegEx search commands.

I thought about making the script look for the element pattern, but to me, it was easier to do that in NotePad++.

Code

#
#  Author: Neal Walters 
#    Date: 1/17/2020 
#   Descr: Help find element/attribute names in the huge OAGIS schemas 
#


# Return the value of an attribute from a given xmlNode 
function GetXmlElementsAttributeValue($node, $AttributeName)
{
    # Try and get the node.
    #$node = Get-XmlNode -XmlDocument $XmlDocument -NodePath $ElementPath -NamespaceURI $NamespaceURI -NodeSeparatorCharacter $NodeSeparatorCharacter

    # If the node and attribute already exist, return the attribute's value, otherwise return null.
    if ($node -and $node.$AttributeName) { return $node.$AttributeName } else { return $null }
}

# Do the recursion, looping through all elements and their children 
# Note: the counters use the $script: scope; a plain assignment inside a 
# function would create a function-local copy, and the mainline would see zero. 
function explodeSchema($passNode)
{
    $script:depthCounter = $script:depthCounter + 1 
    $indent = "*" * $script:depthCounter 

    #Write-Host "$indent explodeSchema depth=$($script:depthCounter) NodeNameAttrValue=$($passNode.Name)" 

    $nodes = $passNode.SelectNodes($xpathEl) 
    #Write-Host "nodes count=$($nodes.count)"

    foreach ($node in $nodes) 
    {
       $script:totalNodeCount = $script:totalNodeCount + 1 

       $NodeNameAttrValue    =  GetXmlElementsAttributeValue $node "name" 
       $ref                  =  GetXmlElementsAttributeValue $node "ref" 

       $showRef = "" 
       if ($ref.length -ge 1) 
       {
          $showRef = "ref=$ref "
       }
       
       if ($node.LocalName -eq "element" -or $node.LocalName -eq "attribute")
       {
           $showName = "$($node.LocalName)=$($node.Name)"
       }
       else 
       {
           $showName  = $node.LocalName
       }

       Write-Host "$indent $showName $showRef" 

       explodeSchema $node  #go recursive here 
    }
    $script:depthCounter = $script:depthCounter - 1 
}


cls
$schemaFilename = "C:\YourFolderName\YourSchemaName.xsd"

$xpathEl = "*[local-name()='element' or local-name()='complexType' or local-name()='group' or local-name()='sequence' or local-name()='attribute']"; 

[xml] $xmlDoc = Get-Content $schemaFilename 
$node = $xmlDoc.SelectSingleNode("//*")  #get us positioned on the root element 
Write-Host "Root Node: $($node.Name)" 

$script:depthCounter = 0 
$script:totalNodeCount = 0 
explodeSchema $node 

Write-Host "totalNodeCount=$totalNodeCount" 

Sample Output

Note: The number of asterisks tells you how deep you are in the hierarchy.

Root Node: xs:schema
* element=X12_00401_204   
** complexType   
*** sequence   
**** element=ST   
***** complexType   
****** sequence   
******* element=ST01   
******* element=ST02   
******* element=ST03   
**** element=xs:element ref=B2   
**** element=xs:element ref=B2A   
**** element=xs:element ref=L11   
**** element=xs:element ref=G62 
...
* element=N2   
** complexType   
*** sequence   
**** element=N201   
**** element=N202   
* element=N3   
** complexType   
*** sequence   
**** element=N301   
**** element=N302 

What else could be done?

Ideas for the future include:
1. Showing the field type (string, int, decimal, etc…)
2. For EDI fields, could show other attributes such as N2 or R5
3. Showing min/max occurs
4. Add ability to find field containing a given lookup string (then show the parents when a match is found). For example, show all “date” elements/attributes.
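For the time being, idea #4 can be approximated by pasting the script's output into a text file and searching it from PowerShell rather than NotePad++ (the filename below is an assumption):

```powershell
# Show every line of the outline that contains "Date" (case-insensitive by default)
Get-Content "schemaOutline.txt" | Select-String -Pattern "Date"
```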

Other keywords to help find this post:
1. Recursive XPath against an XSD schema
2. XSD Schema Recursion
3. Explode Schema element names using recursion
4. Extract all schema element names to a file using a script

If you’ve been around BizTalk for a while, like me, you most certainly have gotten this error many times.

The normal solutions are:
1) Make sure your schema is deployed
2) Make sure your schema is not deployed more than once
3) Set "AllowUnrecognizedMessage" to True in the XML Disassembler
4) Make sure the file you dropped has the proper namespace and root element

To review, in BizTalk the MessageType is TargetNamespace#RootElement.

Here was the exact message I got when I dropped my test file:

The Messaging Engine failed while executing the inbound map
 for the message coming from source URL:"e:\Integration\Inbound\CPChemNew\204\*.xml" 
 with the Message Type "http://abc.com/X12/204/CPChem2#X12_00401_204". 
 
Details:"Finding the document specification by message type "ST" failed. Verify the schema deployed properly. " 

The message type looked okay.
http://abc.com/X12/204/CPChem2#X12_00401_204

But what was this "ST" in the second message?

Solution

I was converting a regular schema to an envelope schema in order to accomplish debatching. "Body XPath" is a parameter you set so that the receive XML pipeline will automatically split the message into multiple messages.

I put the "Body XPath" one element too low. It should have split on X12_00401_204, but actually split the message into an ST, and other similar segments.

Insights

Here's what I didn't understand. I'm not sure why, but after debatching, the XML Disassembler on the Receive Pipeline decides that it needs to verify that the target schema exists as well. Thus, the error on "ST".

How I Solved It

1. I did a test map in Visual Studio and got the results of the test map.
2. And then dropped that into the appropriate Receive Location (different from the one above).
3. Since I had tracking turned on, I checked the "Tracked Message Events", and saw multiple "Transmission Failed" events; looking at the body of each, I saw that each split message had a different segment (such as ST) as its root.
4. The other thing I should have noticed was that in the error, the message type should have been "http://abc.com/X12/204/CPChem2#X12_00401_204Looping" instead of "http://abc.com/X12/204/CPChem2#X12_00401_204". That should have hinted to me that the debatching was happening. X12_00401_204Looping and X12_00401_204 were so similar that I didn't notice; the lesson learned here is don't assume, and be exact.

The file I'm trying to build looks like this:

<ns0:X12_00401_204_LOOPING xmlns:ns0="http://abc.com/X12/204/CPChem2">
	<X12_00401_204>
           <ST> 
              etc... 

In EDI, a customer can send multiple 204s, but we wanted a single file with multiple 204s in it (too complex to explain here).

The next thing I need to do is make sure I have a schema that matches, and I need to add a namespace to that as well. So more work to be done, but now I'm back on track. What I need is two schemas, and the Envelope schema can import and reference the other one.

What if you need to import a bunch of BizTalk vocabulary rules (.xml files) into the BizTalk BRE (Business Rule Engine)? You don't want to do it one at a time using the Business Rule Deployment Wizard. The real question is why Microsoft didn't provide a batch version of that tool with all the same features, or incorporate those features into the BTSTask command.

Microsoft used to provide a tool called ImportExportRuleStore.exe. A reference to this tool can be found on Tallan Blog.

When you search the Microsoft site or MSDN, you can no longer find it or download it, although there are some .exe copies of it on the old CodePlex site (under the BizTalkBatchBuild project). However, I didn't want to download an un-trusted .exe. Apparently Microsoft never provided the source for it.

BTDF (BizTalk Deployment Framework) surely has some way to import rules as well, but I didn't dig through it. If it does, it's probably in an MSBuild file.

At least part of that utility now seems to be replaced with a tool called “Microsoft.Practices.ESB.RuleDeployer.exe” in the Microsoft ESB directory (if you have that installed).

You can specify parameters of “-v filename.xml” to import a vocabulary.

See related blog where I have a PowerShell script to batch import all vocabulary files in a directory.

While the GitHub page says it's only for BizTalk 2013 R2 and lower, it's just C# code that should work with any version of BizTalk, unless BizTalk makes some breaking changes in future releases. That's unlikely, though, as many customers have pipeline components and business rules.

Visual Studio 2015 Builds

To get it to work with BizTalk 2016, download the code from GitHub and open it with Visual Studio 2015. Open the solution called BREPipelineFramework.sln. For each of the four projects in that solution, right-click, select "Properties", and set the .NET Framework to 4.6.

NOTE: I had to rename BREPipelineFramework.PipelineComponent.BRPEPiplineFramework.Component, a ridiculously long name, to something shorter, such as BREPipelineFramework.PipelineComponent. The compiler gave me an error about exceeding the maximum path length. In other words, my GitHub repository was somewhere like C:\users\myname\Source\Repos\BizTalkPipelineFramework, and the subfolder structure within that made some of the names a little too long.

I will add any additional notes here as I work with installing on BizTalk 2016.

Run GACUtil on these .DLLS

I used my PowerShell version of GacUtil to deploy the following four DLLs:

$dllpath1 = "c:\Users\MyName\Source\Repos\BizTalkPipelineFramework\BREPipelineFramework\BREPipelineFramework\bin\Debug\BREPipelineFramework.dll"
GacDll $dllPath1 

$dllpath2 = "c:\Users\MyName\Source\Repos\BizTalkPipelineFramework\BREPipelineFramework\BREPipelineFramework.Helpers\bin\Debug\BREPipelineFramework.Helpers.dll"
GacDll $dllPath2 

$dllPath3 = "c:\Users\MyName\Source\Repos\BizTalkPipelineFramework\BREPipelineFramework\BREPipelineFramework.SampleInstructions\bin\Debug\BREPipelineFramework.SampleInstructions.dll"
GacDll $dllPath3

$dllPath4 = "c:\Users\MyName\Source\Repos\BizTalkPipelineFramework\BREPipelineFramework\BREPipelineFramework.PipelineComponent\bin\Debug\BREPipelineFrameworkComponent.dll"
GacDll $dllPath4 

Create a Test Application and Import the Pipelines Assembly

I created a new BizTalk Application called BizTalk.Common, and imported the resource/assembly:
c:\Users\nealw\Source\Repos\BizTalkPipelineFramework\BREPipelineFramework.BizTalk2016\BREPipelineFramework.TestProject\bin\x86\Debug\BREPipelineFramework.TestProject.dll

This project contains the .btp pipeline files that can be used for testing.
I chose not to run the full BizUnit tests, as I didn't want to take the time to install BizUnit, at least not today.

I then created a Receive Port and Location, and I was able to see and pick the pipelines from the BREPipelineFramework.TestProject:

Import Vocabulary and Policies

Import the BizTalk Business Rule Vocabularies:

“c:\Users\MyName\Source\Repos\BizTalkPipelineFramework\BRE Artifacts\Vocabularies\BREPipelineFramework.JSONInstructions.1.0.xml”

When I tried this, I got the error:
"Unable to load assembly BREPiplineFramework.JSON". Earlier, I had not compiled this project because we had no immediate need for JSON. So at this point, I went back, set its .NET Framework to 4.6, built it, and GACed it.

Then import the BizTalk Business Rule Policies:
“c:\Users\MyName\Source\Repos\BizTalkPipelineFramework\BRE Artifacts\Policies\BREPipelineFramework.JSONPolicy.1.0.xml”

But then I realized there was a whole directory of vocabularies to import (25 files in this directory):
“c:\Users\MyName\Source\Repos\BizTalkPipelineFramework\BREPipelineFramework\BRE Artifacts\Vocabularies\..”

So I did some research, and found the Microsoft.Practices.ESB.RulesDeployer.exe replacement for the old ImportExportRulestore.exe utility. If you don’t have the ESB Toolkit installed, then you might have to manually import all 25 files.

So I wrote a little PowerShell to loop through the files in the directory and call that utility:

#
#  Import an entire directory of vocabulary files into BizTalk BRE (Business Rule Engine) 
# 
cls
$sqlServer = "DLBizTalkDev1" #not sure this is supported in this utility 
$commandPath = "e:\BizTalkServer2016\ESB Toolkit 2.4\Bin\Microsoft.Practices.ESB.RulesDeployer.exe"

$vocabDir = "c:\Users\nealw\Source\Repos\BizTalkPipelineFramework\BREPipelineFramework\BRE Artifacts\Vocabularies"
$files = Get-ChildItem -Path $vocabDir -Filter "*.xml"

$processedCounter = 0 
foreach ($file in $files)
{
   $processedCounter = $processedCounter + 1 
   Write-Host "Filename=$($file.Name)"
   $cmdResults = &"$commandPath" -v $($file.FullName)
   Write-Host $cmdResults 
}
Write-Host "Processed $processedCounter vocabulary file(s)" 

I'm at a client that has the users' "My Documents" on a share drive, so that when you log on to any computer, you have your "My Documents" available.

The question is, how to get that directory/path name?

Why? You might want to write a C# program to access one of the files there, or you might want the path to put in Total Commander as a shortcut.

There may be better methods, but this is what I did.

I opened PowerShell ISE, and typed in “Get-Location” as the text of my program. I then did a “File Save As”, navigated to the “My Documents” directory and saved it.

I then pressed F5 to run it. It actually failed, but the error at least gave me the path name, as shown below:

PS C:\WINDOWS\system32> \\abcServerName\Users$\johndoe\My Documents\GetLocation.ps1
File \\abcServerName\Users$\johndoe\My Documents\GetLocation.ps1 cannot be loaded 
because running scripts is disabled on this system. For more information, see 
about_Execution_Policies at https:/go.microsoft.com/fwlink/?LinkID=135170.
    + CategoryInfo          : SecurityError: (:) [], ParentContainsErrorRecordException
    + FullyQualifiedErrorId : UnauthorizedAccess
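One of the "better methods" alluded to above: ask .NET for the folder directly, which returns the redirected share path without needing to save and run a script at all:

```powershell
# Returns the "My Documents" path, even when folder redirection points it at a share
[Environment]::GetFolderPath("MyDocuments")
```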


Amazon AWS re:Invent 2019 – Andy Jassy CEO AWS – Notes 12/04/2019

Stats: 65,000 attendees. Public sector using AWS: 10,000 academic institutions, 25,000 non-profits.

AWS Nitro

Decided to design and build chips.
Bought Annapurna Labs (chip designers/builders from Israel), and now designs and builds chips themselves.
Graviton ARM chip announced last year (A1 instances).

Article about Amazon’s acquisition of Annapurna Labs

Graviton2 chips announcement

40% better price/performance than the latest generation of x86 processors today.

3x throughput, 40% less cost than current best NVIDIA chips.
Coming to SageMaker in 2020.

AWS Fargate – serverless – tell it the CPU and memory, upload containers, and it does the rest.
Super popular with new customers – 40% start with Fargate because it's easier to create containers. It didn't work for EKS/Kubernetes, but it will now.

Amazon Fargate for Amazon EKS

31:45 CEO of Goldman Sachs, David M. Solomon (who is also a DJ)
They have 9000 engineers trying to make finance work better.
Credit Card Business – they underpin AppleCard
A 150-year-old company, new to consumer finance.
2020 – Launching Transaction Banking – built on AWS.
//.marquee – platform, APIs

Moving from Windows to Linux – has been happening for several years.
IDC estimates that in 2020 80% of workloads will be on Linux.
More vibrant community around Linux.

System Integrators (called "SIs")
Moving enterprises to the cloud is done by
companies like Slalom and Rackspace,
or born-in-the-cloud SIs like Onica and ClearScale.
Lots of choices in partners when moving to the cloud.

Data Silos – Data Lakes – S3 as Data Lake – Announcing: Amazon S3 Access Points

REDSHIFT

concurrency scaling
materialized views
Lake House
Federated Queries across data stores
Data Lake Export
Redshift 2x faster, 75% less expensive
Announcing: Redshift RA3 Instances with Managed Storage
Announcing AQUA – Advanced Query Accelerator
Allows Redshift to run up to 10x faster
speeds up compression/decryption
works on data in place on raw storage without having to move it.
Available mid 2020
UltraWarm – warm storage on steroids for Elasticsearch service

HashTag #DBFreedom – means the right tool for the job
IoT – time series – announced Timestream
crypto/ledger – QLDB
DocumentDB is MongoDB compatible. (JSON store)

Yet one was missing – Cassandra – hard to manage on premises…
Rollback was clunky, many operating on old versions.
Announcing: Amazon Managed Cassandra Service (compatible with the 3.11 release)
They now support about 10 different database formats/styles/tools.

1:20:00 Machine Learning

Start with the right data store then the right machine learning.
Twice as much Machine Learning as anybody else.

CEO of Cerner – Brent Shafer

How using Machine Learning to improve healthcare.
Learning, predicting, preventing problems.

1:30:00

Three layers of Machine Learning

Bottom layer, for experts comfortable with frameworks:
TensorFlow, PyTorch, and MXNet
85% of TensorFlow in the cloud runs on AWS;
AWS has a whole team dedicated to making that faster.
Roughly 20% faster than other platforms.
SageMaker – opens ML to more people
Ground Truth – announced a year ago, makes it easier to label data
marketplace algorithms
first reinforcement learning in a service
one-click training
Neo – train once and compile everywhere on the edge
Announcing: SageMaker Studio – development environment
– web-based IDE; collects code, data, notebooks, etc…
– makes it easier to build a model
– Announcing SageMaker Notebooks with elastic compute
– Announcing SageMaker Experiments – build and tune automatically
– Announcing SageMaker Debugger – with feature prioritization: shows which features are having an impact on your model
– Hard to find "concept drift" – Announcing SageMaker Model Monitor (detects concept drift)
Unfulfilled promises of AutoML
Announcing SageMaker Autopilot – training with no loss of visibility or control; trains 50 different models, gives you a notebook/recipe for each model, and provides a ranked leaderboard with the accuracy of each.
1:48:45 Dr Matt Wood of Artificial Intelligence
Top Layer – vision recognition, speech, Transcribe/Polly, OCR, NLP, etc…
Lex, Personalize, and Forecast
Announcing: Amazon Fraud Detector
Announcing Amazon CodeGuru (automated code review; identifies poorly performing code; provides an assessment of your code down to the line: AWS best practices, concurrency issues, incorrect handling, correct input validation; for example, improved CPU utilization 325% on Amazon Prime Day code)
People wanted analytics on their phone calls –
Announcement: Contact Lens for existing Amazon Connect
Detect positive/negative sentiment, long periods of silence,
people talking over each other, various searches, dashboards,
Adding real-time in 2020.

Announcing Amazon Kendra – new enterprise search with machine learning (NLP) – explained by Dr Matt Wood 2:10:10
Identify data, Provide FAQs, Sync and Index
Example Questions you can ask: Where is the IT Support desk? What time is it open?

Past? VMware on AWS.
What about apps still on premises?
Announcing AWS Outposts – AWS infrastructure on premises
Can use the native AWS interface (now) or VMware Cloud on AWS (2020)

AWS Local Zones – an infrastructure deployment that places compute, storage, and database services close to large cities. Starting today in LA, by invitation.

Mobile Users/Connected Devices – How to leverage 5G

2:26:40 CEO of Verizon – Hans Vestberg
“5G Edge”

AWS Wavelength – embedded at the edge of 5G
Embedded AWS infrastructure – fewer hops to get to the device.