Use the Oracle EXTRACT function (available in both SQL and PL/SQL) to pull the year, month, and day into separate columns for sorting/grouping:

select 
   EXTRACT(YEAR FROM Creation_Date) as Year, 
   EXTRACT(MONTH FROM Creation_Date) as Month, 
   EXTRACT(DAY FROM Creation_Date) as Day, 
   Creation_Date 
from PO.PO_HEADERS_INTERFACE  POI

Date fields in Oracle hold the time, but by default, the time is not displayed.

Below is a sample of how to format the date with time:

   TO_CHAR(CREATION_DATE, 'YYYY-MM-DD HH24:MI:SS') as CREATION_DATE

Example:

select Creation_Date, TO_CHAR(CREATION_DATE, 'YYYY-MM-DD HH24:MI:SS') as CREATION_DATE_Fmt 
from PO.PO_HEADERS_ALL
where Creation_Date is not null 
order by Creation_Date desc 

Notice how the time does not appear by default in the first column (the raw Creation_Date).

Here’s a sample of how I GAC and un-GAC DLLs from PowerShell. I think I got this on StackOverflow, but don’t have the link handy now.
(GAC, of course, refers to the .NET Global Assembly Cache. All BizTalk artifacts and C# programs must be in the GAC to be called from a BizTalk orchestration.)


function GacDll($dllpath)
{
    Write-Host " "  # blank line
    if (!(Test-Path $dllpath))
    {
        Write-Host "`nDLL path not found: $dllpath`n`n"
        exit
    }
    # Load System.EnterpriseServices to get access to the Publish class
    [System.Reflection.Assembly]::Load("System.EnterpriseServices, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a")
    $publish = New-Object System.EnterpriseServices.Internal.Publish
    $publish.GacInstall($dllpath)
    Write-Host "Published to GAC: $dllpath"
}

function UnGacDll($dllpath)
{
    if (!(Test-Path $dllpath))
    {
        Write-Host "`nDLL path not found: $dllpath`n`n"
        exit
    }
    [System.Reflection.Assembly]::Load("System.EnterpriseServices, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a")
    $publish = New-Object System.EnterpriseServices.Internal.Publish
    $publish.GacRemove($dllpath)
    Write-Host "Removed from GAC: $dllpath"
}

cls

$dllpath1 = "c:\VSTS\ABC.Oracle\ABC.OracleDB.Helpers\ABC.OracleDB.Helpers\bin\Debug\ABC.OracleDatabase.Helpers.dll"
UnGacDll $dllPath1 

$dllpath2 = "c:\VSTS\ABC.BizTalkNonDB.Helpers\ABC.BizTalkNonDB.Helpers\bin\Debug\ABC.BizTalkNonDB.Helpers.dll"
GacDll $dllPath2 
 
$date = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
Write-Host "`nDate/Time Completed: $date "

Error in the Execution Result

Response:
{
  "errorType": "Runtime.ImportModuleError",
  "errorMessage": "Error: Cannot find module 'decode-verify-jwt'"
}

and also the following text appears:

    
   Lambda can't find the file decode-verify-jwt.js. Make sure that your handler upholds the format: file-name.method.

Solution

I uploaded the file by using a zip. I zipped the parent directory, so my starting module was not in the root directory, but one level under that.

So there are two ways to fix it:
1) Include the parent directory in the “Handler” setting, for example: parentDirectoryName/decode-verify-jwt.handler

2) Rebuild your zip without the parent directory and upload the zip again.

Overall, it’s a mismatch between the handler and the code in your directory structure.

I needed to implement RawString functionality to be able to read a non-XML file into an orchestration. Scott Colestock had a great article on this on his TraceOfThought blog, but when I needed it the other day, the site had been hacked (hopefully only temporarily).

1. Deploy the C# code toward the bottom of this post; it is not a pipeline component, so it can be added as a new class to an existing C# helper library you might have, or you can put it in its own assembly.

2. Add a reference to this C# Assembly in your orchestration.

3. In your orchestration, create a multipart message type. It will have only one part, but that will be the RawString class from the code below.

4. In the orchestration, create a message of type System.Xml.XmlDocument, and assign this to your activating receive. But wait Neal, I thought you said we wanted to receive a non-XML message. Trust me, this will work.

5. In the orchestration, add a Message Assignment shape and code similar to below:

//convert the message received to a new message with a multiple-part-type of RawString 
msg_rcv_RawString.MessagePart_1 = msg_XML_Doc; 

6. At this point, you can access the message as a string, for example:

varText = msg_rcv_RawString.MessagePart_1.ToString();

Voila – you now have the text of the file in a variable called varText. You can then parse it, or do whatever you want with it.

The code below needs to be in a C# helper class that you can access within your orchestration. (Obviously give it a strong name and GAC it.) I can’t remember which site I got this code from, but I think it’s out there on several.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using System.Xml;
using System.Xml.Serialization;
using System.Runtime.Serialization;
using System.IO;
using Microsoft.XLANGs.BaseTypes;

namespace REI.BizTalkNonDB.Helpers
{

    public abstract class BaseFormatter : IFormatter
    {
        public virtual SerializationBinder Binder
        {
            get { throw new NotSupportedException(); }
            set { throw new NotSupportedException(); }
        }

        public virtual StreamingContext Context
        {
            get { throw new NotSupportedException(); }
            set { throw new NotSupportedException(); }
        }

        public virtual ISurrogateSelector SurrogateSelector
        {
            get { throw new NotSupportedException(); }
            set { throw new NotSupportedException(); }
        }

        public abstract void Serialize(Stream stm, object obj);
        public abstract object Deserialize(Stream stm);
    }


    public class RawStringFormatter : BaseFormatter
    {
        public override void Serialize(Stream s, object o)
        {
            RawString rs = (RawString)o;
            byte[] ba = rs.ToByteArray();
            s.Write(ba, 0, ba.Length);
        }

        public override object Deserialize(Stream stm)
        {
            StreamReader sr = new StreamReader(stm, true);
            string s = sr.ReadToEnd();
            return new RawString(s);
        }
    }

    [CustomFormatter(typeof(RawStringFormatter))]
    [Serializable]
    public class RawString
    {
        [XmlIgnore]
        string _val;

        public RawString(string s)
        {
            if (null == s)
                throw new ArgumentNullException();
            _val = s;
        }

        public RawString()
        {
        }

        public byte[] ToByteArray()
        {
            return Encoding.UTF8.GetBytes(_val);
        }

        public override string ToString()
        {
            return _val;
        }
    }


}

My goal here was to send myself an email of all new files received from vendors. This is assuming the files are not picked up and processed by BizTalk automatically. If that were the case, you might be able to point the folders to your archive directories and still achieve the same results.

In my particular case, we have a production server exposed to our vendors. They send us test files, and I have to manually copy them and process them on our test server. At this time, we don’t have any access from one server to the other. So the email alert will prompt me to go process the files, rather than waiting for an email from the vendor.

#
#      Name: Neal Walters
#      Date: 2019/07/29 
#
#   Purpose: Send email listing any new test files received. 
#            Files are received on Prod server that is exposed to outside world. 
#            We have to manually copy these files to our test system and process them. 
#
#

cls

$minutes = $args[0]

# Allow minutes to be passed as a command-line argument; default to 60
if ($null -eq $minutes)
   {
     $minutes = 60
   }
   

$filterDate = (Get-Date).AddDays(-6) 
Write-Host "filterDate=$filterDate" 

$folderMask = "c:\BizTalk\Vendors\*\AS2FilesReceived\*.*"
#-Exclude "*997*.*"
$files = Get-ChildItem -Path $folderMask -File | Where-Object {$_.CreationTime -gt (Get-Date).AddMinutes(-$minutes)} | Sort-Object FullName
Write-Host "Number of Matching AS2 Files = $($files.Count)" 

$folderMask = "f:\SFTPFiles\*\*Test\From*To*\*.*"
$FTPfiles = Get-ChildItem -Path $folderMask -File | Where-Object {$_.CreationTime -gt (Get-Date).AddMinutes(-$minutes)} | Sort-Object FullName
Write-Host "Number of Matching SFTP Files = $($FTPfiles.Count)" 


$fileCount = 0 
$HTMLmessage = "<Table border=1>"

foreach ($file in $files) 
{
     $fileCount = $fileCount + 1 
     Write-Host "$fileCount $($file.FullName) $($file.CreationTime)`n" 
     $HTMLmessage = $HTMLMessage + "<tr><td>$fileCount $($file.FullName)</td><td> $($file.CreationTime)</td></tr>`n" 

}

foreach ($file in $FTPfiles) 
{
     $fileCount = $fileCount + 1 
     Write-Host "$fileCount $($file.FullName) $($file.CreationTime)`n" 
     $HTMLmessage = $HTMLMessage + "<tr><td>$fileCount $($file.FullName)</td><td> $($file.CreationTime)</td></tr>`n" 

}

$HTMLmessage = $HTMLMessage + "</Table>"

#[string[]] $toEmails = "nwalters@rogent.com", "bshaw@rogent.com" # List of toEmails to email your report to (separate by comma) or use a distribution list 
[string[]] $toEmails = "nwalters@rogent.com" # List of toEmails to email your report to (separate by comma) or use a distribution list 

$smtpServer = "smtp.rogent.com" 
$fromemail = "noreplyFilesReceived@rogent.com"
$emailSubject = "BizTalk EDI Files Received"

if ($fileCount -gt 0)
{
   send-mailmessage -from $fromemail -to $toEmails -subject $emailSubject  -BodyAsHTML -body $HTMLmessage -priority High -smtpServer $smtpServer
   $traceDateTime = get-date
   Write-Host "$traceDateTime Email sent to: $toEmails (size in bytes=$($HTMLmessage.length))" 
}



How to get files created in the most recent x days.

$days = -3 
$folderMask = "c:\BizTalk\Vendors\*\AS2FilesReceived\*.*"
$files = Get-ChildItem -Path $folderMask -File | Where-Object {$_.CreationTime -gt (Get-Date).AddDays($days)} | Sort-Object FullName
Write-Host "Number of New AS2 Files = $($files.Count)" 

First run npm to install js-sha3 (see GitHub: https://github.com/emn178/js-sha3).
This example shows how to do it without using the web3 library, in case you don’t have an ETH node to connect to, or can’t get that library working.


npm install js-sha3

The subroutines below come from either the GitHub repo above or https://ethereum.stackexchange.com/questions/1374/how-can-i-check-if-an-ethereum-address-is-valid

const keccak256V = require('js-sha3').keccak256;
const sha3_256 = require('js-sha3').sha3_256;

function toChecksumAddress (address) {
  address = address.toLowerCase().replace('0x', '')
  var hash = keccak256V(address);
  var ret = '0x'

  for (var i = 0; i < address.length; i++) {
    if (parseInt(hash[i], 16) >= 8) {
      ret += address[i].toUpperCase()
    } else {
      ret += address[i]
    }
  }

  return ret
}

/**
 * Checks if the given string is an address
 *
 * @method isAddress
 * @param {String} address the given HEX adress
 * @return {Boolean}
*/
var isAddress = function (address) {
    if (!/^(0x)?[0-9a-f]{40}$/i.test(address)) {
        // check if it has the basic requirements of an address
        return false;
    } else if (/^(0x)?[0-9a-f]{40}$/.test(address) || /^(0x)?[0-9A-F]{40}$/.test(address)) {
        // If it's all small caps or all all caps, return true
        return true;
    } else {
        // Otherwise check each case
        return isChecksumAddress(address);
    }
};

/**
 * Checks if the given string is a checksummed address
 *
 * @method isChecksumAddress
 * @param {String} address the given HEX adress
 * @return {Boolean}
*/
var isChecksumAddress = function (address) {
    // Check each case
    address = address.replace('0x','');
    // EIP-55 checksums are computed with Keccak-256, not the standardized SHA3-256
    var addressHash = keccak256V(address.toLowerCase());
    for (var i = 0; i < 40; i++ ) {
        // the nth letter should be uppercase if the nth digit of casemap is 1
        if ((parseInt(addressHash[i], 16) > 7 && address[i].toUpperCase() !== address[i]) || (parseInt(addressHash[i], 16) <= 7 && address[i].toLowerCase() !== address[i])) {
            return false;
        }
    }
    return true;
};

var originalAddress = "0x41b418a9bea5c6652a5fb6674370126e828b50fe"; 
var checkSumAddress = toChecksumAddress(originalAddress); 
console.log("checkSumAddress=" + checkSumAddress);

var isAddressResult = isAddress(originalAddress); 
console.log("isAddressResult=" + isAddressResult); 

You can validate the address here:
https://tokenmarket.net/ethereum-address-validator

The following is code I copy and paste a lot into my various PowerShell programs.

The commented-out line creates a backup zip file; for that, I like to have the date/time in the file name.

$fmtDateTime = $(get-date -f "MM/dd/yyyy HH:mm:ss")
Write-Host "Start Time: $fmtDateTime"

$fileDateTime = $(get-date -f "MM_dd_yyyy__HH_mm_ss")
Write-Host "File DateTime: $fileDateTime"

$filename = "C_Temp_$fileDateTime" 

#.zip suffix is automatically added 
#Compress-Archive -Path C:\Temp -DestinationPath f:\Backup\TempDir\$filename

Use these queries to find database tables for an owner (namespace/schema), tables whose names contain a certain phrase, all columns in a table, or columns whose names contain a certain phrase.

-- All tables for an owner (prefix before table name) 
SELECT owner, table_name FROM all_tables where owner = 'PO';

-- Find table names 
SELECT owner, table_name FROM all_tables where Table_Name like '%INVOICE%INTER%' order by Owner, Table_Name;
SELECT owner, table_name FROM all_tables where lower(Table_Name) like '%pos_shipping_addresses%' order by Owner, Table_Name;

-- Find column names (along with data type and length/size) 
SELECT table_name, column_name, data_type, data_length from dba_tab_columns where lower(column_name) like '%ship%' and upper(table_name) = 'PO_HEADERS_INTERFACE';