Often, after implementing a new system or application on BizTalk, we need to provide a report to management of how many files were processed.

If you keep an archive of all files in or out, you can use that archive with a simple PowerShell script to produce the report. Here's an example I found:

PowerShell Example 1 – Files by Date

$count = @{}
$size  = @{}

Get-ChildItem d:\BizTalk\MyApp\Archive\EDI850Order\*.txt |
    ForEach-Object {
        $date = $_.LastWriteTime.ToString('dd-MMM')
        $count[$date]++
        $size[$date] += $_.Length
    }

$count.Keys |
    Sort-Object |
    ForEach-Object {
        [PSCustomObject]@{
            Date              = $_
            'Number of files' = $count[$_]
            Size              = $size[$_]
        }
    } | Format-Table -AutoSize

Sample Results of the Above PowerShell

Date   Number of files   Size
----   ---------------   ----
08-Sep              14  37263
11-Sep              19  53761
12-Sep               7  26984
13-Sep              45 147575
14-Sep              44 154050
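
One caveat on Example 1: the 'dd-MMM' keys sort alphabetically, which happens to look right when all of the files fall in a single month (as above), but days from different months can interleave (for example, 30-Aug would sort after 11-Sep). If your archive spans months, a sortable key such as 'yyyy-MM-dd' keeps the report in true date order; here is a minimal variation of the same script, assuming the same archive path, with just the format string changed:

$count = @{}
$size  = @{}

Get-ChildItem d:\BizTalk\MyApp\Archive\EDI850Order\*.txt |
    ForEach-Object {
        $date = $_.LastWriteTime.ToString('yyyy-MM-dd')   # sortable key, e.g. 2017-09-08
        $count[$date]++
        $size[$date] += $_.Length
    }

$count.Keys | Sort-Object | ForEach-Object {
    [PSCustomObject]@{ Date = $_; 'Number of files' = $count[$_]; Size = $size[$_] }
} | Format-Table -AutoSize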

Example 2 – Multiple Folders – Files by App and Date

#################################################################
#
# 
# Neal Walters - 09/25/2017
# Script: FileCountByDateAllDirs.ps1 
# Purpose: Get a summary report of files processed 
#  by each app on each date 
#  based on archive files 
#
#################################################################
cls 
$count = @{}
$size  = @{}

$items  = @(Get-ChildItem 'd:\BizTalk\EDIHorizon\Archive\EDI850Order' -r)
$items += @(Get-ChildItem 'd:\BizTalk\ZLien\ArchiveOutbound' -r)

$items |
    Where-Object { -not $_.PSIsContainer } |   # count files only; skip sub-directories returned by -r
    ForEach-Object {
        $posFirstSlash  = $_.FullName.IndexOf("\")
        $posSecondSlash = $_.FullName.IndexOf("\", $posFirstSlash + 1)
        $posThirdSlash  = $_.FullName.IndexOf("\", $posSecondSlash + 1)
        $lenApp = $posThirdSlash - $posSecondSlash - 1
        $date = $_.LastWriteTime.ToString('yyyy-MMM-dd')
        $app  = $_.FullName.Substring($posSecondSlash + 1, $lenApp)   # application folder name, e.g. EDIHorizon
        # Write-Host $app $posSecondSlash $posThirdSlash $_.FullName
        $key = $app + " " + $date
        $count[$key]++
        $size[$key] += $_.Length
    }

$count.Keys |
    Sort-Object |
    ForEach-Object {
        [PSCustomObject]@{
            App_Date          = $_
            'Number of files' = $count[$_]
            Size              = $size[$_]
        }
    } | Format-Table -AutoSize

Sample Results of the Above PowerShell

App_Date         Number of files     Size
--------         ---------------     ----
App1 2017-Sep-18              35   119159
App1 2017-Sep-19              32   105811
App1 2017-Sep-20              35   116315
App1 2017-Sep-21              12    34450
App1 2017-Sep-22              24    85952
App2 2017-Sep-20               2 15460798
App2 2017-Sep-21               2 15457187
App2 2017-Sep-22               2 15472012

With the SFTP ports in BizTalk 2013, we are using the log option, and BizTalk appends each SFTP log entry to the end of the existing log file.
So I created a PowerShell script that takes an array of directory names and runs the same rename and delete logic for each one.
I also include the logic to delete files over a certain retention period (see also the shorter code for that in a prior post: "PowerShell to Delete Old Files").


#################################################################
#
#
# Neal Walters - 09/05/2017
# Script: RenameSFTPLogsToAddDate.ps1
# Purpose: Rename a file such as "receive_internal.txt" to
#          "receive_internal_2017_09_01.txt"
#          so that each day has its own separate file of SFTP messages.
#          Also purge files over $retentionDays old.
#          Use Task Scheduler to schedule this script
#          to run once late at night or early in the morning.
#
#################################################################

cls
$retentionDays = 30
$formatDateTime = Get-Date -f _yyyy_MM_dd__HH_mm_ss
$formatDate = Get-Date -f _yyyy_MM_dd
Write-Host "formatDateTime= $formatDateTime"
Write-Host "-------------------------------------------------------"

# Array of Directory Names (logic below will process each directory you specify here)
$DirNames = "d:\BizTalk\App1\SFTPLogs\",
            "d:\BizTalk\App2\SFTPLogs\"

Foreach ($DirName in $DirNames)
{
    Write-Host "DirName=$DirName"

    # reset the counters for each directory
    $renameCount = 0
    $skipCount = 0
    $deleteCount = 0

    Get-ChildItem $DirName -Filter *.txt |
    Foreach-Object {

        $fullname = $_.FullName.ToString();
        $dirname  = $_.Directory.ToString();
        $filename = $_.Name.ToString();
        $filenameOnly = [io.path]::GetFileNameWithoutExtension($filename)
        $ext = [io.path]::GetExtension($filename)   # this includes the ".", for example .txt

        # If the filename contains "_20", we assume it already has a date in it
        # (20 being the century of 2017, 2018, 2019, etc.).
        # If no date is found, we rename the file; otherwise we leave it as it was.
        if (-Not ($filenameOnly -Match "_20"))
        {
            Write-Host "OldName $fullname"
            Write-Host "Filename=$filename"
            $newFileName = $dirname + "\" + $filenameOnly + $formatDate + $ext
            Write-Host "NewName $newFileName`n"
            Rename-Item $fullname $newFileName
            $renameCount++
        }
        else
        {
            #Write-Host "Skipping file=$filenameOnly because it already contains a date"
            $skipCount++
        }

        #purge files over $retentionDays days old
        if ($_.LastWriteTime.AddDays($retentionDays) -lt (Get-Date) -and -not $_.PSIsContainer)
        {
            Remove-Item $_.FullName -Force
            $deleteCount++
        }
    } #end of Foreach-Object

    Write-Host "Number of Files Renamed = $renameCount"
    Write-Host "Number of Files Skipped = $skipCount (for rename)"
    Write-Host "Number of Files Deleted = $deleteCount"
    Write-Host "-------------------------------------------------------"
}

The one thing I didn't account for (yet) is the case where different files can have different EDI delimiters. Technically, you should look at the ISA segment to find the delimiters and use those in the RegEx match. For now, I'm assuming the field delimiter is the asterisk (*) and the segment delimiter is the tilde (~).
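
For reference, here is a minimal sketch of how the delimiters could be derived rather than assumed. It presumes standard X12, where the ISA segment is fixed-length (106 characters), and that a variable such as $ediText (see Sample 1 below) already holds the raw interchange text:

# Sketch: derive the EDI delimiters from the fixed-length X12 ISA segment.
# Assumes $ediText holds the raw interchange text (as in Sample 1 below).
$elementSep  = $ediText.Substring(3, 1)      # character right after "ISA", e.g. *
$segmentTerm = $ediText.Substring(105, 1)    # last character of the 106-byte ISA segment, e.g. ~
$escapedSep  = [regex]::Escape($elementSep)  # safe to embed in the RegEx patterns below
Write-Host "Element separator = '$elementSep'   Segment terminator = '$segmentTerm'"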

Requirement

I was archiving the EDI files in BizTalk with the filename set to "%datetime%_%MessageID%_EDI.txt". I decided it would be better to name the files COMPANYNAME_DOCTYPE_ORDERNO_ORDERDATE_%datetime%_%MessageID%_EDI.txt.
NOTE: I could have done this logic in a custom C# BizTalk pipeline component, but decided to do it after the fact with a simpler PowerShell script that would be easier for administrative staff to maintain and update.

Sample 1 – Just test the parsing

With this sample, you can copy the contents of a file into the $ediText string, and test.

cls
#Note: substituted " with `" in the string to escape the quotes-within-quotes issue
$ediText = "ISA*00*          *00*          *ZZ*MYCUSTOMER*ZZ*MYCOUNTRY*170823*1610*U*00401*000000117*0*T*:~GS*PO*BTS-SENDER*RECEIVE-APP*170823*1610*117*T*00401~ST*850*0117~BEG*00*NE*391949**20170828~N1*BY*DELIVERY-ADDRESS~N1*ST*DELIVERY-ADDRESS~N3*1420 MAINSTREET DR~N4*DALLAS*TX*12345~PO1*1*5.00*EA*4.350**IN*106889~PID*F****SAND MIX ( SSM80 )~PO1*2*1.00*etc...~"; 

$CompanyID  = [regex]::match($ediText,'.*ISA\*.*?\*.*?\*.*?\*.*?\*.*?\*(.*?)\*.*').Groups[1].Value
$OrderNum   = [regex]::match($ediText,'.*BEG\*.*?\*.*?\*(.*?)\*.*').Groups[1].Value
$OrderDate  = [regex]::match($ediText,'.*BEG\*.*?\*.*?\*.*?\*.*?\*(.*?)[~\*].*').Groups[1].Value
$EdiDocType = [regex]::match($ediText,'.~ST\*(.*?)[~\*].*').Groups[1].Value

Write-Host "CompanyID = $CompanyID"; 
Write-Host "OrderNum = $OrderNum"; 
Write-Host "OrderDate= $OrderDate"; 
Write-Host "EdiDocType= $EdiDocType"; 

Sample 2 – Renaming Files Based on EDI Key Fields

cls

$DirName = "d:\BizTalk\EDIHorizon\Archive\EDI850Order\"

#only rename files that start with the year, 2017, 2018, etc...  thus 20*.txt 
Get-ChildItem $Dirname -Filter 20*.txt | 
Foreach-Object {

    $fullname = $_.FullName.ToString();  
    $dirname = $_.Directory.ToString(); 
    $filename = $_.Name.ToString(); 

    Write-Host "OldName $fullname"
    $content = Get-Content $_.FullName -Raw   # -Raw reads the whole file as one string for the RegEx matches below

    $CompanyID  = [regex]::match($content,'.*ISA\*.*?\*.*?\*.*?\*.*?\*.*?\*(.*?)\*.*').Groups[1].Value
    $OrderNum   = [regex]::match($content,'.*BEG\*.*?\*.*?\*(.*?)\*.*').Groups[1].Value
    $OrderDate  = [regex]::match($content,'.*BEG\*.*?\*.*?\*.*?\*.*?\*(.*?)[~\*].*').Groups[1].Value
    $EdiDocType = [regex]::match($content,'.~ST\*(.*?)[~\*].*').Groups[1].Value
    Write-Host "$OrderNum $OrderDate"

    Write-Host "Filename=$filename"
    $newFileName = $dirname + "\" + $CompanyID + "_" + $EdiDocType + "_" + $OrderNum + "_" + $OrderDate  + "_" + $filename
    Write-Host "NewName $newFileName`n" 
    Rename-Item $fullname $newFileName 
}

Having a filename like this will make it faster to search the archives for certain types of orders or files from a certain partner, or to do quick counts, based on the filename alone. For example: how many files did we get from XYZ company yesterday and today? This could be done in BizTalk with BAM as well, but my current client opted out of the overhead and complexity of BAM, especially since BizTalk was, for the most part, just passing the files around, not creating them.

The variable $EdiDocType above holds the transaction set code, such as 850, 855, 856, 810, 997, etc.

I might add one more feature. Many of the trading partners don't use a name as their ISA ID, but rather a DUNS number, phone number, or other ID number. I might add a lookup table to translate that code to a short name that represents the trading partner.
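
A sketch of that lookup could be as simple as a hashtable keyed by the ISA sender ID; the IDs and short names below are made-up examples, and unknown IDs fall back to the raw value:

# Hypothetical lookup table: raw ISA sender ID -> short partner name (example values only)
$partnerNames = @{
    '1234567890' = 'ACMECO'       # a DUNS number used as the ISA ID
    '5551234567' = 'XYZCORP'      # a phone number used as the ISA ID
    'MYCUSTOMER' = 'MYCUST'
}
$shortName = $partnerNames[$CompanyID]
if (-not $shortName) { $shortName = $CompanyID }   # unknown partner: keep the raw ID

Then $shortName would take the place of $CompanyID when building $newFileName in Sample 2.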

In a previous article, we discussed how to use a PowerShell job to delete files over a certain age.

But what if you want to do it with a straight "pure" DOS .bat or .cmd file?  The following shows how it's done:

REM Cleanup all files more than 7 days old
e:
cd E:\BizTalk\App\Backup
forfiles /S /M *.* /D -7 /C "cmd /c del @path"

NOTE, however, that UNC paths are not supported. You CANNOT even specify a UNC name to a remote server in the /P (path) parameter, so an attempt like the following fails:

REM Attempt to clean up all files more than 14 days old on a remote share (this does NOT work)
forfiles /P \\server\BizTalk\App\Backup /S /M *.* /D -14 /C "cmd /c del @path"
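
If the files really do live on a remote share, one workaround is to fall back to PowerShell, which handles UNC paths fine. A minimal sketch (the share name is just the example from above):

# PowerShell equivalent of the forfiles cleanup, which does accept a UNC path
$retentionDays = 14
Get-ChildItem "\\server\BizTalk\App\Backup" -Recurse |
    Where-Object { -not $_.PSIsContainer -and $_.LastWriteTime -lt (Get-Date).AddDays(-$retentionDays) } |
    Remove-Item -Force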

The "forfiles" command is documented in Microsoft's command-line reference and briefly recapped below:

It selects a file (or set of files) and executes a command on them.

 

  • With forfiles, you can run a command on or pass arguments to multiple files. For example, you could run the type command on all files in a tree with the .txt file name extension. Or you could execute every batch file (*.bat) on drive C, with the file name “Myinput.txt” as the first argument.
  • With forfiles, you can do any of the following:
    • Select files by an absolute date or a relative date by using the /d parameter.
    • Build an archive tree of files by using variables such as @FSIZE and @FDATE.
    • Differentiate files from directories by using the @ISDIR variable (both are illustrated in the example after this list).
    • Include special characters in the command line by using the hexadecimal code for the character, in 0xHH format (for example, 0x09 for a tab).
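
As a quick illustration of the @ISDIR and @FSIZE variables (the path below is hypothetical), the following lists each file with its size and date while skipping directories:

forfiles /P D:\BizTalk\App1\Archive /S /M *.* /C "cmd /c if @isdir==FALSE echo @path @fsize @fdate"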

 

I recently showed a VBScript to archive/move XML files to a subfolder. This is often needed when you have been archiving or storing thousands of XML files and the directory/folder is very slow to open due to the large number of files. Now, we will do it in PowerShell.

Create the subfolders ahead of time. With some minor improvements we could do that in the code (see the sketch at the end of this post), but I was in a hurry today when I needed this…

Get-ChildItem "201604*.xml" | ForEach { move -path $_ -destination ($_.directoryname +"\Archive\2016_04\"+ $_.Name)}

This does the following:
1) Selects all files starting with "201604" (in my case, the files began with yyyymmdd).
2) Pipes that into a ForEach-Object loop.
3) Runs the "Move" cmdlet.
4) The file currently in the loop is represented by the $_ symbol.
5) Builds the destination directory: $_ again is the iterator of the loop, i.e. the FileInfo object, so we can get its directory and Name. There, we insert the 2016_04 subfolder for the month.
So yes, you can make this a lot fancier, but it’s a start…
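
For example, here is a sketch of a fancier version that derives the year/month subfolder from the yyyymmdd filename prefix (an assumption based on my file naming above) and creates the archive subfolder on the fly:

# Sketch: derive the year/month subfolder from the yyyymmdd filename prefix,
# create it if it does not exist yet, then move the file into it.
Get-ChildItem "20*.xml" | ForEach-Object {
    $yearMonth  = $_.Name.Substring(0, 4) + "_" + $_.Name.Substring(4, 2)      # e.g. 2016_04
    $archiveDir = Join-Path $_.DirectoryName ("Archive\" + $yearMonth)
    if (-not (Test-Path $archiveDir)) { New-Item $archiveDir -ItemType Directory | Out-Null }
    Move-Item -Path $_.FullName -Destination (Join-Path $archiveDir $_.Name)
}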

Business Problem/Scenario

I had a .bat file with 50 lines or more, and many of them had disk paths. We were migrating this to production, so I did "replace all" commands to change all the paths to the production SAN/server names. I knew some of the paths existed and some didn't, so I wanted to find all the paths that didn't exist, so that either:
1) I could fix the path name, or
2) I could create the path on the disk

So I needed to parse the file looking for file/path names. At first I tried RegEx, but then decided that just using “Split” was faster in my case. (Sometimes you just want to get the job done in the shortest amount of time.)

The following works when you have a prefix on each directory path. I'm sure there are variations you could make on this depending on your filenames. I'm only looking for lines that contain .exe, because the .bat file runs various C# programs to process the files.

Sample file Test.bat:

line1 Small.exe \\MyServer\Messages\Dir1 and more words
line2 Biggertest2.exe \\MyServer\Messages\Dir1 parm2 \\MyServer\Messages\Dir2 parm4

Sample Powershell Code:

$filename = "c:\Users\MyName\Documents\Powershell\Test.bat"
$linesOfFile = Get-Content $filename 
$pathPrefix = "\\MyServer" 
cls
foreach ($line in $linesOfFile) 
  {
     #Write-Host $line 
     if ($line.Contains(".exe")) 
       {
          #Write-Host 
          #Write-Host $line

          $tokens = $line -split " "
          foreach ($token in $tokens) 
            {
                if ($token.Contains($pathPrefix)) 
                    {
                       #Write-Host $token 
                       if (-Not (Test-Path $token))
                          {
                             Write-Host "Not Found: $token "
                          }
                       else 
                          {
                             #Write-Host "Found: $token "
                          }

                    }
            }

       }
  }

Results (Output):

Not Found: \\MyServer\Messages\Dir1
Not Found: \\MyServer\Messages\Dir1
Not Found: \\MyServer\Messages\Dir2

Subsequent Improvements:

Make the whole line upper case, and ignore lines that start with "REM" (remarks/comments).
A future enhancement could also make sure that the .exe files themselves exist (see the sketch after the code fragment below).

     #before loop
     $pathPrefix = $pathPrefix.ToUpper() 

     #inside loop 
     $line = $line.ToUpper()  
     if ($line.Contains(".EXE") -and -not($line.StartsWith("REM"))) 
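
A sketch of that future enhancement, flagging .exe tokens that don't exist (it assumes the .exe appears in the line as a full path, or is resolvable from the script's working directory):

     #inside the foreach ($token in $tokens) loop: also flag missing .exe files
     if ($token.EndsWith(".EXE") -and -not (Test-Path $token))
        {
           Write-Host "Exe Not Found: $token"
        }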

With BizTalk, we often archive files in a subdirectory. Personally, I would rather archive to a column in a SQL database, but that takes a little more architecture and selling. So in the meantime, many clients continue to write files to disk. There are frequently clean-up jobs that delete files over x days old.

However, when there are many thousands of files in a directory, it can take a long time to open that directory and display the files, especially when you are accessing it from a remote computer.

The $DaysBack parameter can be set to non-zero if you want to only group files over x days old into sub-directories.

#Move files (for example BizTalk archive XML files)
#into subfolders based on date
cls

$SourceDir = "C:\TestFiles\"
$DestinationDir = "C:\TestFiles\"
$DaysBack = 0

$files = Get-ChildItem $SourceDir *.*
Write-Host "File Count = $($files.Count)"

foreach ($file in $files)
{
    $NewSubDirectory = $DestinationDir + $file.LastWriteTime.Date.ToString('yyyy-MM-dd')

    #Create $NewSubDirectory if it doesn't already exist
    if (!(Test-Path $NewSubDirectory))
       {
       New-Item $NewSubDirectory -type directory
       }

    if ($DaysBack -gt 0)
       {
       #only move files older than $DaysBack days
       if ($file.LastWriteTime -lt (Get-Date).AddDays(-1 * $DaysBack).Date)
          {
          Move-Item $file.FullName $NewSubDirectory
          }
       }
    else
       {
       Move-Item $file.FullName $NewSubDirectory
       }
}

 

You will need full write/rename access to run this script. You can specify a UNC name in the $SourceDir and $DestinationDir variables.
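
For example (the server and share names below are hypothetical):

$SourceDir      = "\\MyFileServer\BizTalkArchive\App1\"
$DestinationDir = "\\MyFileServer\BizTalkArchive\App1\"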

The code above is based on a code sample found here: http://www.marcvalk.net/2012/06/powershell-moving-files-into-subfolder-based-on-date/

Before – could be thousands of files in one directory (screenshot: Powershell_MoveFilesSubDir_Before)

After – files in neat subdirectories by date (screenshot: Powershell_MoveFilesSubDir_After)

Sometimes you need to mass replace all the text string for all files in a directory, or at least all files matching some file mask.

Here’s a quick sample that I put together.

As a BizTalk consultant, I deal with data coming in from customers or trading partners. Sometimes that data needs to be scrubbed. We were doing multiple rounds of testing, and the trading partner was going to put a fix in place in a few days, but in the meantime I was having to hand-edit each and every file before putting it into BizTalk. I was really getting tired of that process, so I wrote the script below.

# Fix various issues in certain EDI files
cls
$path = "c:\MyPath\"
$files = Get-ChildItem $path -filter "850*.edi"
#$files  #use this to just display all the filenames
foreach ($file in $files)
{
   Write-Host "`n`nfixing file= $($file.Name)"
   $filetext = Get-Content $file.FullName -Raw
   # The -Raw option (line above) brings all text into one string, without dividing it into lines
   Write-Host "Old Text in file: $($file.Name)"
   Write-Host $filetext

   #example of replacing regular text
   $filetextNew = $filetext     -replace "Texas",        "TEXAS"
   #examples of changing EDI tags (the caret must be escaped in the match pattern, but not in the replacement)
   $filetextNew = $filetextNew  -replace "\^WACO",       "^DALLAS"
   $filetextNew = $filetextNew  -replace "\^EXCELLENT",  "^EU"
   $filetextNew = $filetextNew  -replace "\^MODUSE",     "^MU"
   $filetextNew = $filetextNew  -replace "\^LIGHTUSE",   "^LU"
   $filetextNew = $filetextNew  -replace "\^HEAVYUSE",   "^HU"

   Write-Host "NEW:"
   Write-Host $filetextNew
   Set-Content $file.FullName $filetextNew
}

I was modifying an EDI file, so I wanted to make sure that the string I was modifying started with the caret symbol. For example, I really wanted to change "^MODUSE" to "^MU". Since the caret symbol has special meaning in RegEx (Regular Expressions), I had to put a backslash in front of it in the match pattern as an escape character (the replacement string, on the other hand, is not a RegEx, so no backslash is needed there). I added the first line, changing "Texas" to "TEXAS", to show that the backslash and caret symbol are not needed for normal text replacement.

The one bug I had in the code above was specifying $file.Name instead of $file.FullName. It almost drove me crazy. It seemed to be returning the filename itself, rather than the contents of the file; probably because that file didn’t exist in the current directory in which the PowerShell script itself was running.

In the past, I used to use a utility called “BK-Replace’Em”, which is now called by the more generic name “Replace Text”. You can download a free copy from EcoByte here. It can do the same thing as above, without writing any code. The only thing is you need to be able to download and install it on your server, and I didn’t want to do that on the various servers that I’m currently working on.

I was playing with SQLPS for the first time and wanted to save my samples for future use.

Code

cls 
Write-Host "Start" 
Import-Module SQLPS -DisableNameChecking
Write-Host "Done with Import-Module" 
cd SQLSERVER:\
DIR

Write-Host "`n`nList of Instances (not working?)" 
cd SQLSERVER:\SQL\localhost 
Get-ChildItem | Select instancename  

Write-Host "`n`nList of Databases on Local Server default instance" 
cd SQLSERVER:\sql\localhost\DEFAULT\Databases   # specify "DEFAULT" if you have no instance name 
Get-ChildItem | Select name 

Write-Host "`n`nAlternate List of Databases on Local Server default instance" 
Invoke-SQLcmd -ServerInstance '.' -Database master 'select name, database_id, create_date from sys.databases' | Format-Table


Write-Host "`n`nList of tables in NealDemo Database" 
cd SQLSERVER:\sql\localhost\DEFAULT\Databases\NealDemo\Tables
Get-ChildItem 

Write-Host "`n`nRun some SQL Command" 
Invoke-Sqlcmd -Query "SELECT @@VERSION, db_name();"


#$sqlpath = "SQLSERVER:\sql\localhost\DEFAULT\Databases\NealDemo\Tables"
#dir
Write-Host "End" 

Output

Start
Done with Import-Module

Name            Root                           Description                             
----            ----                           -----------                             
DAC             SQLSERVER:\DAC                 SQL Server Data-Tier Application        
                                               Component                               
DataCollection  SQLSERVER:\DataCollection      SQL Server Data Collection              
SQLPolicy       SQLSERVER:\SQLPolicy           SQL Server Policy Management            
Utility         SQLSERVER:\Utility             SQL Server Utility                      
SQLRegistration SQLSERVER:\SQLRegistration     SQL Server Registrations                
SQL             SQLSERVER:\SQL                 SQL Server Database Engine              
SSIS            SQLSERVER:\SSIS                SQL Server Integration Services         
XEvent          SQLSERVER:\XEvent              SQL Server Extended Events              
DatabaseXEvent  SQLSERVER:\DatabaseXEvent      SQL Server Extended Events              
SQLAS           SQLSERVER:\SQLAS               SQL Server Analysis Services            


List of Instances

InstanceName : 



List of Databases on Local Server default instance

Name : EMPLOYEES


Name : NealDemo


Name : ReportServer


Name : ReportServerTempDB


Name : VideoGenerator



Alternate List of Databases on Local Server default instance
WARNING: Using provider context. Server = localhost.



name                                                                        database_id create_date                                
----                                                                        ----------- -----------                                
master                                                                                1 4/8/2003 9:13:36 AM                        
tempdb                                                                                2 4/16/2015 11:42:24 PM                      
model                                                                                 3 4/8/2003 9:13:36 AM                        
msdb                                                                                  4 2/20/2014 8:49:38 PM                       
ReportServer                                                                          5 12/11/2014 5:42:06 PM                      
ReportServerTempDB                                                                    6 12/11/2014 5:42:07 PM                      
NealDemo                                                                              7 1/8/2015 1:31:34 PM                        
VideoGenerator                                                                        8 1/12/2015 2:08:16 PM                       
EMPLOYEES                                                                             9 1/20/2015 2:21:54 PM                       




List of tables in NealDemo Database

Schema                       Name                           Created               
------                       ----                           -------               
Audit                        AuditAllExclusions             2/4/2015 8:21 AM      
Audit                        AuditBaseTables                2/4/2015 8:21 AM      
Audit                        AuditDetail                    2/4/2015 8:21 AM      
Audit                        AuditDetailArchive             2/4/2015 8:21 AM      
Audit                        AuditHeader                    2/4/2015 8:21 AM      
Audit                        AuditHeaderArchive             2/4/2015 8:21 AM      
Audit                        AuditSettings                  2/4/2015 8:21 AM      
Audit                        SchemaAudit                    2/4/2015 8:21 AM      
dbo                          Employee                       2/4/2015 3:55 PM      


Run some SQL Command
WARNING: Using provider context. Server = localhost, Database = NealDemo.

Column1 : Microsoft SQL Server 2014 - 12.0.2000.8 (X64) 
              Feb 20 2014 20:04:26 
              Copyright (c) Microsoft Corporation
              Developer Edition (64-bit) on Windows NT 6.1 <X64> (Build 7601: Service Pack 1)
          
Column2 : NealDemo

End



PS SQLSERVER:\sql\localhost\DEFAULT\Databases\NealDemo\Tables> 

Sometimes you need to quickly do a mass rename of a large number of files in a directory.

Rename a file like this: 
ABC_20150321124112_1801.xml 
to a filename like this:
XYZ_201503211241.xml

Sample Code

cls
cd "C:\TempRename" 
Get-ChildItem -Filter *.xml | Foreach-Object{   
   $NewName = $_.Name -replace "ABC_(.*?)\d{2}_\d{4}.xml", "XYZ_`$1.xml"
   Write-Host $NewName
   Rename-Item -Path $_.FullName -NewName $NewName
}

Step by Step Explanation

1. The CD shouldn't be needed, but if you are running PowerShell ISE with a different current directory, it can be helpful.
2. Get-ChildItem returns all files in the current directory with the mask *.xml
3. Then, for each matching file, do what is inside the curly brackets of the Foreach-Object loop. If you know what you are doing, you can pipe straight to Rename-Item without the Foreach-Object, but I like to break it down so I can add the debug Write-Host statements and do a simulation run (by commenting out the actual Rename-Item statement) before the final rename.
4. The -replace operator tells us that we are doing a RegEx replace. Here I'm changing a date like YYYYMMDDHHMMSS_xxxx to YYYYMMDDHHMM. (This was a requirement of the customer. The downside is that you could get duplicate names on the rename if more than one file was created in the same minute, but that was not our issue.)
I'm using the parentheses to capture the YYYYMMDDHHMM string, and then `$1 substitutes that captured string back into the new filename. The grave-accent (backtick) is the escape character that tells PowerShell I don't want it to expand a variable named $1 (which would have a value of null or empty string, because no such variable exists). $1 is used only by the RegEx replace; it's not a real PowerShell variable. (See the short demonstration after this list.)
5. Write-Host shows the new filename.
6. Do the actual rename. Just comment out this line with # at the beginning to do a simulation run and verify the names.
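
To see the difference the backtick makes, you can run these two lines in a console (a simplified pattern, just for illustration):

"ABC_201503211241.xml" -replace "ABC_(.*?)\.xml", "XYZ_`$1.xml"   # -> XYZ_201503211241.xml (the capture is kept)
"ABC_201503211241.xml" -replace "ABC_(.*?)\.xml", "XYZ_$1.xml"    # -> XYZ_.xml ($1 was expanded as an empty PowerShell variable first)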