2015-10-27

Communication between PowerShell script and jobs

I had an interesting task of collecting and displaying some runtime metrics during a benchmark. Since the data was clearly split between the OS and processes, I opted for separate background jobs so that neither task could stall the other. However, this meant I had to establish communication between the main script and the jobs, which is tricky in PowerShell (see this bug for example). Immediately, I thought of using named pipes or files (as I have a small, ever-changing set of data). Further investigation revealed four ways of accomplishing this:
  1. Using files
    • pros:
      • Tried and tested; works everywhere.
      • Intuitive.
      • File management routines are mature.
    • cons:
      • Resource locks/leaks are possible.
      • Not really fit for large amounts of data.
      • Local only.
  2. Using named pipes
    • pros:
      • Tried and tested; works everywhere.
      • Network.
    • cons:
      • Complicated to get right.
      • Blocking: the pipe has to be consumed before proceeding.
      • Async mode is even more complicated to manage.
      • One has to ensure a unique pipe name.
      • No easy way to do cleanup if things go wrong.
  3. Using custom (Engine)Events
    • pros:
      • "Proper" Windows way.
      • No "weird" code in script.
    • cons:
      • Hard to control; not a fit for all scenarios.
  4. Using UDP/IP
    • pros:
      • Fast.
      • Fit for purpose (packet loss is acceptable).
    • cons:
      • Getting a free ephemeral port.
      • Needs an additional transport for the initial port negotiation.

Using files:

       
$job1 = Start-Job -ScriptBlock {
...
    #Permanent values.
    $job1out = (dir Env:\TEMP).Value + '\job1out.txt'
    $job1in = (dir Env:\TEMP).Value + '\job1in.txt'
    if (Test-Path $job1in) { <# Ready to go. #> }  #Block comment: a trailing # would swallow the closing brace.

    do {
        #Do the work, pipe output to file:
...  
            Select-Object -First 20 | FT * -Auto 1> $job1out
}

$job2 = Start-Job -ScriptBlock {
    #Permanent values.
    $job2out = (dir Env:\TEMP).Value + '\job2out.txt'
    $job2in = (dir Env:\TEMP).Value + '\job2in.txt'
    if (Test-Path $job2in) { <# Ready to go. #> }

    do {
        #Do the work
...  
        #Utilize fast FS classes to output the result:
        $stream = [System.IO.StreamWriter] $job2out
        $stream.WriteLine("")
        $stream.Close()
... }
}

#Consolidate in main script code:
$job1out = (dir Env:\TEMP).Value + '\job1out.txt'
$job2out = (dir Env:\TEMP).Value + '\job2out.txt'
do {
    $l = Get-Content $job1out -ErrorAction SilentlyContinue
    $tc = Get-Item $job1out -ErrorAction SilentlyContinue
    #Make sure to display only fresh data.
    if (($l) -and ($tc) -and ($tc.LastWriteTime -gt $lastTimeHeader)) {
        $lastTimeHeader = Get-Date
    } else { $l = $null }

    $l1 = Get-Content $job2out -ErrorAction SilentlyContinue
    $tc = Get-Item $job2out -ErrorAction SilentlyContinue
    #Make sure to display only fresh data.
    if (($l1) -and ($tc) -and ($tc.LastWriteTime -gt $lastTimeProc)) {
        $lastTimeProc = Get-Date
    } else { $l1 = $null }
...
}
#Do the cleanup:
    Stop-Job -Job $job1
    $null = Receive-Job $job1
    Remove-Job $job1

    Stop-Job -Job $job2
    $null = Receive-Job $job2
    Remove-Job $job2
    
Gotcha: A background job is not aware of session settings (say, $PSScriptRoot), so you need to utilize machine-wide variables, such as Env:TEMP.
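The usual workaround is to pass the values a job needs explicitly via -ArgumentList; a minimal sketch (variable names are mine, the path is illustrative):

```powershell
#Background jobs run in a separate process, so hand session state over as parameters.
$job = Start-Job -ScriptBlock {
    param($WorkDir)
    #The job receives the value as a plain parameter.
    $out = Join-Path $WorkDir 'job1out.txt'
    "writing to $out"
} -ArgumentList $PSScriptRoot

$null = Wait-Job $job
Receive-Job $job
Remove-Job $job
```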

Using named pipes (unfit for my particular use-case):

       
#Create 2 pipe servers; one for each background job.
$pipeH = new-object System.IO.Pipes.NamedPipeServerStream 'pipe1','In', 1, "Message"
$srH = new-object System.IO.StreamReader $pipeH
#--2nd pipe has 2-way communication
$pipeT = new-object System.IO.Pipes.NamedPipeServerStream 'pipe2','InOut', 1, "Message"
$srT = new-object System.IO.StreamReader $pipeT
$swT = new-object System.IO.StreamWriter $pipeT
#--
#Start Job1:
$job1 = Start-Job -ScriptBlock {
...
    #Permanent value.
    $pipeH1 = new-object System.IO.Pipes.NamedPipeClientStream '.', 'pipe1','Out'
    $pipeH1.Connect()
    $swH = new-object System.IO.StreamWriter $pipeH1
    $swH.AutoFlush = $true

    do {
        #Do the work, pipe output to StreamWriter:
...  
        $swH.WriteLine($ln0)
        $swH.WriteLine("")

}

#In main code, wait for traffic to start:
$pipeH.WaitForConnection()
cls
#Say we read 11 lines.
$l = $srH.ReadLine()
if ($l -ne $null) {
  $l
  for ($i = 1; $i -le 10; $i++)
  { 
    $l = $srH.ReadLine()
    if ($l -ne $null) {
      $l
    }      
  } 
}

#Start Job2:
$job2 = Start-Job -ScriptBlock {
    #Permanent values.
    $pipeT = new-object System.IO.Pipes.NamedPipeClientStream '.', 'pipe2','InOut'
    $pipeT.Connect()
    $srT = new-object System.IO.StreamReader $pipeT
    $swT = new-object System.IO.StreamWriter $pipeT
    $swT.AutoFlush = $true    #AutoFlush belongs to the writer, not the reader.

    do {
        #Do the work, pipe output to StreamWriter:
...  
            Select-Object -First 20 | FT * -Auto | Out-String | % { $swT.WriteLine($_) }  #1> redirects to files only, not to a StreamWriter.
...
            #Check communication:
            if (($tmp = $srT.ReadLine()) -ne $null) {...}
   }
}

#Finish setting things up:
$swT.AutoFlush = $true
$pipeT.WaitForConnection()
#Say we read 27 lines from 2nd job.
$l = $srT.ReadLine()
if ($l -ne $null) {
  $l
  for ($i = 1; $i -le 26; $i++)
  { 
    $l = $srT.ReadLine()
    if ($l -ne $null) {
      $l
    }      
  } 
}

#And we send some info to 2nd job:
$swT.WriteLine('Something')

#You can read in loop too but make sure to allow for flow control between processes:
while (($tmp= $srT.ReadLine()) -ne 'something') 
{
  ...
}

#Do the cleanup:
    Stop-Job -Job $job1
    $null = Receive-Job $job1
    Remove-Job $job1

    Stop-Job -Job $job2
    $null = Receive-Job $job2
    Remove-Job $job2

    $pipeH.Close()
    $pipeH.Dispose()
    $srH.Close()
    $srH.Dispose()
    $pipeT.Close()
    $pipeT.Dispose()
    $srT.Close()
    $srT.Dispose()
    $swT.Close()
    $swT.Dispose()

   
Gotcha: NamedPipeServer/ClientStream have many constructors. Please check Server and Client documentation.
Good read on the subject: Georg Begerow, MSDN.
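One way to satisfy the "unique pipe name" requirement from the cons list above is to generate the name at runtime and pass it to the job; a sketch (identifiers and timeout are mine):

```powershell
#Generate a unique pipe name and hand it to the job as a parameter.
$pipeName = "metrics_" + [guid]::NewGuid().ToString('N')
$server = New-Object System.IO.Pipes.NamedPipeServerStream $pipeName, 'In', 1, 'Message'

$job = Start-Job -ScriptBlock {
    param($Name)
    $client = New-Object System.IO.Pipes.NamedPipeClientStream '.', $Name, 'Out'
    $client.Connect(5000)   #Fail after 5 s instead of hanging forever.
    $sw = New-Object System.IO.StreamWriter $client
    $sw.AutoFlush = $true
    $sw.WriteLine('hello')
    $sw.Dispose(); $client.Dispose()
} -ArgumentList $pipeName
```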
Gotcha: You can work without streams entirely (i.e. with just the pipe object):
       
$enc = [system.Text.Encoding]::Default
$msg = $enc.GetBytes("Message");
$pipeSort.Write($msg, 0, $msg.Length)
$pipeSort.WaitForPipeDrain()
$pipeSort.Flush()
--
do
{
    $SortBy += [char]$pipeT.ReadByte()  #ReadByte() returns an int; cast it instead of calling ToChar().
}
while (!($pipeT.IsMessageComplete))
$pipeT.Flush()
   
Gotcha: I have found no way to make this work in my particular case (async is out of the question), i.e. Read/Peek calls always block, preventing repeated updates. I guess this could be worked around by disposing and re-creating the client pipe, but that is suboptimal.

Using custom (Engine)Events:

       
#Start Job1:
$job1 = Start-Job -ScriptBlock {
...
    Register-EngineEvent -SourceIdentifier Job1Message -Forward
    do {
        #Do the work, raise Event:
...  
        $null = New-Event -SourceIdentifier Job1Message -MessageData $your_result
    }

}

#Start Job2:
$job2 = Start-Job -ScriptBlock {
    Register-EngineEvent -SourceIdentifier Job2Message -Forward
    do {
        #Do the work, raise Event:
...  
        $null = New-Event -SourceIdentifier Job2Message -MessageData $your_result
    }
}

#In main code, I want to process results synchronously,
#thus the slightly unusual approach instead of just using an -Action {} scriptblock:
do {
    $OldErrorActionPreference = $ErrorActionPreference
    $ErrorActionPreference = "SilentlyContinue"
    $tmp = ''
    $tmp = (Get-Event -SourceIdentifier Job1Message).MessageData | Select -Last 1
    if ($tmp.Length) {
        #Do work with $tmp
    }
    $tmp = ''
    $tmp = (Get-Event -SourceIdentifier Job2Message).MessageData | Select -Last 1
    if ($tmp.Length) {
        #Do work with $tmp
    }
    $ErrorActionPreference = $OldErrorActionPreference
}

#and the cleanup:
    $OldErrorActionPreference = $ErrorActionPreference
    $ErrorActionPreference = "SilentlyContinue"
    Remove-Event -SourceIdentifier "Job1Message"
    Remove-Event -SourceIdentifier "Job2Message"
    $ErrorActionPreference = $OldErrorActionPreference

    $job1 | Stop-Job -PassThru| Remove-Job
    $job2 | Stop-Job -PassThru| Remove-Job

   
Gotcha: Unfortunately, I was unable to make an event propagate from the main script to a background job using the same approach, which makes me think there is more to learn here and/or there is a bug in PS v3 regarding event processing.
Gotcha: Register-EngineEvent ... -Action {} starts another (child) background job running in a different runspace, so the options for communication grow scarce.
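For completeness, the asynchronous -Action variant looks roughly like this; since the handler runs in its own runspace, it should only touch what arrives in $event (a sketch, my identifiers):

```powershell
#Handle forwarded job events asynchronously; $event carries the payload.
$sub = Register-EngineEvent -SourceIdentifier Job1Message -Action {
    #Runs in a separate runspace; rely only on $event / $eventArgs here.
    Write-Host ("Got: " + $event.MessageData)
}

#...later, tear the subscription down:
Unregister-Event -SourceIdentifier Job1Message
$sub | Remove-Job -Force   #Register-EngineEvent -Action returns a PSEventJob.
```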


Using UDP/IP:

Based on PowerTip.
       
#To be executed in each script/job:
function Send-Text($Text='Sample Text', $Port=2500) {
    $endpoint = New-Object System.Net.IPEndPoint ([IPAddress]::Loopback,$Port)
    $udpclient= New-Object System.Net.Sockets.UdpClient
    $bytes=[Text.Encoding]::ASCII.GetBytes($Text)
    $bytesSent=$udpclient.Send($bytes,$bytes.length,$endpoint)
    $udpclient.Close()
}

function Start-Listen($Port=2500) {
    $endpoint = New-Object System.Net.IPEndPoint ([IPAddress]::Any,$Port)
    $udpclient= New-Object System.Net.Sockets.UdpClient $Port
    $content=$udpclient.Receive([ref]$endpoint)
    [Text.Encoding]::ASCII.GetString($content)
} 

#To determine if a port is free (for UDP, use GetActiveUdpListeners() instead):
[System.Net.NetworkInformation.IPGlobalProperties]::GetIPGlobalProperties() | %{ $_.GetActiveTcpListeners() } | 
  Select Port -Unique | Where { $_.Port -eq $your_port }
#or
function IsLocalPortListening([int] $LPort)  #[int], not [int16]: ports go up to 65535.
{
 <#
 .SYNOPSIS
 Method to check if local port is available. This is used to determine free
    port for deployment.
 #>
    Try 
    {
        $connection = (New-Object Net.Sockets.TcpClient)
        $connection.Connect("127.0.0.1",$LPort)
        $connection.Close()
        $connection = $null
        return "Listening"
    } Catch {
        $connection.Close()
        $connection = $null
        return "Not listening"
    }
}
   
Gotcha: This approach is unfit for my use-case since I would need another delivery mechanism to establish the port to communicate on.
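If one did want to pursue this, the initial port negotiation could itself reuse the file approach from above: the listener binds to port 0 so the OS picks a free ephemeral port, publishes it in a well-known file, and the jobs read it before sending. A sketch (the file name is illustrative):

```powershell
#Listener side: bind to port 0 to let the OS pick a free port, then publish it.
$udp = New-Object System.Net.Sockets.UdpClient 0
$port = ([System.Net.IPEndPoint]$udp.Client.LocalEndPoint).Port
$port | Out-File (Join-Path $env:TEMP 'metrics_port.txt')

#Sender side (inside a job): read the advertised port, then call Send-Text.
$port = [int](Get-Content (Join-Path $env:TEMP 'metrics_port.txt'))
```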

Conclusion:

Given the task of communicating results from background jobs to the main script, the "writing to files" approach worked best: little overhead, a single mechanism, common cmdlets and so on.
The events approach appeared to work most smoothly, except that I was not able to get a job to process an event raised from the main script. Since there were no subscribers to the events, I am also concerned about the quantity of events generated (2/sec).
Named pipes proved too much for me and I never got them to work as expected, while UDP/IP would require another delivery mechanism to sort out the initial port settings.


Happy coding!

2015-10-22

Regular checks before running a PowerShell script + Write-Error replacement

On occasion, one will produce a script that will not work in the ISE or on some particular version of PowerShell. So, before allowing a script to run, I always do several checks depending on the task at hand. Here's the code:

       
#region Check
if ($host.Name -ne 'ConsoleHost')
{
    #Running in ISE
    $host.UI.WriteErrorLine("`t Script cannot be run in the ISE. Exiting.")
    Exit 1
}

if (($PSVersionTable).PSVersion.Major -lt 3) {
    $host.UI.WriteErrorLine("`t Script can not be run in PS 2. Exiting.")
    Exit 2
}

$clrV = 
  ((Get-ChildItem 'HKLM:\SOFTWARE\Microsoft\NET Framework Setup\NDP' -recurse |
  Get-ItemProperty -name Version,Release -EA 0 |
  Where { $_.PSChildName -match '^(?!S)\p{L}'} |
  Select Version | Sort Version -Desc | Select -First 1).Version).Split('.')[0]
if ([int]$clrV -lt 4) {  #Cast: Split() yields strings, and a string comparison would be lexicographic.
    $host.UI.WriteErrorLine("`t Script can not be run in .NET v"+$clrV+". Exiting.")
    Exit 3
}

Set-StrictMode -Version Latest
Set-PSDebug -strict

#endregion
       

Explanation:
  • Regions are great for increased readability, especially when a script is big.
  • There are differences between the PowerShell console and the ISE (see points 3 and 4 for example):
    PS C:\Users\user> $host.Name
    ConsoleHost
    PS:ISE [BOX]> $Host.Name
    Windows PowerShell ISE Host
  • I have PS v3 on my laptop and PS v4 on my lab servers, thus I am not coding (or testing) for older versions.
  • Getting the .NET version could have been much simpler if it weren't for the fact that [environment]::version will be deprecated soon:
    PS C:\Users\user> [environment]::version
    Major  Minor  Build  Revision
    -----  -----  -----  --------
    4      0      30319  34209
  • Set-PSDebug -strict is there so that engine can throw an exception if a variable is referenced before being assigned a value.
  • Using $host.UI.WriteErrorLine produces much cleaner output, IMO, than Write-Error.
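To see the difference, compare the two side by side: Write-Error wraps the message in a full error record (invocation name, category info), while WriteErrorLine emits just the text.

```powershell
#Write-Error: prints the message plus the error-record noise
#(invocation info, CategoryInfo, FullyQualifiedErrorId).
Write-Error "Something failed"

#WriteErrorLine: just the red text, nothing else.
$host.UI.WriteErrorLine("Something failed")
```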

Happy coding!

2015-10-19

Handling keyboard input and CTRL+C in PowerShell without pausing

Recently, I had a requirement to update my script's console output depending on user key-presses. Since the calculation was done by background threads, I also wanted to prevent CTRL+C from stopping the script without proper cleanup.
However, it looks like seamless key-press handling, along with handling special sequences, is not available in PowerShell. This was to be expected, given that [system.console]::readkey is designed for accepting user input, mostly answers to flow-control questions. Register-EngineEvent does not help either.

Anyway, let's tackle both problems one by one:
1) Disable CTRL+C from stopping the script:
       
            [console]::TreatControlCAsInput = $true
       
Note that CTRL+BREAK will end the session entirely and do the proper cleanup thus it's not a problem.
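Note that once TreatControlCAsInput is set, the .NET console API itself does report CTRL+C as an ordinary keystroke; whether the host's $Host.UI.RawUI.ReadKey surfaces it is host-dependent and worth verifying. A sketch using the .NET API directly:

```powershell
#With [console]::TreatControlCAsInput = $true, CTRL+C arrives as a normal ConsoleKeyInfo.
if ([console]::KeyAvailable) {
    $ki = [console]::ReadKey($true)
    if ($ki.Key -eq [ConsoleKey]::C -and
        ($ki.Modifiers -band [ConsoleModifiers]::Control)) {
        #Do the cleanup here, then exit the loop.
    }
}
```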

2) Handle the user key-press:
       
if ($Host.UI.RawUI.KeyAvailable) { #Make sure there is something to handle.
    #$k = [system.console]::readkey($true) - this will wait for ENTER after any key-press, thus unacceptable.
    $k = $Host.UI.RawUI.ReadKey("AllowCtrlC,IncludeKeyDown,IncludeKeyUp,NoEcho").Character
    if ("p" -eq $k) {
        #Update the displaying
        $HOST.UI.RawUI.FlushInputBuffer() #Flush the key buffer
    } elseif ("r" -eq $k) {
        #Update the displaying
        $HOST.UI.RawUI.FlushInputBuffer()
    } elseif ("m" -eq $k) {
        #Update the displaying
        $HOST.UI.RawUI.FlushInputBuffer()
    } elseif ("n" -eq $k) {
        #Update the displaying
        $HOST.UI.RawUI.FlushInputBuffer()
    } elseif ("q" -eq $k) { #My wish was to map CTRL+C here, but I found no way.
        #Do the cleanup
        Write-Host "Exiting...." -Background DarkRed
        Stop-Job -Job $j
        $null = Receive-Job $j
        Remove-Job $j
        ...
        break
    }
}
       
The problem with a proper solution is that one has to register one's own functions and callbacks (System.Console.CancelKeyPress, ConsoleCancelEventHandler, setting an application-defined HandlerRoutine, installing a control handler, ...), which was just too much for the scope of my task, especially since a library implementing the desired behavior already exists (PSEventing).

I do feel PowerShell should address this sometime in the future through the Register-Event cmdlets.