Using Workflow

This topic contains 13 replies, has 4 voices, and was last updated by Chris Lewis 3 years, 1 month ago.

  • #12081
    Patrick Metcalfe
    Participant

    Trying to write a workflow to ping servers from a text file.
    The syntax is fine, I think.

    Using a function, it takes 30 minutes to ping 4,000 servers (timed with the Measure-Command cmdlet);
    using a workflow, it takes only 1 second to run.

    It seems the foreach loop is not running at all.

    How do I know which servers pinged correctly and which did not,
    since you cannot use Write-Host in a workflow?
    PowerShell version is 4.0, on Windows Server 2008 SP2.
    Here is the script:

    $Servers = Get-Content D:\PowerShell\Servers.txt
    workflow WFConnection {
        foreach -parallel ($Server in $Servers) {
            Test-Connection -ComputerName $Server -Count 1 -ErrorAction SilentlyContinue
        }
    }
    WFConnection

  • #12082
    Don Jones
    Keymaster

    You've got a scope problem. $servers is created outside the workflow. While that would work in a normal PowerShell script, it doesn't work with a workflow. The Windows Workflow Foundation (WF) engine only gets what's inside the workflow construct. You should parameterize the workflow and pass in the computer names as a parameter. As-is, $servers inside the workflow is empty, so the foreach doesn't execute.

    You're also not really capturing your output (from Test-Connection) or doing anything with it. Not sure if that was your intent or not. If you want to keep track of which servers respond to a ping, you'd need to put that output someplace – a file maybe, but it's up to you. I might also use the -quiet switch of Test-Connection, since that just returns a true/false. If you get true, write the server name to one file. If you get false, write to another file. Or whatever you need.
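    A rough sketch of what that might look like, combining both suggestions (untested; the file name is just an example, and this only captures the servers that respond; non-responders could be handled the same way):

    workflow WFConnection {
        param([string[]] $ComputerName)
        foreach -parallel ($Server in $ComputerName) {
            # -Quiet returns $true/$false; emit the server name so the caller can capture it
            $up = Test-Connection -ComputerName $Server -Count 1 -Quiet
            if ($up) { $Server }
        }
    }

    $Servers = Get-Content D:\PowerShell\Servers.txt
    WFConnection -ComputerName $Servers | Out-File D:\up.txt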

  • #12151
    Patrick Metcalfe
    Participant

    Ah OK, thanks Don.

    Now how do I write to a file within the workflow?
    I can't find how to do this in any of my books (I have many, including yours) or on the internet.

    I tried

    {inlinescript write-host|out-file $Server d:\up.txt -append}

    but I get errors.

    While I'm here, are there any PowerShell 4.0 books in the pipeline?

    Thanks again

    Patrick

  • #12152
    Don Jones
    Keymaster

    It'd be nice to know what the errors are ;). In general it's fine to write to a file from a workflow, except that (a) you can't redirect Write-Host like you're doing (that's why we always tell people not to use Write-Host), and (b) you have to pay attention to the paths and permissions, since this is being run by WF.
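
    For reference, a cleaned-up version of what you were attempting might look something like this (a sketch only; the path is just an example, the $Using: prefix is needed to reach workflow variables from inside InlineScript, and the simultaneous-file-access caveat discussed later in this thread still applies):

    workflow WFConnection {
        param([string[]] $ComputerName)
        foreach -parallel ($Server in $ComputerName) {
            $up = Test-Connection -ComputerName $Server -Count 1 -Quiet
            if ($up) {
                InlineScript {
                    # $Using: pulls the workflow's $Server variable into the InlineScript scope
                    $Using:Server | Out-File -FilePath D:\up.txt -Append
                }
            }
        }
    }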

    Stop using Write-Host. Always.

    We're updating PowerShell In Depth for 4.0, and I may do a DSC book with Steven Murawski, but nothing's firm on that.

  • #12153
    Patrick Metcalfe
    Participant

    Yes, I agree; I just read your article on Write-Host.
    Am I right to assume that Write-Verbose would work?

    Patrick

  • #12154
    Don Jones
    Keymaster

    Nope.

    When you use the pipeline character, you're only able to pass along what's in the "main" pipeline in PowerShell.

    Write-Host "This" | Out-File nothing.txt
    Write-Output "That" | out-File something.txt
    

    The second will work. Write-Verbose doesn't place information into the main pipeline; it, along with Write-Warning, Write-Debug, and Write-Error, has its own pipeline. There's a syntax that lets you switch info from one pipeline to another, but in this case, Write-Output is what you want. You can also just do

    "Message" | Out-File works.txt
    

    Since Write-Output is the default cmdlet, that syntax will put "Message" into works.txt.
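
    For completeness, the pipeline-switching syntax looks something like this (PowerShell 3.0 and later stream redirection; the file name is just an example):

    # 4> redirects the verbose stream; 4>&1 merges it into the main pipeline
    Write-Verbose "Server pinged OK" -Verbose 4>&1 | Out-File example.txt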

  • #12155
    Patrick Metcalfe
    Participant

    OK, thanks Don.

    I will give that a go.
    Happy Christmas

  • #12206
    Patrick Metcalfe
    Participant

    It writes to the file with the -Append switch, but I get the error below, even though the list of servers is in the file up.txt. Could it be the throttle limit?

    Microsoft.PowerShell.Utility\Write-Error : The process cannot access the file 'D:\up.txt' because it is being used by another process.

    Patrick

  • #12207
    Don Jones
    Keymaster

    No, you're dealing with simultaneous file access. Remember, you've got multiple copies of this all running at once. They can't all write to the same file at the same time. That's a basic tenet of the Windows file system. Two processes can't both access the same file object at once, unless they take special precautions (which, for writing to a simple text file, isn't an option).

  • #12208
    Don Jones
    Keymaster

    This is, by the way, one of the more common things you run across when you start parallelizing tasks – a better approach would perhaps be to log the information to a SQL Express database, since SQL itself supports multi-user access. However, that's a lot of extra work. That's the thing with Workflow, though – you start to get into "a lot of extra work" pretty quickly because of the underlying complexities of parallel processing.

    You could also have each workflow instance log to a separate file, wait until they're done, and then concatenate them all into a single file. Again, more work.
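
    A rough sketch of that per-instance-file approach (untested; the paths are just examples, and the D:\pinglogs folder is assumed to exist already):

    workflow WFConnection {
        param([string[]] $ComputerName)
        foreach -parallel ($Server in $ComputerName) {
            $up = Test-Connection -ComputerName $Server -Count 1 -Quiet
            # each iteration writes to its own file, so nothing fights over a shared handle
            $path = "D:\pinglogs\$Server.txt"
            "$Server,$up" | Out-File -FilePath $path
        }
    }

    WFConnection -ComputerName (Get-Content D:\PowerShell\Servers.txt)
    # once the workflow finishes, roll the per-server files up into one
    Get-Content -Path D:\pinglogs\*.txt | Out-File D:\pingresults.txt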

  • #12241
    Joakim Svendsen
    Participant

    I figure I'll contribute some actual, working code (unless I misunderstood the requirements). I had been putting off learning workflows for a good while, but have started looking into them now. I started by reading Richard Siddaway's article on TechNet about the basics:
    http://blogs.technet.com/b/heyscriptingguy/archive/2012/12/26/powershell-workflows-the-basics.aspx

    After quite a bit of experimentation, failure, and learning, I cooked up this, which works:

    Workflow Test-MultiConnection {
        
        param([string[]] $ComputerName)
        
        foreach -Parallel ($Computer in $ComputerName) {
            $Obj = New-Object -Type PSObject -Property @{
                ComputerName = $Computer
                Online       = Test-Connection -ComputerName $Computer -Count 1 -Quiet
            }
            $Obj
        }
    
    }
    
    $ComputerName = Get-ADComputer -Filter * | Select -ExpandProperty Name
    Test-MultiConnection $ComputerName | Select -Property ComputerName, Online | Format-Table -AutoSize
    

    In my lab environment, these are the results:

    PS C:\temp> .\Test-MultiConnection.ps1
    
    ComputerName  Online
    ------------  ------
    WIN2012R2       True
    WIN8ESXI        True
    SERVER2012      True
    WIN8VM         False
    SERVER2012RC   False
    SS-WIN7         True
    VMWAREWIN7     False
    WINXPSSD        True
    esxi           False
    SERVER2008      True
    XPTANKET       False
    SERVER2003     False
    VISTA64ESXI    False
    WIN2K           True
    WINXPESXI      False
    WIN7ESXI        True
    SIEMENS        False
    SRV2003R2ESXI   True
    2008R2ESXI      True
    2008R2ESXI2     True
    

    If you remove Format-Table and instead pipe to Export-Csv, you can easily create a CSV file with the results, or do whatever else you want. You can also toss in a Sort-Object on the ComputerName or Online value. If you assign the results to a variable and process it with Where-Object (twice is probably easiest, and shouldn't be much overhead), you can also create one file for offline computers and one for online computers, by filtering on whether the Online property is True or False.
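
    For example (using the workflow above; the file names are just placeholders):

    $Results = Test-MultiConnection $ComputerName
    $Results | Export-Csv -Path .\PingResults.csv -NoTypeInformation
    $Results | Where-Object { $_.Online } | Select-Object -ExpandProperty ComputerName | Out-File .\Online.txt
    $Results | Where-Object { -not $_.Online } | Select-Object -ExpandProperty ComputerName | Out-File .\Offline.txt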

    • #15225
      Chris Lewis
      Participant

      This works nicely on 400 or so computer objects:
      move Test-Connection outside of the object creation and put the commands in a sequence block.

      Workflow Test-MultiConnection {
          param([string[]] $ComputerName)
          foreach -Parallel ($Computer in $ComputerName) {
              sequence {
                  $online = Test-Connection -ComputerName $Computer -Count 1 -Quiet
                  $Obj = New-Object -Type PSObject -Property @{
                      ComputerName = $Computer
                      Online       = $online
                  }
                  $Obj
              }
          }
      }
      Test-MultiConnection $Computers | Select -Property ComputerName, Online | Format-Table -AutoSize

  • #12256
    Joakim Svendsen
    Participant

    I can't seem to edit the last post anymore, so I'm just throwing in a link to a tiny article about the Split-Collection cmdlet in a new comment:
    http://www.powershelladmin.com/wiki/PowerShell_Cmdlet_for_Bulk_Processing_of_an_Array

  • #12244
    Joakim Svendsen
    Participant

    Experimenting some more, I came across a scenario where, when the number of computers increases from 18 to over 2,000, I got the following error:

    The operation did not complete within the allotted timeout of 00:00:30. The time allotted to this operation may have
    been a portion of a longer timeout.
    At Test-MultiConnection:10 char:10

    I searched the web and found this article: http://social.msdn.microsoft.com/Forums/vstudio/en-US/efc96848-9d9d-46e6-9c08-179ff5747a4d/workflowservicehost-limitation-in-the-number-of-running-instance?forum=wfprerelease

    I also found a reference to a no-longer-existing article by someone who wrote a "Foreach-Parallel" cmdlet. I think the correct way to deal with this is via the New-PSWorkflowSession cmdlet where you can customize throttle limit, concurrent sessions, etc.

    I couldn't figure out how to apply the settings to a specific workflow (maybe it affects the current PS session altogether?), so I wrote a little function/cmdlet that I've found myself needing from time to time. This won't perform well on large collections due to, among other things, array concatenation, but should be fine for many purposes. I probably should have figured out the correct way, but I'm posting it anyway. 😐

    I called it Split-Collection and the code looks like this:

    function Split-Collection {
        [CmdletBinding()]
        param(
            [Parameter(ValueFromPipeline=$true)] $Collection,
            [Parameter(Mandatory=$true)][ValidateRange(1, 2147483647)][int] $ChunkSize
        )
        
        begin {
            $Ctr = 0
            $Arrays = @()
            $TempArray = @()
        }
        
        process {
            # when a full chunk has been collected, emit it as a single array element and start over
            if (++$Ctr -eq $ChunkSize) {
                $Ctr = 0
                $Arrays += , @($TempArray + $_)
                $TempArray = @()
                return
            }
            
            $TempArray += $_
        }
        
        end {
            # emit whatever is left over as the final (possibly smaller) chunk
            if ($TempArray) { $Arrays += , $TempArray }
            $Arrays
        }
    }
    

    Its behavior can be demonstrated like this:

    PS C:\PowerShell> 1..10 | Split-Collection -ChunkSize 2 | %{ $_ -join ', ' }
    1, 2
    3, 4
    5, 6
    7, 8
    9, 10
    PS C:\PowerShell> 1..10 | Split-Collection -ChunkSize 3 | %{ $_ -join ', ' }
    1, 2, 3
    4, 5, 6
    7, 8, 9
    10
    PS C:\PowerShell> 1..10 | Split-Collection -ChunkSize 5 | %{ $_ -join ', ' }
    1, 2, 3, 4, 5
    6, 7, 8, 9, 10
    PS C:\PowerShell>
    

    Then I did this, processing 50 at a time, which, as intended, seems to work around the issue:

    PS C:\PowerShell> $Results = $Computers[0..300] | Split-Collection -ChunkSize 50 | %{ Test-MultiConnection $_ }
    PS C:\PowerShell> $Results.Count
    301
    PS C:\PowerShell> $Results | select -first 3 | ft -a computername, online
    
    ComputerName Online
    ------------ ------
    comp1       False
    comp2       True
    comp3       False
    

    Shrug. If someone can fill in the blanks on the throttling/session limits, it'd be nice.
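
    For what it's worth, PowerShell 4.0 added a ThrottleLimit option to foreach -parallel, which may be a simpler way to cap how many pings run at once. An untested sketch, reusing the workflow from earlier in the thread:

    workflow Test-MultiConnection {
        param([string[]] $ComputerName)
        # -throttlelimit caps how many iterations run at the same time (PowerShell 4.0 and later)
        foreach -parallel -throttlelimit 50 ($Computer in $ComputerName) {
            sequence {
                $online = Test-Connection -ComputerName $Computer -Count 1 -Quiet
                New-Object -Type PSObject -Property @{
                    ComputerName = $Computer
                    Online       = $online
                }
            }
        }
    }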
