Invoke-Command vs

This topic contains 3 replies, has 2 voices, and was last updated by Rodney Stewart 1 month ago.

Viewing 4 posts - 1 through 4 (of 4 total)
  • #51771
    Aaron Hardy
    Participant

    Hello community,

    To leverage parallelism, I am using Invoke-Command -FilePath abc.ps1 to ship several 'Get-' functions to remote computers, which run them locally; the output is then shipped back (deserialized objects are fine).

    Now I would like to run the individual functions from my local computer instead. That points me toward a foreach loop and/or using -ComputerName on the 'Get-' cmdlets, which breaks the parallelism and slows down processing time.

    Any suggestions to how I might accomplish this? Jobs?

    Thanks,
    Aaron

    #51776
    Rodney Stewart
    Participant

    Hi Aaron, can you post a code sample of what you're attempting so I can understand the question a bit better?

    #51785
    Aaron Hardy
    Participant

    Sure, Rodney.

    This is my current approach:

    $RemoteSession = New-PSSession -ComputerName comp1,comp2,comp3,comp4,comp5 -Credential $Credential
    
    # functions.ps1 contains functions like Get-Service, Get-Process,
    # Get-CustomFunction1, Get-CustomFunction2, etc. that execute
    # locally on the remote computer. Output is exported to a local
    # path of the remote computer (D:\share).
    
    Invoke-Command -Session $RemoteSession -FilePath X:\Scripts\functions.ps1 
    
    # Then run Import-Clixml manually to display the results on the local computer.
    
    Import-CliXML \\RemoteComputer\share\abc.xml
    

    Rather than using "Invoke-Command -FilePath functions.ps1" to run everything on the remote computers, the functions would have to be declared in my local session. That requires using -ComputerName or -CimSession on those cmdlets/functions, which processes only one remote computer at a time. That "one-at-a-time" behavior is what I am trying to circumvent, if possible.
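    In other words, the local approach would look something like this (with Get-CustomFunction1 standing in for any of the functions, assuming it supports a -ComputerName parameter):

    ```powershell
    $Computers = 'comp1','comp2','comp3','comp4','comp5'

    # Each iteration waits for the previous computer to finish,
    # so total runtime is roughly the sum of the per-computer times.
    foreach ($Computer in $Computers) {
        Get-CustomFunction1 -ComputerName $Computer
    }
    ```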

    Hope this makes things clearer.
    Aaron

    #51788
    Rodney Stewart
    Participant

    I believe the only way to avoid this is to use PowerShell jobs. Unless some of these commands are extremely long-running, though, I'm not sure how much of a speed benefit you'll see versus the pain of implementation. Something like the code below should work. I'm just getting started with jobs myself, so there may be a better way to do this.
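    A rough sketch with Start-Job, one background job per computer (the computer names, $Credential, and Get-CustomFunction1 are placeholders carried over from your earlier posts):

    ```powershell
    $Computers = 'comp1','comp2','comp3','comp4','comp5'

    # Start one background job per computer so they all run concurrently.
    $Jobs = foreach ($Computer in $Computers) {
        Start-Job -Name $Computer -ScriptBlock {
            param($ComputerName, $Credential)

            # Dot-source the functions into the job's own session,
            # then run them against the target computer.
            . X:\Scripts\functions.ps1
            Get-CustomFunction1 -ComputerName $ComputerName
        } -ArgumentList $Computer, $Credential
    }

    # Wait for all jobs to finish, collect the output, then clean up.
    $Results = $Jobs | Wait-Job | Receive-Job
    Remove-Job -Job $Jobs
    ```

    Since the jobs run in parallel, the total runtime should be close to the slowest single computer rather than the sum of all of them.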
