Accepting pipeline input in advanced functions

This topic contains 0 replies, has 1 voice, and was last updated by Forums Archives 5 years, 6 months ago.


    by Lembasts at 2013-01-16 16:31:00

    I am trying to do the right thing by following the golden rule: pipeline in, pipeline out.
    I have a function which writes objects with several properties out to the pipeline: computername, lastlogon and lastpwdset.
    I want to code another function to accept and process each of those objects as pipeline input.
    From my research it appears I can do this with a process block and refer to $_.computername, $_.lastlogon etc., and the process block will run for each object instance?
    There is also a mention of doing this in the parameter block with ValueFromPipeline, but that only works if there is one property?
    The problem is that I need to access the objects as an array, i.e. using an index like [$idx] to refer to each object in the pipeline, so using the process block and $_ doesn't appear to be the solution.
    Any ideas please?

    by nohandle at 2013-01-17 00:22:31

    Hi,
    There is a better way than $_. You can define a parameter that takes all input from the pipeline by specifying the ValueFromPipeline parameter option. Or a set of parameters that are bound from same-named properties on the input object (Path -> Path, Name -> Name) using the ValueFromPipelineByPropertyName parameter option.

    Using the process block will take each object in the pipeline and process it.

    I'll show you some examples when I get to PC.

    by nohandle at 2013-01-17 02:42:47

    function foo {
        param (
            [Parameter(ValueFromPipelineByPropertyName=$true)]
            $name,
            [Parameter(ValueFromPipelineByPropertyName=$true)]
            $id
        )
        process {
            "name: $name, id: $id"
        }
    }
    Get-Process | foo

    by Lembasts at 2013-01-17 03:02:05

    The problem for my script is that I cannot process each incoming piped object one by one. I need to be able to refer to the pipeline input object instance via an array index.
    At present my solution is for the first script to output the data to a csv file and the second to input a csv file.

    by nohandle at 2013-01-17 04:03:52

    Can you show an example? I am not sure I get what you mean.
    You accept an array object on the input and want to be able to process every item in it?
    Similar to this?
    $arrayOfArrays = [char[]]'thisisarrayofchars', [char[]]'andthisisanother'
    $arrayOfArrays |
        foreach {
            $currentObject = $_

            # Write out the current object on one line
            "$currentObject"

            # Write out each item in the array
            $currentObject | foreach { $_ }
        }

    by RichardSiddaway at 2013-01-19 08:38:32

    Not sure why you need to use an index, but if I've understood your requirements you want something like this. I'm working on disk information and the example is forced, but it does illustrate the point.

    Start with a function that outputs objects containing disk information
    function get-mydisk {
        [CmdletBinding()]
        param (
            [string]$computername = "$env:COMPUTERNAME"
        )
        BEGIN {} #begin
        PROCESS {
            Get-WmiObject -Class Win32_LogicalDisk -ComputerName $computername |
                foreach {
                    New-Object -TypeName PSObject -Property @{
                        Disk = $_.DeviceID
                        Free = $_.FreeSpace
                        Size = $_.Size
                    }
                }
        } #process
        END {} #end
    }

    Get a computer name as a parameter. Use WMI to get the logical disk info and output the disk name (DeviceID), free space and size as an object. This puts a set of objects on the pipeline.

    You do this
    get-mydisk | where Size -gt 0

    and it works

    Now you want to pipe that information to another function that will calculate the percentage of free space

    function get-freeperc {
        [CmdletBinding()]
        param (
            [Parameter(ValueFromPipeline=$true)]
            [Object[]]$disklist
        )
        BEGIN {} #begin
        PROCESS {
            foreach ($disk in $disklist) {
                if ($disk.Size -gt 0) {
                    $disk | Select Disk,
                        @{N="Size(GB)"; E={[math]::Round( ($($_.Size)/1GB), 2 )}},
                        @{N="FreePerc(GB)"; E={[math]::Round( ($($_.Free)/1GB), 2 )}}
                }
            }
        } #process
        END {} #end
    }

    The important points:
    – set the object type
    – set it as an array coming in
    – set ValueFromPipeline = $true
    – use a PROCESS block
    – use a foreach block within the PROCESS block

    get-mydisk | get-freeperc

    Enjoy

    by RichardSiddaway at 2013-01-19 09:03:24

    Oops – made a mistake in the second function
    function get-freeperc {
        [CmdletBinding()]
        param (
            [Parameter(ValueFromPipeline=$true)]
            [Object[]]$disklist
        )
        BEGIN {} #begin
        PROCESS {
            foreach ($disk in $disklist) {
                if ($disk.Size -gt 0) {
                    $disk | Select Disk,
                        @{N="Size(GB)"; E={[math]::Round( ($($_.Size)/1GB), 2 )}},
                        @{N="FreePerc"; E={[math]::Round( ($($_.Free) / $($_.Size))*100, 2 )}}
                }
            }
        } #process
        END {} #end
    }

    by Lembasts at 2013-01-20 16:51:38

    Thanks – I'm on holidays now so will respond in more detail later.
    As I mentioned, I have one function that outputs a csv file and a second that reads the csv file. I want to do this via a pipeline. I have changed the first function to output to a pipeline. I'm just trying to work out how to code the second function to read from the pipeline. The trick is the index. My second function is sort of like the following:

    # Import the csv
    $infile = Import-Csv c:\compdata.csv
    # Set the initial index
    $idx = 0
    # Access the computer name property of the desired object instance
    $infile[$idx].computername
    # And at various stages through a loop I might do
    $idx++
    # or
    $idx--

    So I am not processing the incoming pipeline object instances one by one. I need to be able to traverse them as required.
    I don't think this can be done via traditional pipeline input and a process block.

    by nohandle at 2013-01-21 00:23:28

    OK, if you do it like you describe, you have an array of objects in the $infile variable. Each of these objects is one row of the file and has properties set based on the header of each column.
    As you correctly described, you can get any of the objects in the array (collection) by specifying its index. To iterate through them use a 'for' loop. It gives you control over the index – you can increase it or decrease it, and even control how big the steps are.
    Moving to a function, you can populate the array using pipeline input; you just need to collect the data in the process block and loop over them in the end block. Please be aware that this approach slows down the processing if more commands follow in the pipeline, because they have to wait until your command finishes. If your command is not followed by anything and the result is output to the screen, it gives the user the impression the command is slower, because the data are shown only after everything is done, not during processing.

    I am on mobile now, example will follow up when I get to PC.

    by nohandle at 2013-01-21 01:02:24

    function Get-NameId246642 {
        param (
            [Parameter(Mandatory=$true, ValueFromPipeline=$true)]
            $InputObject
        )
        begin {
            $inputData = @()
        }
        process {
            $inputData += $InputObject
        }
        end {
            for ($index = 2; $index -le 6; $index += 2) {
                $inputData[$index] | select Name, Id
            }

            for ($index = 6; $index -ge 2; $index -= 2) {
                $inputData[$index] | select Name, Id
            }
        }
    }
    # Get me some data in csv format
    $processCsv = Get-Process | Select-Object -First 10 | ConvertTo-Csv

    # "Import" csv
    $process = $processCsv | ConvertFrom-Csv

    $process | Get-NameId246642

    by Lembasts at 2013-01-28 14:50:41

    Thanks for all your help. I had a long think about my functions and I have found a way to process each instance only once through the PROCESS block rather than going back and forth using indexes. I think it'll be easier to understand....

    by nohandle at 2013-01-29 03:16:50

    Great, would you mind sharing the solution for others to see?

    If the issue is solved please mark the topic as Solved using the Solved button.

    by Lembasts at 2013-01-29 03:52:22

    I've only just worked out that the green tick means "solved". I was looking for a button that had the text "solved" on it.
    My solution is lengthy and varied.
    What I have is a "database", which is a csv file. I have a number of "transaction files" with different information that update my csv database.

    by Lembasts at 2013-01-29 13:35:02

    As an example, in reading from the pipeline, one of my parameters is as follows:

    [Parameter(
        ValueFromPipelineByPropertyName=$true)]
    $ComputerName,

    And as kindly mentioned earlier, I can just refer to this property as $ComputerName in the PROCESS block for each instance.
    However, looking at some examples on the web, it appears that the above code would not work – I need to add ValueFromPipeline as well? e.g.

    [Parameter(
        ValueFromPipeline=$true,
        ValueFromPipelineByPropertyName=$true)]
    $ComputerName,
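    For reference, ValueFromPipelineByPropertyName alone is enough when the incoming objects expose a property with a matching name – ValueFromPipeline is not required for property binding. A minimal sketch (the function name is made up for illustration):

    ```powershell
    function Test-PipelineByName {
        param (
            [Parameter(ValueFromPipelineByPropertyName=$true)]
            $ComputerName
        )
        process {
            "Processing $ComputerName"
        }
    }

    # Any object with a ComputerName property binds by name alone:
    New-Object PSObject -Property @{ ComputerName = 'SERVER01' } | Test-PipelineByName
    ```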

    by Lembasts at 2013-01-29 13:56:23

    And then there's this I found:
    https://connect.microsoft.com/PowerShell/feedback/details/498238/function-parameter-attribute-valuefrompipelinebypropertyname-overridden-by-valuefrompipeline
    which seems to suggest that I don't need ValueFromPipeline, but that I need to type my property thus:

    [Parameter(
        ValueFromPipelineByPropertyName=$true)]
    [string]$ComputerName,

    And if a property comes in as a string but is actually a date, can I type it as a date or does it have to be a string?
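    For what it's worth, the parameter binder will coerce a parseable string into a typed parameter, so declaring the parameter as [datetime] generally works when the incoming strings are valid dates; if a string cannot be parsed, binding fails with an error. A quick sketch (function and property names are illustrative):

    ```powershell
    function Show-LogonDate {
        param (
            [Parameter(ValueFromPipelineByPropertyName=$true)]
            [datetime]$LastLogon
        )
        process {
            # $LastLogon was converted from string to DateTime during binding
            $LastLogon.GetType().Name
        }
    }

    New-Object PSObject -Property @{ LastLogon = '2013-01-29 13:56' } | Show-LogonDate
    ```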
