Begin, process, end processing order

This topic contains 0 replies, has 1 voice, and was last updated by Forums Archives 5 years, 4 months ago.


    by Lembasts at 2013-01-30 14:46:17

    Greetings,
    I noticed a funny thing the other day.
    I had just finished pipelining one function into another and it works well.
    For brevity let's say the function calls were as follows:
    get-ad | update-db

    Get-ad extracts information from active directory and outputs the results to the pipeline. It only has a Process block (no begin or end).
    Update-db reads the info directly from the pipeline. It has begin, process and end blocks.
    What I noticed was that the begin block in update-db was run before the process block in get-ad. I'm guessing that this is for efficiency. IOW, it thinks it has to get update-db ready for processing so that pipeline output from get-ad can be processed as it happens.
    I'm also guessing that if I change the process block in get-ad to a begin block, then the get-ad begin block will run before the update-db begin block and I'll lose the efficiency, i.e. update-db will have to wait until all objects have been output from get-ad.

    by nohandle at 2013-01-31 01:38:01

    Hi, your observations are valid, although I think running the begin block of every command in the pipeline (before any process block) is linked to correct processing of the pipeline rather than to efficiency.
    How the blocks behave under different conditions is AFAIK described in the PowerShell Cookbook (second edition); unfortunately I don't have it at hand to point you to the right chapter in case you own the book.

    Here are two quick examples:
    function one
    {
        begin {
            write-host 'one begins'
        }
        process {
            write-host 'one process'
            'output'
        }
        end {
            write-host 'one end'
        }
    }

    function two
    {
        begin {
            write-host 'two begins'
        }
        process {
            write-host 'two process'
            $_+'-two'
        }
        end {
            write-host 'two end'
        }
    }

    function three
    {
        begin {
            write-host 'three begins'
        }
        process {
            write-host 'three process'
            $_+'-three'
        }
        end {
            write-host 'three end'
        }
    }

    cls
    one | two | three

    one begins
    two begins
    three begins
    one process
    two process
    three process
    output-two-three
    one end
    two end
    three end

    Example 2:
    function one
    {
        process {
            write-host 'one process'
            'output'
        }
        end {
            write-host 'one end'
        }
    }

    function two
    {
        process {
            write-host 'two process'
            $_+'-two'
        }
    }

    function three
    {
        begin {
            write-host 'three begins'
        }
        process {
            write-host 'three process'
            $_+'-three'
        }
    }

    cls
    one | two | three

    three begins
    one process
    two process
    three process
    output-two-three
    one end

    But the whole process is not as simple as all begins, then all processes, then all ends. Experiment! :)

    by Lembasts at 2013-01-31 12:52:57

    Interesting. I guess the key point is that the process blocks, in however many commands, will always run in order (i.e. one process, two process, three process).

    by nohandle at 2013-01-31 23:41:01

    Not necessarily: if the first command doesn't emit anything to the pipeline, the other process blocks won't run.
    Also, if the first command consists of an end block only (the default if you do not specify any of the blocks), the processing goes through the begins, then the end block of the first command, then through the process blocks of the following commands.
    Usage of the $input variable also changes the behavior substantially, if I recall correctly.
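
    To show the default 'end' block point, here is a quick sketch (the function names are my own):

    ```powershell
    function gen
    {
        # no named blocks: the whole body runs as an implicit 'end' block
        write-host 'gen runs'
        1..3
    }

    function show
    {
        begin {
            write-host 'show begins'
        }
        process {
            write-host "show process $_"
        }
    }

    gen | show
    ```

    show begins
    gen runs
    show process 1
    show process 2
    show process 3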

    by Lembasts at 2013-02-01 00:28:45

    This is one weird "process". I have to look into this $input variable...

    by sstranger at 2013-02-01 12:57:43

    Why don't you use Filter instead of Function? $_ elements are processed right away.

    by Lembasts at 2013-02-04 14:20:44

    [quote="sstranger"]Why don't you use Filter instead of Function? $_ elements are processed right away.[/quote]

    Not quite sure what you mean..sorry...

    by nohandle at 2013-02-04 23:42:10

    Filter is an 'abandoned' concept that basically creates a named ForEach-Object or, related to our discussion, a function that has only a process block. Whether or not to use it is not, at least from my point of view, the topic of this discussion.
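
    For completeness, a quick sketch of that equivalence (the function names are my own):

    ```powershell
    # a filter...
    filter Get-Name { $_.Name }

    # ...is shorthand for a function whose whole body is a process block
    function Get-NameFn
    {
        process { $_.Name }
    }

    get-process | Get-Name
    ```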

    by Lembasts at 2013-02-06 16:49:11

    I just looked into the $input variable for functions.
    It appears one has the choice to use a process block, in which case $input is not available, or to enumerate $input without a process block.
    Why would one use one technique over the other? At first glance they appear to do the same thing, e.g.

    $input
    function temp {
        foreach ($process in $input) {
            $process.name
        }
    }
    get-process | temp

    Process
    function temp {
        process {
            $_.name
        }
    }
    get-process | temp
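
    Tracing when each object is handled does show a difference, though; a sketch (the function names are my own): a plain function body runs as an implicit end block, so $input is only enumerated after the upstream command has finished, while a process block handles each object as it arrives.

    ```powershell
    function Emit
    {
        process {
            write-host "emitting $_"
            $_
        }
    }

    function Collected
    {
        # implicit 'end' block: $input is enumerated only after Emit has finished
        foreach ($item in $input) { "collected: $item" }
    }

    function Streamed
    {
        process { "streamed: $_" }
    }

    1..2 | Emit | Collected   # emitting 1, emitting 2, collected: 1, collected: 2
    1..2 | Emit | Streamed    # emitting 1, streamed: 1, emitting 2, streamed: 2
    ```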

    by Lembasts at 2013-02-07 13:06:17

    I just figured out one answer to my question. I am working on a solution to create a database update program which will be fed a transaction file in the form of objects.
    My first attempt was to map properties to parameters and use begin, process, end. However, that update cmdlet (advanced function) was specific to the input.
    I want to make the update program more generic, meaning I don't know exactly what properties are being passed; I'll have to enumerate the properties of the incoming object.
    In that case can I only use $input, and not a process block?

    by Lembasts at 2013-02-11 13:48:57

    Enumerating the properties of an incoming object is not that easy!
    I first realised that to use the $input variable, you really have to declare a dummy parameter like:
    [Parameter(ValueFromPipeline=$true)]
    $InputObject

    ...even though you won't be using $InputObject.

    I also decided that $input is not that good, as it waits for the pipeline input to be completed.
    So I tried to use the begin, process and end blocks and tried to enumerate the incoming object:
    $props = $InputObject | gm -MemberType NoteProperty | select -expand name
    The above doesn't work; it says there is no object???
    So how can I enumerate the incoming property names of a pipeline object before I start working on the pipeline objects?
    Is it possible?

    by Lembasts at 2013-02-11 14:40:03

    Thanks to "the lonely administrator", I managed to extract some of his code to achieve what I want – enumerating incoming pipeline object properties.
    Here is my test sample:
    function Test-Function
    {
        [CmdletBinding()]
        param
        (
            [Parameter(ValueFromPipeline=$true)]
            [object]$InputObject
        )

        begin
        {
            $properties = @()
        }

        process
        {
            if (-Not $properties) {
                #get properties for the pipelined object
                $properties = $_ | Get-Member -MemberType Properties | select -expand name
                if ($properties -contains "Company") { write-verbose "incoming object has a company property" }
            }
        }

        end
        {
        }
    }
    get-process | test-function -verbose
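
    As a side note (not from the code above): the property names can also be read straight from the object's psobject.Properties collection, which avoids the Get-Member pipeline. A sketch along the same lines, with my own function name:

    ```powershell
    function Test-Properties
    {
        [CmdletBinding()]
        param
        (
            [Parameter(ValueFromPipeline=$true)]
            [object]$InputObject
        )

        begin
        {
            $properties = @()
        }

        process
        {
            if (-not $properties) {
                # every object carries its property list in psobject.Properties
                $properties = $InputObject.psobject.Properties.Name
                if ($properties -contains 'Company') {
                    write-verbose 'incoming object has a company property'
                }
            }
        }
    }

    get-process | Test-Properties -Verbose
    ```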
