ContinuationToken in Azure Automation Workflow

This topic contains 4 replies, has 2 voices, and was last updated by Tony Bulding 2 weeks ago.

  • Author
    Posts
  • #72500
    Tony Bulding
    Participant

    Greetings,

    I am trying to automate the backup of an Azure blob container to another storage account. The process works fine outside of a workflow; inside the workflow, however, I get an error when I try to pass in the continuation token after the first batch is copied:
    Get-AzureStorageBlob : Cannot bind parameter 'ContinuationToken'. Cannot convert the
    "Microsoft.WindowsAzure.Storage.Blob.BlobContinuationToken" value of type "System.String" to type
    "Microsoft.WindowsAzure.Storage.Blob.BlobContinuationToken".
    My understanding is that the returned values are deserialized by the workflow, and this causes the error. To try to address the issue, I am using InlineScript to attempt to get the value unaltered. Here is the snippet:

        $token = InlineScript {
            $blobs = $using:blobs
            $token = $blobs[$blobs.Count - 1].ContinuationToken
            $token
        }

    This yields the same result as when I do this:

        $token = $blobs[$blobs.Count - 1].ContinuationToken

    Any ideas?
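
    For reference, here is a minimal sketch of the paging loop as it works outside a workflow, assuming the classic Azure storage cmdlets; $srcContext and $containerName are placeholder names I've introduced for illustration:

        $token = $null
        do {
            $blobs = Get-AzureStorageBlob -Container $containerName -Context $srcContext -MaxCount 5000 -ContinuationToken $token
            # ... copy this batch to the destination storage account here ...
            $token = $blobs[$blobs.Count - 1].ContinuationToken
        } while ($token -ne $null)

    Inside the workflow, $token comes back as a deserialized object rather than a live BlobContinuationToken, so the next Get-AzureStorageBlob call cannot bind it.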

  • #72613
    Don Jones
    Keymaster

    Yeah, workflows are special beasts and they run into this kind of problem. Serialization can be an issue, as can not having the necessary class assemblies loaded where the script is running, as can the way workflow persists and passes information around within itself. Is there any way to not use a workflow?

  • #72622
    Tony Bulding
    Participant

    Hello Don,

    Thanks for the response. BTW I had a great time at Summit, looking forward to next year.
    I am open to alternatives to workflow; I was hoping to leverage parallelism to help with the data transfer. I can run the script as a normal PowerShell job and it works; however, the number and size of the blobs (1.5 TB) cause the job to get evicted.
    My next attempt will be with Azure Functions, I'll let you know how that plays out.

    Thanks

  • #72625
    Don Jones
    Keymaster

    Yeah, I'm not sure Azure Functions is what you're after. That's not an administrative automation service; it's for developers to host code that they'd call from within their applications. Kind of like the API Service, actually.

    Could you launch multiple jobs, each one handling only a single blob? That's your parallelization. Unfortunately, if 1.5TB still causes the job to get tossed, you're kind of screwed regardless. The PowerShell interfaces in Azure weren't really designed with mass data transfer in mind, and I'm not sure workflow would be any different.
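
    A hedged sketch of that per-blob approach, assuming the classic Azure storage cmdlets; the container names and the $blobs, $srcContext, and $dstContext variables are placeholders, not from the original post:

        foreach ($blob in $blobs) {
            Start-Job -ScriptBlock {
                param($name, $src, $dst)
                Start-AzureStorageBlobCopy -SrcContainer 'source' -SrcBlob $name -Context $src -DestContainer 'backup' -DestContext $dst
            } -ArgumentList $blob.Name, $srcContext, $dstContext
        }
        Get-Job | Wait-Job | Receive-Job

    Since Start-AzureStorageBlobCopy initiates a server-side copy, each job mainly kicks off the transfer rather than streaming the data through the session itself.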

    To be clear: you're attempting to run this as a workflow in Azure Automation, not as a local workflow in WWF?

  • #72626
    Tony Bulding
    Participant

    Sounds like running the job from within a VM is going to have to remain. (On the bright side, I could dedicate a VM to run more of these sorts of jobs.) I will continue exploring the Azure Functions path, just in case.
    I am running the workflow in Azure Automation, not locally in WWF.
    Thanks.
