How to upload files to AWS using a PowerShell script?


This topic contains 3 replies, has 3 voices, and was last updated 4 months, 1 week ago.

  • #90043

    Participant
    Points: 0
    Rank: Member

    Hi everyone,

    I have a job task to upload files to AWS and then archive those files after the upload is complete.

    The source files are on the following share: //pickup/location/files (looking for files with the .cli and .txt extensions).

    The archive location is //getfiles/whenyourdone/archive.

    I have the AWS access key and secret key.

    I would greatly appreciate it if someone could send me a sample that I can modify.

  • #90059

    Participant
    Points: 320
    Helping Hand
    Rank: Contributor

    1 – Have you installed the AWS PowerShell cmdlets?
    'aws.amazon.com/powershell'
    'docs.aws.amazon.com/powershell/latest/userguide/pstools-using.html'

    ... or are you trying to send to an AWS S3 bucket, or to some file server / FTP server / WebDAV server?
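
    If the module is not installed yet, getting set up could look something like this (a minimal sketch, assuming the consolidated AWSPowerShell module; the profile name and region are placeholders):

    # Install the AWS Tools for PowerShell for the current user.
    Install-Module -Name AWSPowerShell -Scope CurrentUser

    # Store the access/secret keys in a named profile and make it the session default.
    Set-AWSCredential -AccessKey '<your access key>' -SecretKey '<your secret key>' -StoreAs 'upload-profile'
    Initialize-AWSDefaultConfiguration -ProfileName 'upload-profile' -Region 'us-east-1'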

    2 – AWS gives instructions on how to use their API for your use case.
    'docs.aws.amazon.com/AmazonS3/latest/dev/HLuploadFileDotNet.html'
    'docs.aws.amazon.com/powershell/latest/userguide/pstools-s3-upload-object.html'
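
    For a single file, the call described there is roughly the following (bucket, key, and local path are placeholders):

    # Upload one local file to the bucket under the given key.
    Write-S3Object -BucketName 'my-bucket' -Key 'incoming/report.txt' -File 'C:\temp\report.txt'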

    3 – As for PowerShell, you could try something like this (this is the S3 approach):

    'docs.aws.amazon.com/powershell/latest/userguide/pstools-using.html'
    'aaronmedacco.com/blog/post/2017/02/25/powershell-script-for-uploading-a-local-directory-to-an-s3-bucket-on-aws'

    Or this...

    # Requires the AWS Tools for PowerShell (e.g. the AWSPowerShell module) and configured credentials.
    $BucketName = "myS3Bucket"
    $s3Directory = "C:\users\$env:username\documents\s3test"
    $concurrentLimit = 5
    $inProgressFiles = @()

    foreach ($i in Get-ChildItem $s3Directory)
    {
        # Write the file to S3 and add the filename to a collection.
        Write-S3Object -BucketName $BucketName -Key $i.Name -File $i.FullName
        $inProgressFiles += $i.Name

        # Wait before moving on to the next file if there are too many uploads in flight.
        while ($inProgressFiles.Count -gt $concurrentLimit)
        {
            Write-Host "Before: $($inProgressFiles.Count)"

            # Reassign the array, excluding files whose upload to S3 has completed.
            $inProgressFiles = @($inProgressFiles | Where-Object { @(Get-S3Object -BucketName $BucketName -Key $_).Count -eq 0 })

            Write-Host "After: $($inProgressFiles.Count)"

            Start-Sleep -Seconds 1
        }
        Start-Sleep -Seconds 1
    }
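
    And to cover the archive step from the original question, here is a rough sketch along the same lines; the share paths are taken from the question (written as UNC paths), while the bucket name is a placeholder to adjust:

    $bucketName  = "myS3Bucket"
    $sourcePath  = "\\pickup\location\files"          # source share from the question
    $archivePath = "\\getfiles\whenyourdone\archive"  # archive share from the question

    # Pick up only the .cli and .txt files from the source share.
    $files = Get-ChildItem -Path $sourcePath -File | Where-Object { $_.Extension -in ".cli", ".txt" }

    foreach ($file in $files)
    {
        # Upload the file to S3 (Write-S3Object returns once the upload has finished)...
        Write-S3Object -BucketName $bucketName -Key $file.Name -File $file.FullName

        # ...then move it to the archive share.
        Move-Item -Path $file.FullName -Destination (Join-Path $archivePath $file.Name)
    }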

  • #90283

    Participant
    Points: 0
    Rank: Member

    Yes, I'm trying to send to an AWS S3 bucket or some file server / FTP server / WebDAV server.

  • #108113

    Participant
    Points: 1
    Rank: Member

    I have followed this script. We upload a daily MSSQL backup to S3, but the upload consumes the full CPU. The daily backup is around 120 GB. I have set -ConcurrentServiceRequest to "1", but the issue persists. Please help me solve this; I have attached an image for reference.

    [Attachment: powershellCPU_full – screenshot of the CPU usage]
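
    For context, a minimal sketch of how that parameter is typically applied to a single large upload is below; the bucket name and backup folder are placeholders, not values from this thread.

    # Pick the newest .bak from a (hypothetical) backup folder.
    $backup = Get-ChildItem 'D:\MSSQL\Backup' -Filter *.bak |
        Sort-Object LastWriteTime -Descending |
        Select-Object -First 1

    # Limit the multipart upload to a single concurrent service request to reduce CPU load.
    Write-S3Object -BucketName 'my-backup-bucket' `
        -Key ('sql-backups/' + $backup.Name) `
        -File $backup.FullName `
        -ConcurrentServiceRequest 1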

     

The topic ‘How to upload files to AWS using a PowerShell script?’ is closed to new replies.