I have a job task to upload files to AWS and archive those files after the upload completes.
The source files are on a network drive, //pickup/location/files (I'm looking for files with .cli and .txt extensions).
The archive location is //getfiles/whenyourdone/archive.
I have the AWS access key and secret key.
I would greatly appreciate it if someone could send me a sample that I can modify.
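A minimal sketch of the upload-then-archive flow, assuming the AWS Tools for PowerShell module (`Install-Module AWS.Tools.S3`) is installed and credentials are already configured (for example with `Set-AWSCredential`); the bucket name is a placeholder you would replace with your own:

```powershell
Import-Module AWS.Tools.S3

$source     = "\\pickup\location\files"
$archive    = "\\getfiles\whenyourdone\archive"
$bucketName = "my-example-bucket"   # placeholder: substitute your bucket

# Match only the .cli and .txt files in the pickup folder
foreach ($file in Get-ChildItem "$source\*" -Include *.cli, *.txt -File) {
    try {
        # -ErrorAction Stop turns upload failures into catchable errors
        Write-S3Object -BucketName $bucketName -File $file.FullName `
                       -Key $file.Name -ErrorAction Stop
        # Archive the file only after a successful upload
        Move-Item -Path $file.FullName -Destination $archive
    }
    catch {
        Write-Warning "Upload failed for $($file.Name): $_"
    }
}
```

The try/catch keeps a single failed upload from moving the file into the archive or aborting the rest of the batch.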
1 – Have you installed the AWS PowerShell cmdlets?
... or are you trying to send to an AWS S3 bucket, or to some file server / FTP server / WebDAV server?
2 – AWS gives instructions on how to use their API for your use case.
3 – As for PoSH, you could try something like this (this is the S3 approach):
$BucketName  = "myS3Bucket"
$s3Directory = "\\pickup\location\files"

foreach ($i in Get-ChildItem "$s3Directory\*" -Include *.cli, *.txt -File) {
    # Write-S3Object blocks until each upload completes, so there is no
    # need to track in-progress uploads or throttle with Start-Sleep here
    Write-S3Object -BucketName $BucketName -File $i.FullName -Key $i.Name
}
Yes, I'm trying to send to an AWS S3 bucket.
I have followed this script. We upload a daily MSSQL backup to S3, but the upload consumes the full CPU. The daily backup size is around 120 GB. Please help me solve this issue; I have attached an image for your reference. I set -ConcurrentServiceRequest to 1, but the issue persists.
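One thing worth trying alongside -ConcurrentServiceRequest is lowering the uploading process's own scheduling priority, so a large multipart upload cannot starve other workloads on the box. This is a sketch, not a definitive fix; the file path and bucket name are placeholders:

```powershell
Import-Module AWS.Tools.S3

# Drop the current PowerShell process to below-normal CPU priority
$proc = Get-Process -Id $PID
$proc.PriorityClass = [System.Diagnostics.ProcessPriorityClass]::BelowNormal

# Single concurrent request keeps the multipart upload from fanning out
Write-S3Object -BucketName "my-example-bucket" `
               -File "D:\backups\daily.bak" `
               -Key  "daily.bak" `
               -ConcurrentServiceRequest 1
```

The upload will take longer at BelowNormal priority, but Windows will preempt it whenever SQL Server or other processes need the CPU.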