
December 26, 2017 at 8:11 pm

Hi everyone

I have a job task to upload files to AWS and archive those files after the upload is complete.

The source files are on the following share: //pickup/location/files (looking for files with the .cli and .txt extensions).

The archive location is //getfiles/whenyourdone/archive.

I have the AWS access key and secret key.

I would greatly appreciate it if someone could send me a sample that I can modify.

December 27, 2017 at 1:21 am

1 – Have you installed the AWS PowerShell cmdlets?
'aws.amazon.com/powershell'
'docs.aws.amazon.com/powershell/latest/userguide/pstools-using.html'

... or are you trying to send to an AWS S3 bucket, or to some file server / FTP server / WebDAV server?

2 – AWS gives instructions on how to use their API for your use case.
'docs.aws.amazon.com/AmazonS3/latest/dev/HLuploadFileDotNet.html'
'docs.aws.amazon.com/powershell/latest/userguide/pstools-s3-upload-object.html'

3 – As for PoSH, you could try something like this (this is an S3 example):

'docs.aws.amazon.com/powershell/latest/userguide/pstools-using.html'
'aaronmedacco.com/blog/post/2017/02/25/powershell-script-for-uploading-a-local-directory-to-an-s3-bucket-on-aws'

Or this...

$BucketName = "myS3Bucket"
$s3Directory = "C:\users\$env:username\documents\s3test"
$concurrentLimit = 5
$inProgressFiles = @()

foreach ($i in Get-ChildItem $s3Directory)
{
    # Write the file to S3 and add the filename to a collection.
    Write-S3Object -BucketName $BucketName -Key $i.Name -File $i.FullName
    $inProgressFiles += $i.Name

    # Wait before iterating through more files if there are too many concurrent uploads.
    while ($inProgressFiles.Count -gt $concurrentLimit)
    {
        Write-Host "Before: $($inProgressFiles.Count)"

        # Reassign the array, excluding files that have finished uploading to S3.
        $inProgressFiles = @($inProgressFiles | Where-Object { @(Get-S3Object -BucketName $BucketName -Key $_).Count -eq 0 })

        Write-Host "After: $($inProgressFiles.Count)"

        Start-Sleep -Seconds 1
    }
    Start-Sleep -Seconds 1
}
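
To also cover the archive step from the original question (pick up the .cli and .txt files, upload, then move them aside), here is a rough sketch, assuming the AWSPowerShell module (Install-Module -Name AWSPowerShell) and the share paths from the post; the bucket name and credentials are placeholders you would replace:

# Store the access key / secret key once; Write-S3Object picks them up.
Import-Module AWSPowerShell
Set-AWSCredential -AccessKey 'YOUR_ACCESS_KEY' -SecretKey 'YOUR_SECRET_KEY' -StoreAs default

$bucketName  = 'myS3Bucket'                       # placeholder bucket name
$pickupPath  = '\\pickup\location\files'          # source share from the post
$archivePath = '\\getfiles\whenyourdone\archive'  # archive share from the post

# Only .cli and .txt files, as described in the post.
foreach ($file in Get-ChildItem -Path $pickupPath -Include '*.cli','*.txt' -Recurse)
{
    try
    {
        # Upload the file; Write-S3Object returns once the upload completes.
        Write-S3Object -BucketName $bucketName -Key $file.Name -File $file.FullName -ErrorAction Stop

        # Archive the file only after a successful upload.
        Move-Item -Path $file.FullName -Destination $archivePath
    }
    catch
    {
        Write-Warning "Upload failed for $($file.Name): $_"
    }
}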

December 29, 2017 at 8:05 pm

Yes, I'm trying to send to an AWS S3 bucket or some file server / FTP server / WebDAV server.

August 10, 2018 at 4:42 am

I have followed this script. We upload a daily MSSQL backup to S3, but it consumes the full CPU. The daily backup is around 120 GB. Please help me solve this issue; I have attached an image for reference. I set -ConcurrentServiceRequests to 1, but the issue still persists.
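
For reference, the upload is invoked roughly like this (the bucket name and backup path are placeholders; note that the parameter on Write-S3Object is -ConcurrentServiceRequests):

Import-Module AWSPowerShell

# -ConcurrentServiceRequests caps how many service requests run in
# parallel during the multipart upload of a large file; 1 is the
# most conservative setting.
Write-S3Object -BucketName 'myBackupBucket' `
    -Key 'daily/mssql-backup.bak' `
    -File 'D:\Backups\mssql-backup.bak' `
    -ConcurrentServiceRequests 1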

 

[Attachment: powershellCPU_full]