
February 6, 2016 at 4:40 pm

I have the script below in a file called export.ps1, which runs against a text file of user names. The challenge is that I now have several files in the same directory, named file1.txt, file2.txt, ..., filen.txt. I need to do a Get-ChildItem c:\files\*.txt and run the script below against each of them in its own PowerShell script, passing arguments and variables to each separately running script. Any ideas how to accomplish this? Note that $date is a variable that will change, and I will need to pass it to the running scripts.

Add-PSSnapin Microsoft.Exchange.Management.PowerShell.E2010

$users = Get-Content c:\files\file1.txt
$date = "09/01/2015"

foreach ($user in $users) {
    $Mailbox = Get-Mailbox -Identity $user

    # ContentFilter is a string evaluated server-side, so expand $date into it here
    New-MailboxExportRequest -Mailbox $Mailbox -ContentFilter "Received -lt '$date'" -FilePath "\\Server\File1\$Mailbox.pst"
}
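
A parameterized sketch of export.ps1 is below; $FilePath and $Date are placeholder parameter names, so the changing date and the input file can be passed in from outside:

# export.ps1 (parameterized sketch; parameter names are placeholders)
param(
    [string]$FilePath,   # path to one of the fileN.txt user lists
    [string]$Date        # cut-off date, e.g. "09/01/2015"
)

Add-PSSnapin Microsoft.Exchange.Management.PowerShell.E2010

$users = Get-Content $FilePath

foreach ($user in $users) {
    $Mailbox = Get-Mailbox -Identity $user
    New-MailboxExportRequest -Mailbox $Mailbox -ContentFilter "Received -lt '$Date'" -FilePath "\\Server\File1\$Mailbox.pst"
}

# Example invocation: .\export.ps1 -FilePath c:\files\file1.txt -Date "09/01/2015"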

February 6, 2016 at 6:05 pm

Why do you need to call a separate PowerShell script for each file? Why not just wrap a foreach loop around the script you already have?

Add-PSSnapin Microsoft.Exchange.Management.PowerShell.E2010
$date = "09/01/2015"

Get-ChildItem c:\files\*.txt | ForEach-Object {
    $users  = Get-Content $_.FullName
    $folder = $_.BaseName    # assumes one output folder per list, e.g. file1.txt -> \\Server\file1
    foreach ($user in $users) {
        $Mailbox = Get-Mailbox -Identity $user
        New-MailboxExportRequest -Mailbox $Mailbox -ContentFilter "Received -lt '$date'" -FilePath "\\Server\$folder\$Mailbox.pst"
    }
}

February 6, 2016 at 7:07 pm

Running multiple instances will finish faster than running everything serially. The number of users split across the text files can be in the hundreds or maybe thousands.

February 6, 2016 at 7:44 pm

You can use the PSJob cmdlets (i.e. Start-Job), but it would probably be more efficient to use runspaces (https://mcpmag.com/articles/2015/08/06/multithread-your-commands.aspx).
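
As a minimal Start-Job sketch, assuming export.ps1 has been parameterized to take the file path and date (as sketched earlier in the thread), and using a placeholder script path:

# One background job per input file; C:\scripts\export.ps1 is a placeholder path
$date = "09/01/2015"
$jobs = foreach ($file in Get-ChildItem c:\files\*.txt) {
    Start-Job -FilePath 'C:\scripts\export.ps1' -ArgumentList $file.FullName, $date
}

# Wait for all jobs to finish and collect any output
$jobs | Wait-Job | Receive-Job

Each job runs in its own process, so every instance has to load the Exchange snap-in itself, which is why the Add-PSSnapin line needs to stay inside export.ps1.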

February 6, 2016 at 7:52 pm

I am completely new to that approach. I will have to read a little bit more about it.

Thanks