Capturing all Get-Job IDs from running background jobs for SharePoint backup

  • Author
    Posts
    • #252674
      Participant
      Topics: 10
      Replies: 31
      Points: 82
      Rank: Member

      Hi folks. We have developed a script in our team that successfully starts and completes the backup of all our SharePoint on-prem site collections. The backup via PowerShell is successful, but we are trying to capture how long each backup job takes, along with the site URL and site size, and export that to a CSV file. When I kick off my script, which is pasted below, I can execute $running and see PSBeginTime and PSEndTime by executing $running[0] | select *. The problem is that I'm not able to see which site is being backed up: the "Command" property within $running[0] shows the script block we ran, but the $variables in there are not filled in with real values; they only show the variable names (so no help). I found that I can figure out how long a job ran by executing the line below.

      ((Get-Job Job8).PSEndTime.TimeOfDay - (Get-Job Job8).PSBeginTime.TimeOfDay).TotalMinutes

      What I'm having difficulty with is grabbing all the job IDs that were created and are in either the Running or Completed state. I wanted to add them to the empty $backupJobs array. How will I be able to pull the site URL from the running/completed jobs? Thank you for any assistance.

    • #252680
      Participant
      Topics: 0
      Replies: 77
      Points: 343
      Helping Hand
      Rank: Contributor

      How about using the following? Then the job's Name will contain the site.
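
      A minimal sketch of that idea, assuming $sites comes from Get-SPSite and the backup path mentioned elsewhere in this thread (the exact block isn't shown here):

      Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
      $sites = Get-SPSite -Limit All

      foreach ($site in $sites) {
          # Naming the job after the site URL makes Get-Job expose it
          # later through the Name property
          Start-Job -Name $site.Url -ArgumentList $site.Url -ScriptBlock {
              param($url)
              # The snap-in must be loaded inside the job's own session
              Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
              # Make the URL safe to use as a file name
              $filename = ($url -replace '[:/\\]', '_') + '.dat'
              Backup-SPSite -Identity $url -Path "D:\PSBackups\test\Data\$filename" -Force -NoSiteLock
          }
      }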

    • #252683
      Participant
      Topics: 7
      Replies: 557
      Points: 2,122
      Helping Hand
      Rank: Community Hero

      I would probably track the start and end time with each job as properties. Then your end object will include the start/stop times. You could also provide an ID for each job when starting. Can you show where you start the job?
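
      A sketch of what that could look like (names are illustrative, not from the original script): the job itself emits a custom object carrying its timing, so the receiving side doesn't have to inspect PSBeginTime/PSEndTime at all.

      Start-Job -Name $site.Url -ArgumentList $site.Url -ScriptBlock {
          param($url)
          Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
          $start = Get-Date
          $filename = ($url -replace '[:/\\]', '_') + '.dat'
          Backup-SPSite -Identity $url -Path "D:\PSBackups\test\Data\$filename" -Force -NoSiteLock
          $end = Get-Date
          # This object is what Receive-Job returns for the job
          [pscustomobject]@{
              Url             = $url
              Started         = $start
              Ended           = $end
              DurationMinutes = [math]::Round(($end - $start).TotalMinutes, 2)
          }
      }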

    • #252689
      Participant
      Topics: 10
      Replies: 31
      Points: 82
      Rank: Member

      I would probably track the start and end time with each job as properties. Then your end object will include the start/stop times. You could also provide an ID for each job when starting. Can you show where you start the job?

      Hi Doug. The part where we start parsing through the URL link is "Start-Job -InputObject $site.Url -Name $site.Url {…….}".

      The backup starts at "backup-spsite -Identity $url -Path d:\PSBackups\test\Data\$filename -force -NoSiteLock".

      Thank you for replying.
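
      For context, a sketch of how the value actually reaches a job started that way: with -InputObject, the URL travels in through the $input enumerator rather than as a named variable, which is also why the job's Command property shows unexpanded variable names.

      Start-Job -InputObject $site.Url -Name $site.Url -ScriptBlock {
          Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
          # Pull the piped-in value out of the $input enumerator
          $url = @($input)[0]
          $filename = ($url -replace '[:/\\]', '_') + '.dat'
          Backup-SPSite -Identity $url -Path "d:\PSBackups\test\Data\$filename" -Force -NoSiteLock
      }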

    • #252686
      Participant
      Topics: 10
      Replies: 31
      Points: 82
      Rank: Member

      How about using the following? Then the job's Name will contain the site.

      Thank you, AdminofThings45! That did the trick to get the $site.Url value to come up. Would you know how to pull all the job IDs, including those in a Running and/or Completed state?

    • #252704
      Participant
      Topics: 0
      Replies: 77
      Points: 343
      Helping Hand
      Rank: Contributor

      Get-Job | Select -Expand Id will get all of the job IDs
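
      If you only want jobs in particular states, filter on the State property first; a small sketch (the -in operator requires PowerShell 3.0+):

      # All job IDs, regardless of state
      Get-Job | Select-Object -ExpandProperty Id

      # Only jobs that are still running or have completed
      Get-Job |
          Where-Object { $_.State -in 'Running', 'Completed' } |
          Select-Object -ExpandProperty Id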

    • #252707
      Participant
      Topics: 10
      Replies: 31
      Points: 82
      Rank: Member

      Get-Job | Select -Expand Id will get all of the job IDs

      So the backup script created 3 .dat files from 3 site collections, but when I run Get-Job | select -expand Id, I only get 2 job IDs back. I also ran the following and got only 2 jobs listed. Where is the missing 3rd job?

      (Get-Job | where { $_.JobStateInfo.State -ne "" })

      Thank you!

    • #252710
      Participant
      Topics: 10
      Replies: 31
      Points: 82
      Rank: Member

      Changing the JobThreshold to 3 and then running it returns 3 job IDs now.

    • #253157
      Participant
      Topics: 10
      Replies: 31
      Points: 82
      Rank: Member

      Hi folks, so I went ahead and ran a backup against 50 site collections ($sites = get-spsite -limit 50) and set the $jobthreshold limit to 7.

      The backup does complete successfully, but when I run the code block below, I only get 7 job IDs stored in the $hashtable array. Why is the code not capturing all the job IDs? Running get-job | select -ExpandProperty Id also lists only 7 job IDs, nothing more. I'm not sure why all 50 job IDs aren't shown; $alljobs.Count = 7. Thank you for any assistance.

    • #253160
      Participant
      Topics: 10
      Replies: 31
      Points: 82
      Rank: Member

      I should have mentioned in the post above that in PROD we want to cap the job threshold at a maximum of 7 and not go above that. We have about 250 site collections to back up and cannot set the job threshold to 250: with 250 concurrent jobs running in tandem, the load on our single SQL server would cause an outage. Sorry for forgetting to include this.
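
      That throttling would also explain the "missing" IDs: a throttle loop that calls Remove-Job on completed jobs to stay under the limit removes them from what Get-Job can see, so each job's details have to be recorded before removal. A sketch of such a loop, assuming the names from earlier in the thread ($sites, the threshold, the backup block) and a hypothetical helper function:

      function Get-FinishedBackupJobs {
          # Record Id, site URL (the job name), and duration, then clean up.
          # This must happen BEFORE Remove-Job, or the IDs are lost to
          # later Get-Job calls.
          foreach ($job in Get-Job -State Completed) {
              [pscustomobject]@{
                  Id              = $job.Id
                  Url             = $job.Name
                  DurationMinutes = [math]::Round(($job.PSEndTime - $job.PSBeginTime).TotalMinutes, 2)
              }
              Remove-Job -Job $job
          }
      }

      $jobThreshold = 7
      $backupJobs   = @()

      foreach ($site in $sites) {
          # Block until a slot opens under the threshold
          while ((Get-Job -State Running).Count -ge $jobThreshold) {
              Start-Sleep -Seconds 5
          }

          Start-Job -Name $site.Url -ArgumentList $site.Url -ScriptBlock {
              param($url)
              Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
              $filename = ($url -replace '[:/\\]', '_') + '.dat'
              Backup-SPSite -Identity $url -Path "D:\PSBackups\test\Data\$filename" -Force -NoSiteLock
          } | Out-Null

          $backupJobs += Get-FinishedBackupJobs
      }

      # Drain whatever is still in flight once every site has been queued
      Get-Job | Wait-Job | Out-Null
      $backupJobs += Get-FinishedBackupJobs

      $backupJobs | Export-Csv -Path 'D:\PSBackups\backup-report.csv' -NoTypeInformation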

    • #253397
      Participant
      Topics: 10
      Replies: 31
      Points: 82
      Rank: Member

      I figured out the issue. I had to add the -Keep parameter after Receive-Job. I also added a line right after the "Completed Jobs" comment to gather all jobs with { $_.JobStateInfo.State -ne "" } into the $backupJobs array, and added a do/while loop so the script waits until the count of jobs in the Running state equals zero. Then I removed the $allJobs array and sorted $backupJobs by Id with -Unique. All the jobs still had data in the HasMoreData property, so I was able to get all the jobs. I also commented out Remove-Job temporarily. Thanks
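
      Pieced together from that description, the fix might look roughly like this (the surrounding script is assumed):

      # Wait until no jobs are left in the Running state
      do {
          Start-Sleep -Seconds 10
      } while ((Get-Job -State Running).Count -gt 0)

      # Completed Jobs: gather every job Get-Job still knows about
      $backupJobs += Get-Job | Where-Object { $_.JobStateInfo.State -ne '' }

      # -Keep leaves each job's output in place (HasMoreData stays True),
      # so the results can be read again later
      foreach ($job in $backupJobs) {
          Receive-Job -Job $job -Keep
      }

      # Drop duplicates collected across loop iterations
      $backupJobs = $backupJobs | Sort-Object -Property Id -Unique

      # Remove-Job left commented out while verifying the results
      # $backupJobs | Remove-Job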
