Memory consumption using background jobs


This topic contains 6 replies, has 4 voices, and was last updated 6 days, 15 hours ago.

  • #134235

    Participant
    Points: -3
    Rank: Member

    I have a script that gets all the Chromebooks in our school and syncs them to an on-prem SQL table, and I'm trying to make it process the devices in parallel using background jobs. The script seems to work well: it takes the 100 devices that the Google API returns per page and runs a SQL query to add each one to SQL.

    Since each page returns 100 devices, each background job runs 100 queries and then finishes. The problem I'm having is that even though I'm cleaning up all the variables at the end of the job and receiving each completed job, each job is still using 15-20 MB. When I get to 10 jobs, memory usage is almost at 1 GB, and even though I'm throttling the jobs to 10 at a time, finished jobs don't seem to release their RAM, so after 30 minutes I'm at 100% memory usage.

    Source Code Link:
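    The linked source isn't included here, but the workflow described above (one background job per 100-device page, throttled to 10 concurrent jobs) might be sketched roughly like this; `$pages` and `Invoke-DeviceSync` are placeholders for the real script's paging and insert logic:

    ```powershell
    # Hypothetical sketch of the pattern described above, not the actual script.
    $maxJobs = 10
    foreach ($page in $pages) {
        # Wait for a free slot before starting another job
        while ((Get-Job -State Running).Count -ge $maxJobs) {
            Start-Sleep -Milliseconds 500
        }

        Start-Job -ScriptBlock {
            param($devices)
            foreach ($device in $devices) {
                # one SQL query per device (placeholder function)
                Invoke-DeviceSync -Device $device
            }
        } -ArgumentList (, $page.Devices)

        # Drain finished jobs so their output doesn't accumulate
        Get-Job -State Completed | ForEach-Object {
            Receive-Job -Job $_ | Out-Null
            Remove-Job -Job $_
        }
    }

    Get-Job | Wait-Job | Receive-Job | Out-Null
    Get-Job | Remove-Job
    ```

    Even with this cleanup, each Start-Job job is a separate powershell.exe process, which is where the 15-20 MB per job comes from.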

  • #134510

    Participant
    Points: 1,124
    Helping Hand
    Rank: Community Hero

    Start-Job creates a powershell.exe process to execute the code. You can try the PoshRSJob module instead, which uses threading (runspaces) within the current process.

    https://www.powershellgallery.com/packages/PoshRSJob/1.7.4.4
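    A minimal sketch of the same throttled pattern with PoshRSJob (assuming the module is installed via `Install-Module PoshRSJob`; the per-device insert is still a placeholder):

    ```powershell
    Import-Module PoshRSJob

    # Runspace jobs run as threads in the current process instead of
    # spawning a powershell.exe per job, so per-job overhead is much lower.
    $pages | Start-RSJob -Throttle 10 -ScriptBlock {
        foreach ($device in $_.Devices) {
            # placeholder for the real per-device SQL insert
            Sync-Device -Device $device
        }
    }

    Get-RSJob | Wait-RSJob | Receive-RSJob
    Get-RSJob | Remove-RSJob
    ```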

  • #134591

    Participant
    Points: -3
    Rank: Member


    I just gave it a try, and although it helped somewhat, it's still using way too much RAM to be usable. Back to the drawing board for me, I guess.

  • #134615

    Participant
    Points: 56
    Rank: Member

    Maybe try moving the SQL connection code outside the foreach loop. That way you create the connection once instead of once for every single device.
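    That suggestion might look like the following sketch, assuming System.Data.SqlClient and placeholder connection-string, table, and column names: one connection and one parameterized command per job, reused for every device.

    ```powershell
    # Sketch only: server/table/column names are assumptions.
    $conn = New-Object System.Data.SqlClient.SqlConnection(
        'Server=sql01;Database=Devices;Integrated Security=True')
    $conn.Open()
    try {
        $cmd = $conn.CreateCommand()
        $cmd.CommandText = 'INSERT INTO dbo.Chromebooks (SerialNumber, Status) VALUES (@sn, @status)'
        [void]$cmd.Parameters.Add('@sn',     [System.Data.SqlDbType]::NVarChar, 64)
        [void]$cmd.Parameters.Add('@status', [System.Data.SqlDbType]::NVarChar, 32)

        foreach ($device in $devices) {
            # Reuse the prepared command; only the parameter values change
            $cmd.Parameters['@sn'].Value     = $device.SerialNumber
            $cmd.Parameters['@status'].Value = $device.Status
            [void]$cmd.ExecuteNonQuery()
        }
    }
    finally {
        $conn.Dispose()   # closes the connection even if an insert throws
    }
    ```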

  • #134666

    Participant
    Points: 282
    Helping Hand
    Rank: Contributor

    Please check for bottlenecks using the DMVs sys.dm_exec_query_stats and sys.dm_exec_sql_text.
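    For example, a sketch that lists the most expensive recent statements via those DMVs, assuming the SqlServer module's Invoke-Sqlcmd and VIEW SERVER STATE permission (instance/database names are placeholders):

    ```powershell
    $query = @"
    SELECT TOP (10)
           qs.total_logical_reads,
           qs.execution_count,
           st.text AS statement_text
    FROM sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
    ORDER BY qs.total_logical_reads DESC;
    "@

    # Shows which statements are generating the most I/O
    Invoke-Sqlcmd -ServerInstance 'sql01' -Database 'Devices' -Query $query
    ```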

    And use a bulk insert instead of inserting one row at a time; that will drastically reduce the I/O and the run time as well.
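    One way to do that from PowerShell is System.Data.SqlClient.SqlBulkCopy: load the page of devices into a DataTable, then write it in a single batch. A sketch with assumed table/column names:

    ```powershell
    # Build an in-memory table matching the destination schema (assumed)
    $table = New-Object System.Data.DataTable
    [void]$table.Columns.Add('SerialNumber', [string])
    [void]$table.Columns.Add('Status',       [string])
    foreach ($device in $devices) {
        [void]$table.Rows.Add($device.SerialNumber, $device.Status)
    }

    $bulk = New-Object System.Data.SqlClient.SqlBulkCopy(
        'Server=sql01;Database=Devices;Integrated Security=True')
    try {
        $bulk.DestinationTableName = 'dbo.Chromebooks'
        $bulk.WriteToServer($table)   # one round trip instead of 100 INSERTs
    }
    finally {
        $bulk.Close()
    }
    ```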

  • #134889

    Participant
    Points: -3
    Rank: Member

    After some help from the /r/PowerShell community, they pointed me to the dbatools module, which I used to do a bulk insert instead of inserting one record at a time. That solved the problem. See updated code below:
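    The updated code itself isn't shown, but a dbatools bulk insert along those lines might look like this sketch (assuming `Install-Module dbatools`; the instance and table names are placeholders; in current dbatools the command is Write-DbaDbTableData, with Write-DbaDataTable as an older name):

    ```powershell
    Import-Module dbatools

    # Write-DbaDbTableData uses SqlBulkCopy under the hood, so the whole
    # page of devices goes to SQL Server in one batch.
    $devices |
        Select-Object SerialNumber, Status |
        Write-DbaDbTableData -SqlInstance 'sql01' -Database 'Devices' -Table 'dbo.Chromebooks'
    ```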

    • #134933

      Participant
      Points: 282
      Helping Hand
      Rank: Contributor

      That's true, bulk inserts drastically reduce I/O operations and put less strain on resources. For your scenario, this is the ideal approach.

      Thank you.
