Memory consumption using background jobs

This topic contains 6 replies, has 4 voices, and was last updated by a Participant 3 months, 1 week ago.

  • #134235

    Participant (Rank: Member)

    I have a script that gets all the Chromebooks in our school and syncs them to an on-prem SQL table, and I'm trying to make it process the devices in parallel using background jobs. The script seems to work well: it takes the 100 devices that the Google API returns per page and runs a SQL query to add each of them to SQL.

    Since each page returns 100 devices, each background job runs 100 queries and then finishes. The problem I'm having is that even though I'm cleaning up all the variables at the end of the job and receiving each completed job, each job still uses 15-20 MB. When I get to 10 jobs, memory usage is almost at 1 GB, and even though I'm throttling to 10 jobs at a time, a job doesn't seem to release its RAM when it finishes, so after 30 minutes I'm at 100% memory usage.

    Source Code Link:
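
    For readers without the linked source, a rough sketch of the pattern described above, assuming a hypothetical Get-ChromebookPage helper and placeholder SQL details rather than the actual script, would look something like this:

        # Hypothetical sketch only: throttled Start-Job loop, one job per API page.
        foreach ($page in Get-ChromebookPage) {             # one page = up to 100 devices
            # Throttle: wait until fewer than 10 jobs are running
            while ((Get-Job -State Running).Count -ge 10) { Start-Sleep -Seconds 1 }

            Start-Job -ArgumentList (,$page), $connectionString -ScriptBlock {
                param($Devices, $ConnectionString)
                foreach ($device in $Devices) {
                    # run the per-device SQL insert/update here
                }
                Remove-Variable Devices, ConnectionString    # variable clean-up, as described above
            } | Out-Null

            # Receive and remove completed jobs so their output is not retained
            Get-Job -State Completed | Receive-Job | Out-Null
            Get-Job -State Completed | Remove-Job
        }
        Get-Job | Wait-Job | Receive-Job | Out-Null
        Get-Job | Remove-Job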

  • #134510

    Moderator (Rank: Community Hero)

    Start-Job creates a powershell.exe process to execute the code. You can try the PoshRSJob module instead, which uses runspaces (threads) inside the current process.

    https://www.powershellgallery.com/packages/PoshRSJob/1.7.4.4
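
    With PoshRSJob the calling pattern stays close to the built-in job cmdlets; a minimal sketch, assuming the pages are already collected and the per-device work is filled in, might be:

        # Minimal sketch: runspace-based jobs via PoshRSJob instead of child processes.
        Install-Module PoshRSJob -Scope CurrentUser
        Import-Module PoshRSJob

        $pages | Start-RSJob -Throttle 10 -ScriptBlock {
            foreach ($device in $_) {
                # run the per-device SQL work here
            }
        } | Wait-RSJob | Receive-RSJob | Out-Null

        Get-RSJob | Remove-RSJob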

  • #134591

    Participant (Rank: Member)

    Quoting #134510: "Start-Job creates a powershell.exe process to execute the code. You can try the PoshRSJob module instead, which uses runspaces (threads) inside the current process."

    I just gave it a try, and although it helped somewhat, it's still using far too much RAM to be usable. Back to the drawing board for me, I guess.

  • #134615

    Participant (Rank: Member)

    Maybe try moving the SQL connection code outside the foreach loop. That way you create the connection once instead of once for every single device.
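
    In other words, inside each job, something along these lines (table and column names are placeholders):

        # Open one connection per job and reuse it for every device on the page.
        $conn = New-Object System.Data.SqlClient.SqlConnection $ConnectionString
        $conn.Open()

        foreach ($device in $Devices) {
            $cmd = $conn.CreateCommand()
            $cmd.CommandText = 'UPDATE dbo.Chromebooks SET LastSync = GETDATE() WHERE SerialNumber = @sn'
            [void]$cmd.Parameters.AddWithValue('@sn', $device.serialNumber)
            [void]$cmd.ExecuteNonQuery()
            $cmd.Dispose()
        }

        $conn.Close()
        $conn.Dispose()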

  • #134666

    Participant (Rank: Contributor)

    Please check for bottlenecks using the DMVs sys.dm_exec_query_stats and sys.dm_exec_sql_text.

    And use a bulk insert instead of inserting one row at a time; that will drastically reduce the I/O and the elapsed time as well.
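
    For instance, a query like the following (run here with Invoke-Sqlcmd from the SqlServer module; instance and database names are placeholders) lists the most expensive statements:

        # Hypothetical check of the heaviest statements via the two DMVs above.
        $query = "SELECT TOP (10) qs.execution_count, qs.total_worker_time, qs.total_logical_reads, st.text AS statement_text " +
                 "FROM sys.dm_exec_query_stats AS qs CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st " +
                 "ORDER BY qs.total_logical_reads DESC;"

        Invoke-Sqlcmd -ServerInstance 'SQLSERVER01' -Database 'master' -Query $query |
            Format-Table -AutoSize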

  • #134889

    Participant (Rank: Member)

    After some help from the /r/PowerShell community, I was pointed to the dbatools module, which I used to do a bulk insert instead of inserting one record at a time. That solved the problem. See the updated code below:
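
    (The updated script itself isn't reproduced in this thread. A minimal sketch of the dbatools bulk-insert approach, with instance, database, table, and column names as placeholder assumptions, looks roughly like the following.)

        # Rough sketch only: build one set of rows per API page, then bulk insert
        # with dbatools instead of running one INSERT per device.
        Import-Module dbatools

        $rows = foreach ($device in $page) {
            [pscustomobject]@{
                SerialNumber = $device.serialNumber
                Model        = $device.model
                LastSync     = Get-Date
            }
        }

        $rows | Write-DbaDbTableData -SqlInstance 'SQLSERVER01' -Database 'Inventory' -Table 'dbo.Chromebooks' -AutoCreateTable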

    • #134933

      Participant (Rank: Contributor)

      That's true, a bulk insert drastically reduces the I/O operations and has less impact on resources. For your scenario, this is the ideal approach.

      Thank you.

The topic ‘Memory consumption using background jobs’ is closed to new replies.
