Memory consumption /SQL Query

This topic contains 6 replies, has 3 voices, and was last updated by Arestas 1 year, 5 months ago.

  • Author
    Posts
  • #30316
    Arestas
    Participant

    Hello all,

    I need help on two different points.

    Point1:
    Is it normal for a PowerShell script to consume 15 GB of memory when run as a scheduled task, while the same amount of work consumes 2 GB in the PowerShell ISE console?

    Point2:
    How can I get 1000 rows at a time from a database, do what I need to do with them, and when I finish get the next 1000 rows, and so on? Is there a good article about this?

    Regards,
    Arestas

  • #30324
    Jonathan Warnken
    Participant

    For Point 1 – Is it normal? That depends on the script and how it manages the .NET objects in memory. I have had issues with long-running scripts managing large objects where the only way to address the problem was to trigger the .NET garbage collector:

    [GC]::Collect()
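
    A minimal sketch of how this might be applied in a long-running script: process the work in batches, drop the references, and nudge the collector between batches. The batch size and placeholder workload below are illustrative assumptions, not from the original script.

    ```powershell
    # Process items in batches and trigger the .NET garbage collector
    # between batches so memory does not accumulate for the whole run.
    $batchSize = 1000
    $items = 1..100000   # placeholder for the real workload

    for ($i = 0; $i -lt $items.Count; $i += $batchSize) {
        $end   = [Math]::Min($i + $batchSize, $items.Count) - 1
        $batch = $items[$i..$end]

        # ... do the real work on $batch here ...

        # Release the reference, then ask .NET to reclaim the memory now
        $batch = $null
        [System.GC]::Collect()
    }
    ```

    Calling the collector this aggressively has a CPU cost, so it is usually worth doing only once per sizable batch rather than per item.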
  • #30347
    Rob Simmers
    Participant

    I've seen this same behavior before, where I would run a script without issue in ISE, then schedule it as a task and it would just stop working in the middle of the script. I tried processing the steps in stages, clearing variables, etc., and it didn't help. I wish I had known about @Jonathan's tip above, because I was desperate for a solution. I was processing about 30K AD records and it would randomly die around midway through.

  • #30383
    Arestas
    Participant

    I will try Jonathan's option. (It's the only option I have :P) Should I use it at the end of each function?

    Thanks for the fast answers.

    Arestas

  • #30386
    Jonathan Warnken
    Participant

    You would need to look at your script to determine what is grabbing the memory and when it can be released, and add the collector call after that point. This is not going to fix the issue outright, just make it better.
    Here is a demo to see the impact:

    Write-Output "Before Creating Large Object"
    Get-Process powershell* | Format-Table -AutoSize

    # Build a large collection of small objects, reporting progress
    # once per 1000 iterations.
    $sb = {
        $count = 0
        for ($i = 1; $i -le 100000; $i++) {
            $count++
            if ($count -ge 1000) {
                Write-Progress -Activity "Building Big Things" -Status "$(($i/100000)*100)% Complete:" -PercentComplete (($i/100000)*100)
                $count = 0
            }
            # Each object emitted here accumulates in $x below
            New-Object PSObject -Property @{Dummy = $i}
        }
    }

    $x = & $sb
    Write-Output "After Creating Large Object"
    Get-Process powershell* | Format-Table -AutoSize

    # Removing the variable drops the reference, but the memory is not
    # returned to the OS until the garbage collector actually runs.
    Remove-Variable x
    Write-Output "After Removing Large Object"
    Get-Process powershell* | Format-Table -AutoSize
    Start-Sleep -Seconds 1
    Write-Output "Before Cleanup"
    Get-Process powershell* | Format-Table -AutoSize
    [System.GC]::Collect()
    Write-Output "After Cleanup"
    Get-Process powershell* | Format-Table -AutoSize
  • #30387
    Jonathan Warnken
    Participant

    Also for point 2
    That is called paging, and it is beyond my ability to mock up right now. You can see some examples here: http://social.technet.microsoft.com/wiki/contents/articles/23811.paging-a-query-with-sql-server.aspx
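
    As a rough sketch, paging against SQL Server is typically done with the OFFSET/FETCH clause (SQL Server 2012+) inside a loop. The server, database, table, and column names below are illustrative assumptions, and Invoke-Sqlcmd requires the SqlServer (or older SQLPS) module.

    ```powershell
    # Fetch 1000 rows at a time, process them, then fetch the next 1000.
    $pageSize = 1000
    $offset   = 0

    do {
        # OFFSET/FETCH requires a deterministic ORDER BY to page reliably
        $query = @"
    SELECT Id, Name
    FROM dbo.MyTable
    ORDER BY Id
    OFFSET $offset ROWS FETCH NEXT $pageSize ROWS ONLY;
    "@
        $rows = @(Invoke-Sqlcmd -ServerInstance 'MyServer' -Database 'MyDb' -Query $query)

        foreach ($row in $rows) {
            # ... process each row here ...
        }

        $offset += $pageSize
    } while ($rows.Count -eq $pageSize)   # a short page means we reached the end
    ```

    Combined with the garbage-collection tip above for Point 1, this keeps only one page of rows in memory at a time instead of the whole result set.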

  • #30410
    Arestas
    Participant

    Thanks for all the help, Jonathan!
