
February 15, 2018 at 8:30 pm

Hello All,

I have been playing with part of a script I am writing, and I am stuck on a bottleneck.

I am working on getting a list of permissions on shared folders. This is roughly where I started:

Get-ChildItem "\\$FileServer\$share" -Directory -Recurse | Select-Object -ExpandProperty FullName -OutVariable DirectoryPaths | Out-Null

I found that this works, but on a file server holding a large number of shares, Get-ChildItem is very slow. So I changed my code to use:

$DirectoryPaths = [System.IO.Directory]::EnumerateDirectories("\\$FileServer\$share", '*.*', 'AllDirectories')

The above line is way faster. The issue is working with the variable $DirectoryPaths.
The next part of my script is a foreach loop. Normally, if I wanted to walk through the loop a line at a time, I could close the loop right away, "foreach ($DirectoryPath in $DirectoryPaths) {}", run it, and thus populate the variable $DirectoryPath. However, I seem to have moved the bottleneck from Get-ChildItem to my loop, as it takes a long time to get through the loop even if there is no code to execute inside it.
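
For reference, the loop I am testing with looks roughly like this; the Get-Acl call is just a stand-in for the real per-directory work, not my actual code:

foreach ($DirectoryPath in $DirectoryPaths) {
    # Stand-in for the real per-directory work; even with this line removed,
    # the loop takes just as long to get through
    Get-Acl -Path $DirectoryPath | Select-Object -ExpandProperty Access
}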

The variable $DirectoryPaths is now of the following type:

IsPublic IsSerial Name                             BaseType
-------- -------- ----                             --------
False    False    FileSystemEnumerableIterator`1   System.IO.Iterator`1[System.String]

To define "a long time": days, not just a few extra seconds. If someone has a better idea of how to get this directory listing, or how to deal with this variable type, I would be very grateful.

Thanks,
Scott

March 14, 2018 at 6:48 pm

You're dealing with a scripting language; loops are always a bottleneck, especially when you're working with zillions of objects. A better approach, if possible, would be to push the script via Invoke-Command and let it run on the remote machine rather than over SMB. But zillions of objects is still zillions of objects; PowerShell just isn't optimized for maximum speed in that scenario.
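
Roughly along these lines; this is just a sketch, assuming Remoting is enabled on the file server, that "D:\Shares\$share" is a placeholder for the local path behind the share (adjust to your environment), and that Get-Acl stands in for whatever per-directory work you actually need:

$results = Invoke-Command -ComputerName $FileServer -ScriptBlock {
    param($LocalSharePath)

    # Enumerate against the local disk path on the server instead of the UNC path over SMB
    foreach ($dir in [System.IO.Directory]::EnumerateDirectories($LocalSharePath, '*', 'AllDirectories')) {
        # Stand-in for the real per-directory work
        Get-Acl -Path $dir | Select-Object -Property PSPath, Owner, AccessToString
    }
} -ArgumentList "D:\Shares\$share"   # placeholder local path; substitute the share's real location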