move files from very large folder in batches


This topic contains 3 replies, has 4 voices, and was last updated 2 years, 2 months ago.

  • Author
  • #60847

    Points: 28
    Rank: Member

    I need to move 1,000 files per hour out of a very large folder (600,000 files). Doing a Get-ChildItem on the folder takes 10 minutes. Is there a way to do this in 50-file batches until it reaches 1,000?

    $FileLimit = 1000
    # Destination for files
    $DropDirectory = "c:\targetFolder\"
    $PickupDirectory = Get-ChildItem -Path "c:\sourceFolder"
    # I think the line below will perform better than Get-ChildItem
    #$PickupDirectory = [System.IO.Directory]::EnumerateFiles("C:\sourceFolder")
    $Counter = 0
    foreach ($file in $PickupDirectory) {
        if ($Counter -ge $FileLimit) { break }
        $Destination = Join-Path $DropDirectory $file.Name
        Move-Item -Path $file.FullName -Destination $Destination
        $Counter++
    }
  • #60849

    Points: 1,811
    Helping HandTeam Member
    Rank: Community Hero

    Personally I'd use Robocopy for this; it's much better optimized for batch operations. Your use of foreach doesn't help. You might get a better result by using Select-Object to limit the number of files returned to the first 1,000, and then piping directly to Move-Item.
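    A sketch of both suggestions, using the paths from the original post (the robocopy /MOV switch moves files instead of copying them; exact tuning switches are left out):

```powershell
# Robocopy with /MOV moves files instead of copying them
robocopy C:\sourceFolder C:\targetFolder /MOV

# Or, staying in PowerShell: -File skips directories, and Select-Object -First 1000
# stops the upstream enumeration once 1,000 files have come through the pipeline
Get-ChildItem -Path C:\sourceFolder -File |
    Select-Object -First 1000 |
    Move-Item -Destination C:\targetFolder
```

    Since PowerShell 3.0, Select-Object -First terminates the upstream pipeline early, so Get-ChildItem does not have to enumerate all 600,000 entries.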

    But within PowerShell, no, there's no built-in way of breaking a collection into evenly sized chunks. You could instead consider a Workflow, which has parallel execution options and manages its own throttling, or get into parallel jobs or something similar. But you'd still need to devise a way of chunking the objects.
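    A minimal sketch of one way to chunk, assuming the paths, batch size, and hourly limit from the original post, and using [System.IO.Directory]::EnumerateFiles so the 600,000 entries are enumerated lazily rather than collected up front:

```powershell
$batchSize = 50
$fileLimit = 1000
$moved     = 0
$batch     = [System.Collections.Generic.List[string]]::new()

# EnumerateFiles yields paths one at a time instead of building the full list
foreach ($path in [System.IO.Directory]::EnumerateFiles('C:\sourceFolder')) {
    $batch.Add($path)
    # Flush the batch every 50 files, stopping once the hourly limit is reached
    if ($batch.Count -eq $batchSize) {
        $batch | Move-Item -Destination 'C:\targetFolder'
        $moved += $batch.Count
        $batch.Clear()
        if ($moved -ge $fileLimit) { break }
    }
}
```

    EnumerateFiles returns plain path strings rather than FileInfo objects, which is why the batch list is typed as string and piped straight to Move-Item.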

  • #60850

    Points: 5
    Rank: Member

    If you insist on using Get-ChildItem, you could split the files up alphabetically with -Filter (-Filter a*, then -Filter b*, and so on). For a Get-ChildItem on my Windows directory this cut execution time by at least 50% (but that's with subdirectories and access-denied problems, so gains on a folder containing only files to which you are sure to have access might be greater).
    If you pipe that result to Select-Object -First 1000 you may reach acceptable execution times, though to be honest I have never tried this on a directory with 600,000 files in it.
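    A sketch of that approach, with the paths assumed from the original post ([char]'a'..[char]'z' produces the character codes, which are cast back to letters for the filter):

```powershell
# One Get-ChildItem per leading letter, each with a cheap provider-side -Filter
$files = foreach ($code in [char]'a'..[char]'z') {
    Get-ChildItem -Path C:\sourceFolder -Filter "$([char]$code)*" -File
}
$files | Select-Object -First 1000 | Move-Item -Destination C:\targetFolder
```

    Note that files whose names start with digits or other characters would need extra filters, and collecting everything into $files before selecting gives up some of the early-exit benefit of Select-Object -First.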

  • #60853

    Points: 0
    Rank: Member

    I see many problems in your script, but I'd better avoid mentioning them. The important thing is that the greatest throughput generally requires minimal scripting code. Thus, I suggest this:

    Move-Item c:\SourceFolder\* c:\TargetFolder  # use -WhatIf , if you want to check it's alright

    That's it, nothing more. You get what you get, and if that suffices, then it's OK.

    Note that you'll need quotes around the source/destination paths if there are one or more spaces in their names.
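    For example, with spaces in the (hypothetical) folder names:

```powershell
Move-Item "C:\Source Folder\*" "C:\Target Folder"
```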

The topic ‘move files from very large folder in batches’ is closed to new replies.
