File copy questions


This topic contains 3 replies, has 3 voices, and was last updated 9 months, 2 weeks ago.

  • #92501
    Jon

    Participant
    Points: 24
    Rank: Member

    I was transferring around 2 TB of data (probably 20 million small files) from one server to another, and of course we had a network outage when I was about 80% done. With this many small files it took a very long time, so I don't want to start over from scratch. To avoid a complete restart, I was thinking of doing something like this:

    $filestoexclude = Get-ChildItem C:\path -Recurse
    
    Copy-Item "\\server1\path" -Destination C:\path -Recurse -Exclude $filestoexclude
    

    Just wondering if anyone has any critique or a better way to do this, thanks!

  • #92513

    Participant
    Points: 135
    Helping Hand
    Rank: Participant

    Use robocopy! It's made for this. 😉

    • #92521
      Jon

      Participant
      Points: 24
      Rank: Member

      Looks like the /XO and /XN options should take care of this. Thanks!
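
      For reference, a resume run along those lines might look like the sketch below. The paths are the ones from the original post; the retry, logging, and thread-count values are only illustrative, and robocopy's default behavior already skips files that exist at the destination with the same size and timestamp, so re-running it effectively resumes the copy:

      ```powershell
      # /E    copy subdirectories, including empty ones
      # /XO   exclude files older than the destination copy
      # /MT   multithreaded copy (helps a lot with millions of small files)
      # /R /W retry count and wait time on failures
      robocopy \\server1\path C:\path /E /XO /MT:16 /R:2 /W:5 /LOG:C:\robocopy.log /TEE
      ```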

  • #92527

    Participant
    Points: 0
    Rank: Member

    Since the previous attempt was aborted by a network outage, you should verify that the files already copied to the destination match the source, rather than assuming none were corrupted by the disruption. In PowerShell you can loop through each file on the source, compare it to its counterpart at the destination, and copy it only if it differs. Otherwise, ROBOCOPY has this functionality via the /MIR argument if you do not require the use of PowerShell.
    PowerShell:
    http://www.tomsitpro.com/articles/powershell-sync-folders,2-879.html

    ROBOCOPY:
    https://docs.microsoft.com/en-us/windows-server/administration/windows-commands/robocopy
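
    A minimal sketch of that compare-and-copy loop in PowerShell might look like this. The `$source` and `$dest` values are placeholders for your actual paths, and hashing every file is slow over a network share, but it does catch files that were truncated mid-copy by the outage:

    ```powershell
    $source = '\\server1\path'   # placeholder source share
    $dest   = 'C:\path'          # placeholder destination root

    Get-ChildItem -Path $source -Recurse -File | ForEach-Object {
        # Rebuild the file's path relative to the source root.
        $relative = $_.FullName.Substring($source.Length).TrimStart('\')
        $target   = Join-Path $dest $relative

        $needsCopy = -not (Test-Path $target)
        if (-not $needsCopy) {
            # File exists at the destination; compare hashes to detect
            # a partially written or corrupted copy.
            $needsCopy = (Get-FileHash $_.FullName).Hash -ne (Get-FileHash $target).Hash
        }

        if ($needsCopy) {
            New-Item -ItemType Directory -Path (Split-Path $target) -Force | Out-Null
            Copy-Item -Path $_.FullName -Destination $target -Force
        }
    }
    ```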

The topic ‘File copy questions’ is closed to new replies.