File copy questions

This topic contains 3 replies, has 3 voices, and was last updated by Michael 3 months, 2 weeks ago.

  • #92501

    Jon
    Participant

    I was transferring around 2TB of data (probably 20 million small files) from one server to another, and of course we had a network outage when I was about 80% done. With this many small files it took a loooooong time, so I really don't want to have to completely start over. To avoid that, I was thinking of doing something like this:

    $filestoexclude = Get-ChildItem C:\path -Recurse
    
    Copy-Item "\\server1\path" -Destination C:\path -Recurse -Exclude $filestoexclude
    

    Just wondering if anyone has any critique or a better way to do this, thanks!

  • #92513

    Olaf Soyk
    Participant

    Use robocopy! It's made exactly for this. 😉

    • #92521

      Jon
      Participant

      Looks like the /xo /xn options should take care of this. Thanks!
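
      A minimal sketch of the kind of command I have in mind (server and path names are just the placeholders from my first post, switches to taste). By default robocopy skips files it classifies as "Same" (identical size and timestamps on both sides), so a re-run largely picks up where the last one stopped:

      # Resume-style run over the same share (placeholder paths)
      #   /E         copy subdirectories, including empty ones
      #   /Z         restartable mode, in case the network drops again
      #   /R:2 /W:5  retry twice, waiting 5 seconds between retries
      #   /MT:16     multithreaded copy, which helps with millions of small files
      robocopy \\server1\path C:\path /E /Z /R:2 /W:5 /MT:16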

  • #92527

    Michael
    Participant

    Since the previous attempt was aborted by a network outage, you should make sure that the files which were mid-copy at the destination actually match the source and aren't a corrupted form left behind by the disruption. You can loop through each file on the source, compare it to the destination, and only copy it if it differs (see the sketch after the links below). Otherwise, ROBOCOPY has this functionality via the /MIR argument if you do not require the use of PowerShell.
    PowerShell:
    http://www.tomsitpro.com/articles/powershell-sync-folders,2-879.html

    ROBOCOPY:
    https://docs.microsoft.com/en-us/windows-server/administration/windows-commands/robocopy
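
    A minimal sketch of that compare-and-recopy loop, assuming the example paths from the original post (\\server1\path as source, C:\path as destination) and using Get-FileHash to detect mismatched copies. With ~20 million files this will be slow, because every file has to be read on both sides:

    $source      = '\\server1\path'
    $destination = 'C:\path'

    Get-ChildItem -Path $source -Recurse -File | ForEach-Object {
        # Work out the matching path under the destination
        $relative = $_.FullName.Substring($source.Length).TrimStart('\')
        $target   = Join-Path $destination $relative

        if (-not (Test-Path $target)) {
            # Never copied at all - copy it now
            New-Item -ItemType Directory -Path (Split-Path $target) -Force | Out-Null
            Copy-Item -Path $_.FullName -Destination $target
        }
        elseif ((Get-FileHash $_.FullName).Hash -ne (Get-FileHash $target).Hash) {
            # Present but different from the source - assume the copy was interrupted
            Copy-Item -Path $_.FullName -Destination $target -Force
        }
    }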
