Backup SP Online document library locally

This topic contains 5 replies, has 4 voices, and was last updated by Joey 5 months ago.

  • Author
    Posts
  • #35364

    Scott Bunting
    Participant

    We currently use third-party software to back up our SP document libraries locally to a NAS. It does so through a mapped drive (WebDAV), which is problematic for many reasons.

    Should I be able to create a PowerShell script that connects to a document library and copies the files locally, and then on subsequent runs copies only the files that have changed? I'm looking to keep a synced local copy of the SP document library.

    I know how to connect to the SP Online site with PowerShell, but once connected, then what? Can I use Robocopy to copy the documents? If so, how does Robocopy address the document library once connected, by URL?

    I appreciate any thoughts on the topic.

    Thank you

  • #35366

    Bob McCoy
    Participant

    Have you looked at Robocopy? It comes with Windows and will save you a lot of reinventing the wheel.
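
    For example, with the document library reachable through a mapped WebDAV drive, a single Robocopy call can mirror it to the NAS and will only copy changed files on later runs. This is just a rough sketch; the drive letter, NAS path, and log path below are placeholders:

    # Z: = the WebDAV-mapped document library, \\nas\SPBackup = the NAS target (both placeholders)
    robocopy Z:\ \\nas\SPBackup /MIR /FFT /R:2 /W:5 /LOG+:C:\Logs\sp-backup.log

    /MIR mirrors the source (including deletions), /FFT uses looser FAT-style timestamp comparison (which tends to help over WebDAV), and /R and /W keep retries short.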

  • #35377

    Scott Bunting
    Participant

    Yes, I was thinking I could use Robocopy once connected. I'm wondering if anyone has done it before with SP Online doc libraries and is aware of any gotchas. I'll try to hack my way through it this weekend and see what I can come up with.

  • #35403

    Ryan Yates
    Participant

    I would use the OfficeDevPnP PowerShell cmdlets and call the Get-SPOFile cmdlet inside a foreach loop, with an if statement to pick out only the recently changed files.

    An example would be:

    $downloadLocation = 'C:\DownloadLocation'
    $date = (Get-Date).AddDays(-1)            # only files changed in the last day
    $ctx = Get-SPOContext
    $list = Get-SPOList -Identity "ListName"

    # Load the files in the library's root folder through the client context
    $files = $list.RootFolder.Files
    $ctx.Load($files)
    $ctx.ExecuteQuery()

    foreach ($file in $files) {
        if ($file.TimeLastModified -gt $date) {
            Get-SPOFile -ServerRelativeUrl $file.ServerRelativeUrl -Path $downloadLocation -Filename $file.Name
        }
    }

    That will do what you're looking for. The OfficeDevPnP cmdlets can be downloaded from the PowerShell Gallery with the following command:

    Install-Module OfficeDevPnP.PowerShell.V16.Commands
    

    or directly from GitHub: https://github.com/OfficeDev/PnP-PowerShell
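
    Note that the script above assumes you're already connected to the site. With this module, that should be something along the lines of the following (the site URL is a placeholder):

    # Placeholder URL; pass credentials with -Credentials as needed
    Connect-SPOnline -Url https://yourtenant.sharepoint.com/sites/yoursite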

    Hopefully this will be useful for you.

  • #68877

    Joey
    Participant

    I know this is an old post, but I ran into some errors when I tried to implement it. I've gotten to the point where I can download all files in the root folder. However, I'd like to be able to download all files in the root folder and all subfolders. I'm hoping you guys might have some pointers. Any help would be appreciated.

    $downloadLocation = 'C:\DownloadLocation'
    $date = (Get-Date).AddDays(-1)
    $ctx = Get-SPOContext
    $files = (Get-PNPList -Identity "ListName").RootFolder.Folders
    $ctx.Load($files)
    $ctx.ExecuteQuery()
    foreach ($file in $files) {
        if ($file.TimeLastModified -gt $date) {
            Get-PNPFile -ServerRelativeUrl $file.ServerRelativeUrl -Path $downloadLocation -Filename $file.Name -AsFile
        }
    }

    (Note that I went to the GitHub repository and followed the directions there to load the module.)

  • #69051

    Joey
    Participant

    Edit: I don't know how to edit my post, so I'm just double posting.

    The question is the same as above: I can download all the files in the root folder, but I'd like to download the files in all subfolders as well. Any pointers would still be appreciated.

    $downloadLocation = 'C:\DownloadLocation'
    $date = (Get-Date).AddDays(-1)
    $ctx = Get-SPOContext
    $files = (Get-PNPList -Identity "ListName").RootFolder.Files
    $ctx.Load($files)
    $ctx.ExecuteQuery()
    foreach ($file in $files) {
        if ($file.TimeLastModified -gt $date) {
            Get-PNPFile -ServerRelativeUrl $file.ServerRelativeUrl -Path $downloadLocation -Filename $file.Name -AsFile
        }
    }

    Thanks again!
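
    One possible direction for the subfolders, sketched here untested and assuming the same cmdlets and CSOM objects used above (Get-SPOContext, Get-PNPList, Get-PNPFile, and the Files/Folders collections on each folder), would be to wrap the download in a recursive function:

    # Untested sketch: walk the folder tree and download recently changed files.
    # "ListName" and the download path are placeholders; assumes an existing PnP connection.
    $downloadLocation = 'C:\DownloadLocation'
    $date = (Get-Date).AddDays(-1)
    $ctx = Get-SPOContext

    function Save-SPFolder {
        param ($Folder, $LocalPath)

        # Load this folder's files and subfolders through the client context
        $ctx.Load($Folder.Files)
        $ctx.Load($Folder.Folders)
        $ctx.ExecuteQuery()

        # Download files changed since the cut-off date
        foreach ($file in $Folder.Files) {
            if ($file.TimeLastModified -gt $date) {
                Get-PNPFile -ServerRelativeUrl $file.ServerRelativeUrl -Path $LocalPath -Filename $file.Name -AsFile
            }
        }

        # Mirror the folder structure locally and recurse into each subfolder
        foreach ($subFolder in $Folder.Folders) {
            $subPath = Join-Path $LocalPath $subFolder.Name
            if (-not (Test-Path $subPath)) { New-Item -ItemType Directory -Path $subPath | Out-Null }
            Save-SPFolder -Folder $subFolder -LocalPath $subPath
        }
    }

    $rootFolder = (Get-PNPList -Identity "ListName").RootFolder
    $ctx.Load($rootFolder)
    $ctx.ExecuteQuery()
    Save-SPFolder -Folder $rootFolder -LocalPath $downloadLocation

    Note that the recursion will also descend into system folders such as Forms, so those may need to be skipped explicitly.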
