Author Posts

April 13, 2015 at 6:44 am

I'm trying to move any files that have not been accessed in two years to lower-tier storage, and I'm not sure of the best way to make the moves. I would like to keep the directory structure intact. I have a feeling I should be using robocopy, but I'm not sure of the correct way to pipe the results of my script into a robocopy script.

This is the line I'm using to find the files.

$cutOffDate = (Get-Date).AddDays(-730)
Get-ChildItem -Path '\\file-server\s$\files\' -Recurse -File | Where-Object {$_.LastAccessTime -le $cutOffDate} | Select-Object -Property FullName,LastAccessTime | Out-File -FilePath "C:\Temp\FilesLastAccessed.txt"

Any pointers on how to dump the results into a robocopy script or achieve my intended result?
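One way to bridge the two is to skip the text file entirely and call robocopy per file from the same pipeline, rebuilding each file's relative path so the directory structure carries over. This is only a sketch — the destination share `\\archive-server\tier2` is a made-up placeholder, and you'd want to test on a small subtree first:

[code]
$cutOffDate = (Get-Date).AddDays(-730)
$source = '\\file-server\s$\files'
$dest   = '\\archive-server\tier2'   # hypothetical destination - substitute your tier-2 path

Get-ChildItem -Path $source -Recurse -File |
    Where-Object { $_.LastAccessTime -le $cutOffDate } |
    ForEach-Object {
        # Rebuild the path relative to the source root so the folder structure is preserved
        $relativeDir = $_.DirectoryName.Substring($source.Length).TrimStart('\')
        $targetDir   = if ($relativeDir) { Join-Path $dest $relativeDir } else { $dest }

        # /mov moves the file (deletes the source copy after a successful copy)
        robocopy $_.DirectoryName $targetDir $_.Name /mov
    }
[/code]

Spawning robocopy once per file is slow over a large share, but it gives you exact control over which files move; the /minlad approach in the reply below lets robocopy do the date filtering itself in one pass.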

April 13, 2015 at 8:02 am

The following would do what you want (the /e switch is needed to recurse into subfolders and preserve the directory structure), but it would fail on any file that is open or locked:
[code]ROBOCOPY C:\SourceFolder C:\destination /e /move /minlad:730[/code]
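Before committing to the move, it's worth a dry run: robocopy's /l switch lists what the command *would* do without copying or deleting anything, which makes it easy to sanity-check the /minlad filter against the PowerShell count. A sketch, using the same placeholder paths:

[code]
ROBOCOPY C:\SourceFolder C:\destination /e /move /minlad:730 /l
[/code]

Comparing this listing against the output of the Get-ChildItem script should show whether the two are filtering on the same files before anything actually moves.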

April 13, 2015 at 8:09 am

Keep in mind that LastAccessed* information isn't always what you want. When Antivirus programs scan files, it can change those values for instance. Typically, you want to use the LastModified properties for comparisons that indicate the last time something was changed in a file.
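There's a further wrinkle with last-access times: on NTFS volumes from Windows Vista / Server 2008 onward, last-access updates are disabled by default for performance, so LastAccessTime may simply be stale. You can check the setting on the file server itself with fsutil:

[code]
fsutil behavior query disablelastaccess
[/code]

A reported value of 1 means the volume is not updating last-access timestamps on reads, in which case filtering on LastAccessTime is effectively filtering on whenever the stamp was last written, and LastWriteTime may be the more trustworthy property.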

April 13, 2015 at 8:40 am

That did not work, and I can't see why. Very strange indeed.

When I run my PowerShell script, I get over 1,600 results, which is even more puzzling.

April 13, 2015 at 8:41 am

Keep in mind that LastAccessed* information isn't always what you want. When Antivirus programs scan files, it can change those values for instance. Typically, you want to use the LastModified properties for comparisons that indicate the last time something was changed in a file.

While I would normally agree with you, we have a large number of files that are accessed but not edited, and moving them could break some processes.