Help with recursively deleting old files with a twist


    by rreadenour at 2013-03-04 12:31:58

    Hi all. I am trying to delete files and folders older than 2 days from a folder structure that includes various users' names.
    For example:
    D:\saswork\user1\folder1
    D:\saswork\user2\folder1
    D:\saswork\user3\folder1, etc.

    I need to delete the folder1 folder and all files and folders under it that are 2 days old or older.

    This is what I have so far. It will delete files and folders older than 2 days, but I can't figure out how to get around the different userid-named folders.
    Can anyone help?
    Thanks!

    Function RemoveOldFilesAndFolders
    {
        param ([string]$TargetFolder, [int]$DaysOld)

        $CurrentDate  = Get-Date
        $DateToDelete = $CurrentDate.AddDays(-$DaysOld)

        # Delete anything under the target that hasn't been written to since the cutoff.
        Get-ChildItem $TargetFolder -Recurse |
            Where-Object { $_.LastWriteTime -lt $DateToDelete } |
            Remove-Item -Recurse
    }

    RemoveOldFilesAndFolders "d:\saswork" 2
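
    I did wonder whether a wildcard could stand in for the user folder, something like the untested sketch below (it reads "delete folder1" literally and removes each user's folder1 wholesale when the folder itself hasn't been written to in 2 days), but I'm not sure it's the right approach:

    $DateToDelete = (Get-Date).AddDays(-2)
    # Get-Item with a wildcard resolves every user's folder1; note that a folder's
    # LastWriteTime only changes when its direct children change, which may or may
    # not be the age check I actually want.
    Get-Item "D:\saswork\*\folder1" |
        Where-Object { $_.LastWriteTime -lt $DateToDelete } |
        Remove-Item -Recurse -Force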

    by rholloman at 2013-03-04 13:10:18

    I had the task of cleaning up the "Home" folders of previous employees in my company that hadn't been written to in 90 days. These folders were spread across different servers and were many, many levels deep. So deep, in fact, that many paths to the deepest directory exceeded the 255-character limit that Windows needs to stay happy on the command line. The solution I came up with renames every directory to a random integer between 1 and 9999, which gets rid of the long directory names as well as the spaces embedded in them. It then sorts the directory list from deepest path to shallowest and deletes recursively.

    This is what I came up with. Let me know if you have any questions.

    $Now = Get-Date
    $Days = 90
    $LastWrite = $Now.AddDays(-$Days)

    # Text file listing the root folders to sweep, one path per line
    # (the actual value was omitted in the original post).
    [string]$LocationFile =
    $LocationList = Get-Content $LocationFile

    $min = 1
    $max = 9999

    $LocationList | ForEach-Object {
        # Top-level folders under this root that haven't been written to since the cutoff
        Get-ChildItem $_ | Where-Object { $_.PSIsContainer -and $_.LastWriteTime -lt $LastWrite } |
            ForEach-Object {
                # Rename every subdirectory (deepest path first) to a short random number,
                # stripping out the long names and embedded spaces before the delete.
                Get-ChildItem $_.PSPath -Recurse |
                    Where-Object { $_.PSIsContainer } |
                    ForEach-Object { $_.FullName } |
                    Sort-Object -Property Length -Descending |
                    ForEach-Object {
                        Write-Host $_
                        $Item     = Get-Item $_
                        $PathRoot = $Item.FullName | Split-Path
                        $NewName  = Get-Random -Minimum $min -Maximum $max   # -Maximum is exclusive, so 1..9998
                        $NewPath  = $PathRoot | Join-Path -ChildPath $NewName
                        Rename-Item -Path $Item.FullName -NewName $NewPath
                    }
                # With the tree flattened to short names, remove the whole top-level folder.
                Remove-Item $_.PSPath -Force -Recurse
            }
    }

    Thanks,
    Robbie

    by mjolinor at 2013-03-04 13:24:45

    This part is a little confusing:

    "I need to delete the folder1 folder and all files and folders under it that are 2 days old or older. "

    If you delete that folder then everything under it, regardless of how old it is, goes with it.

    by scottbass at 2013-03-04 19:07:54

    I'm working on pretty much the same thing, but taking a different approach than many of the "remove files older than X" scripts I've seen.

    IMO, the problem statement can be split as follows:

    1) Get a desired collection of files
    2) Do something with them

    If you keep #1 and #2 separate, then #1 can be useful in a number of scenarios. In my case, I want to archive (7-zip) the files, then delete them if the archive was successful.
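
    Roughly the shape I have in mind for #2, as an untested sketch only (the 7-Zip location, archive path, and 2-day cutoff here are just placeholders):

    # Step 1: collect the files to act on.
    $cutoff = (Get-Date).AddDays(-2)
    $files  = Get-ChildItem D:\saswork\*\folder1 -Recurse |
        Where-Object { -not $_.PSIsContainer -and $_.LastWriteTime -lt $cutoff }

    # Step 2: archive them with 7-Zip, and delete only if the archive succeeded.
    & "$env:ProgramFiles\7-Zip\7z.exe" a "D:\archive\saswork.7z" ($files | ForEach-Object { $_.FullName })
    if ($LASTEXITCODE -eq 0) {
        $files | Remove-Item -Force
    }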

    See attached Get-Files.ps1 script (function). If this gives you the list of files you want, then you can just pipe the results to Remove-Item.

    I think the syntax (untested) would be:

    Get-Files -path D:\saswork\*\folder1 -age 2 -recurse | Remove-Item

    although this may not delete empty folders. Try it and see.
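
    If the attached function isn't handy, a rough equivalent of that pipeline using only built-in cmdlets (untested; -age 2 just becomes a LastWriteTime filter) would be:

    $cutoff = (Get-Date).AddDays(-2)
    Get-ChildItem D:\saswork\*\folder1 -Recurse |
        Where-Object { $_.LastWriteTime -lt $cutoff } |
        Remove-Item -Recurse -Force

    # Folders left empty by the delete can be swept in a second pass (one level per run):
    Get-ChildItem D:\saswork\*\folder1 -Recurse |
        Where-Object { $_.PSIsContainer -and -not (Get-ChildItem $_.FullName) } |
        Remove-Item -Force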

    See also viewtopic.php?f=2&t=1402.

    (BTW, I'm primarily a SAS programmer. This looks very much like the SAS "cleanwork" processing. If so, I've got a cleanwork script that may be of interest. PM me for more details).

    HTH...

    by scottbass at 2013-03-05 23:14:39

    See viewtopic.php?f=2&t=1455, perhaps it will help?
