Find LastWriteTime and LastAccessTime for all subfolders


  • #93895

    Tracy Olsen
    Participant

    With my recent promotion I've inherited the job of updating older scripts (like my recent question about logging) to make them better, so to speak.

    I have a script that will check the LastWriteTime and LastAccessTime of a single folder and all contents.

    $foldername = Read-Host "Please enter a folder Name"

    # Newest LastWriteTime among all files under the folder (directories excluded)
    $LWT = Get-ChildItem -Recurse "path\$foldername" |
        Where-Object { $_.GetType() -ne [System.IO.DirectoryInfo] } |
        Sort-Object -Property LastWriteTime |
        Select-Object -Last 1 LastWriteTime

    # Newest LastAccessTime among all files under the folder (directories excluded)
    $LAT = Get-ChildItem -Recurse "path\$foldername" |
        Where-Object { $_.GetType() -ne [System.IO.DirectoryInfo] } |
        Sort-Object -Property LastAccessTime |
        Select-Object -Last 1 LastAccessTime

    Write-Host "($LWT) ----- ($LAT)"
    

    I need to see if I can change this to look at all the folders in the project directory, to save me and my co-worker a few hours of entering one folder at a time with the current script.

    The problem I've run into is an error saying it can't find a file name in one of the folders. Digging into this, it's not that it can't find the file; the file name is too long. Unfortunately I don't have that error available to share.

    Here's what I've tried so far –

    Clear-Host
    #$foldername=Read-Host "Please enter a folder Name"
    $Desktop = [Environment]::GetFolderPath("Desktop")
    $path = "folder path"
    $skip = "path to root folders in the directory that gave errors"

    Try {
        Get-ChildItem -Path $path -Exclude $skip -Recurse -ErrorAction SilentlyContinue |
            Where-Object { $_.GetType() -ne [System.IO.DirectoryInfo] } |
            Sort-Object -Property LastWriteTime, LastAccessTime |
            Select-Object -Last 1 LastWriteTime, LastAccessTime |
            Export-Csv "$Desktop\Folder_Access_list.csv"
    }
    Catch {
        "Get-ChildItem"
    }
    Finally {
        Out-File "C:\Folder Archive\folder_use_errors.txt"
    }
    

    I've had that script running for an hour now on a test folder with 7 folders; each of those has 3 or 4 subfolders, those have a few subfolders of their own, and each of those has 10-15 documents. The total size of all of this is just under a gig of data. I've had the single-folder script run on a couple gigs of data with hundreds more files/folders and complete in a few minutes, give or take. This leads me to believe my script isn't working right.

    The actual project folder I'll be using this on has about 4 TB of data, almost 600 root folders, and countless thousands of subfolders. I plan to let it run all weekend next weekend, if I can get it to work.

    Ideas?

  • #94014

    Don Jones
    Keymaster

    Yeah, overly long filenames remain a problem in .NET, some 20 years after long filenames became normalized in Windows. There's not a ton you can do about it if it's _just_ the filename. If it's _the entire file path_, you can try mapping a PSDrive to the containing folder to reduce the path size.
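
    For example, a minimal sketch of that PSDrive approach (the drive name and root path below are placeholders, not paths from the original script):

    # Map a short drive name to the containing folder; the idea is to reduce
    # the path length PowerShell has to work with for the files underneath it.
    New-PSDrive -Name Deep -PSProvider FileSystem -Root "\\server\share\very\deep\project\folder"

    # Run the same file query against the shorter Deep:\ path.
    Get-ChildItem -Path Deep:\ -Recurse |
        Where-Object { $_.GetType() -ne [System.IO.DirectoryInfo] } |
        Sort-Object -Property LastWriteTime |
        Select-Object -Last 1 LastWriteTime, LastAccessTime

    # Remove the mapping when finished.
    Remove-PSDrive -Name Deep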
