Simple Get-ChildItem script to write into file stuck


This topic contains 4 replies, has 3 voices, and was last updated 4 days, 17 hours ago.

  • #133707

    Participant
    Points: 37
    Rank: Member

    Hello everyone,

    I have a really simple script that should get folders in a specific folder that are older than N months.
    I wanted to log the content of each folder that is going to be deleted into a file.
    The problem is that if the folder's content is really big (more than half a million files and folders), the script creates the log file, writes the first line "Folders and files deleted on SomeDate" and then gets stuck.

    Is there any limitation on getting content and writing it into a file? Or is there anything else I am missing?
    Below is my script; I don't think there is an error in it.

    # Delete folders older than 4 months, logging their contents first
    $FoldersOlderThan = -4
    $CurrentDate = Get-Date -Format MM-dd-yyyy
    $FoldersToDelete = Get-ChildItem -Path X:\TargetFolder\ | Where-Object {$_.CreationTime -lt (Get-Date).AddMonths($FoldersOlderThan)} | Select-Object -ExpandProperty Name
    
    foreach ($FolderToDelete in $FoldersToDelete) {
        # One log file per deleted folder
        Write-Output "Folders and files deleted on $CurrentDate" | Out-File X:\RemovedFoldersLog\$CurrentDate-$FolderToDelete.txt -Append
        # List every item in the folder, sorted by creation time, and append it to the log
        Get-ChildItem -Path X:\TargetFolder\$FolderToDelete -Recurse | Select-Object Name, CreationTime | Sort-Object -Property CreationTime | Out-File X:\RemovedFoldersLog\$CurrentDate-$FolderToDelete.txt -Append
        Get-Item -Path X:\TargetFolder\$FolderToDelete | Remove-Item -Recurse -Force
    }
    

    Thank you for any help or reply

  • #133763

    Participant
    Points: 211
    Helping Hand
    Rank: Participant

    Hi Filip, it looks like the resources on the box you are using are not sufficient to handle this task all at once, and the task also depends on I/O and network throughput (assuming X:\ is a remote drive). What I suggest is to split the task into multiple jobs by day/hour and keep the logging minimal or done in bulk; don't write a log entry for each and every file/folder.
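
    For example, a rough sketch of the "minimal logging" idea: record only each top-level folder's name and creation time rather than its full contents. The single log file name and the -Directory switch (PowerShell 3.0+) are assumptions here, not from the original script; the paths and the 4-month cut-off are.

    # Sketch: one summary line per deleted top-level folder instead of one line per file.
    # Assumes PowerShell 3.0+ for -Directory; log file name is an assumption.
    $FoldersOlderThan = -4
    $CurrentDate = Get-Date -Format MM-dd-yyyy
    $LogFile = "X:\RemovedFoldersLog\$CurrentDate.txt"
    
    "Folders and files deleted on $CurrentDate" | Out-File $LogFile
    
    Get-ChildItem -Path X:\TargetFolder\ -Directory |
        Where-Object { $_.CreationTime -lt (Get-Date).AddMonths($FoldersOlderThan) } |
        ForEach-Object {
            # Log only the folder name and creation time, then delete it
            "{0}  (created {1})" -f $_.Name, $_.CreationTime | Out-File $LogFile -Append
            Remove-Item -Path $_.FullName -Recurse -Force
        }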

    Thank you.

  • #133772

    Participant
    Points: 130
    Helping Hand
    Rank: Participant

    try this:

    $FoldersOlderThan = -4
    $CurrentDate = Get-Date -Format MM-dd-yyyy
    $FoldersToDelete = Get-ChildItem -Path X:\TargetFolder\ | Where-Object {$_.CreationTime -lt (Get-Date).AddMonths($FoldersOlderThan)} | Select-Object -ExpandProperty Name
    
    # Collect all log text in memory, delete folders as we go, write the log once at the end
    $FolderList = @()
    foreach ($FolderToDelete in $FoldersToDelete) {
        $FolderList += "Folders and files deleted on $CurrentDate"
        $FolderList += Get-ChildItem -Path X:\TargetFolder\$FolderToDelete -Recurse | Select-Object Name, CreationTime | Sort-Object -Property CreationTime | Out-String
        Get-Item -Path X:\TargetFolder\$FolderToDelete | Remove-Item -Recurse -Force
    }
    # Write everything to a single combined log file in one go
    $FolderList | Out-File X:\RemovedFoldersLog\$CurrentDate.txt -Append
    

    This uses a single I/O operation to write the entire log file at the end, instead of numerous writes during deletion.
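
    A small variation on the same idea (an assumption on my part, not something from this thread): capturing the loop's output directly avoids re-copying the array on every +=. Everything is still held in memory until the single Out-File at the end, and each folder's listing still has to be buffered for the sort.

    # Sketch: let PowerShell collect the foreach output instead of growing an array with +=
    $FolderList = foreach ($FolderToDelete in $FoldersToDelete) {
        "Folders and files deleted on $CurrentDate"
        Get-ChildItem -Path X:\TargetFolder\$FolderToDelete -Recurse |
            Select-Object Name, CreationTime |
            Sort-Object -Property CreationTime |
            Out-String
        Get-Item -Path X:\TargetFolder\$FolderToDelete | Remove-Item -Recurse -Force
    }
    $FolderList | Out-File X:\RemovedFoldersLog\$CurrentDate.txt -Append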

    • #133787

      Participant
      Points: 211
      Helping Hand
      Rank: Participant

      Yes, this will reduce the I/O overhead to an extent and should give faster execution.

  • #134021

    Participant
    Points: 37
    Rank: Member

    Guys, thank you very much for your help. Unfortunately, neither works the way you have proposed.
    I guess the problem is that the folders contain too much data to gather and write into a txt file. (I let the count finish for one folder and at the end there were over 1 million files... in just one folder. And there are currently about 10 folders like this.) Just for info, this is a local drive; I only changed the drive letters and names for forum purposes.

    I am going to log only the high-level folder structure.

    Thank you

    UPDATE: I let the modified script run overnight. I cannot say whether it was already stuck or still gathering data, but the PowerShell process was using about 3 GB of memory.
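
    If the full per-file listing is ever needed again, a streaming sketch (not from this thread, and reusing $FoldersToDelete and $CurrentDate from the scripts above) would write each entry to disk as it is enumerated, so memory stays flat; the trade-off is dropping the sort by CreationTime, which would require buffering the whole listing.

    # Sketch: stream each entry to the log as it is enumerated instead of buffering.
    # Requires PowerShell 5.0+ for the ::new() constructor syntax.
    foreach ($FolderToDelete in $FoldersToDelete) {
        $logPath = "X:\RemovedFoldersLog\$CurrentDate-$FolderToDelete.txt"
        $writer = [System.IO.StreamWriter]::new($logPath, $true)   # $true = append
        try {
            $writer.WriteLine("Folders and files deleted on $CurrentDate")
            Get-ChildItem -Path X:\TargetFolder\$FolderToDelete -Recurse |
                ForEach-Object { $writer.WriteLine("{0}`t{1}" -f $_.CreationTime, $_.Name) }
        }
        finally {
            $writer.Dispose()
        }
        Remove-Item -Path X:\TargetFolder\$FolderToDelete -Recurse -Force
    }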
