Author Posts

August 23, 2017 at 11:51 am

Hi there,

I have a PS script which does a great job of finding log files more than 7 days old and spitting them out to another disk as zip files. The only problem is that it zips each log file older than 7 days into its own individual zip, rather than what I want: a single zip file per parent folder, named after that folder, containing all of its files older than 7 days.

For example in C:\inetpub\logs\LogFiles I have the following folder structure:

C:\inetpub\logs\LogFiles\W3SVC1\logs.txt, logs2.txt, etc.
C:\inetpub\logs\LogFiles\W3SVC2\logs.txt, logs2.txt, etc.

Both contain lots of txt log files.

I want to look at these txt files, zip those older than 7 days, and output a single zip file for each W3SVC folder, named after that folder, containing its log files older than 7 days.

This is the script I have so far but I'm not sure how to change the logic so that it does the above:

-----

#usage: New-Zip c:\demo\myzip.zip
function New-Zip
{
    param([string]$zipfilename)
    Set-Content $zipfilename ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
    (Get-Item $zipfilename).IsReadOnly = $false
}

#usage: dir c:\demo\files\*.* -Recurse | Add-Zip c:\demo\myzip.zip
function Add-Zip
{
    param([string]$zipfilename)

    if (-not (Test-Path $zipfilename))
    {
        Set-Content $zipfilename ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
        (Get-Item $zipfilename).IsReadOnly = $false
    }

    $shellApplication = New-Object -ComObject Shell.Application
    $zipPackage = $shellApplication.NameSpace($zipfilename)

    foreach ($file in $input)
    {
        $zipPackage.CopyHere($file.FullName)
        Start-Sleep -Milliseconds 500
    }
}

$targetFolder = 'C:\inetpub\logs\Test\'
$destinationFolder = 'D:\TestLogZip\'
$now = Get-Date
$days = 7
$lastWrite = $now.AddDays(-$days)

Get-ChildItem $targetFolder -Recurse | Where-Object { $_ -is [System.IO.FileInfo] } | ForEach-Object {
    if ($_.LastWriteTime -lt $lastWrite)
    {
        $_ | New-Zip ($destinationFolder + $_.BaseName + ".zip")
        $_ | Add-Zip ($destinationFolder + $_.BaseName + ".zip")
    }
}

-----

Any help you can give would be greatly appreciated 🙂

Thanks

Sam

August 24, 2017 at 8:01 am

There is a Compress-Archive cmdlet in newer versions of PowerShell.

Unfortunately it does not let you control the folder structure inside the archive when you pass it individual files, so I can suggest that you move the files which need to be archived into a separate folder (recreating the parent folder structure) and compress that into a zip by any of the available methods.

An alternative is to use .NET directly via System.IO.Compression.ZipArchive, or Ionic.Zip.ZipFile (I prefer https://github.com/haf/DotNetZip.Semverd)
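A rough sketch of that staging-folder approach (just an illustration, assuming PowerShell 5.0+ for Compress-Archive; the source and destination paths are taken from the original post):

```powershell
# Sketch: stage files older than 7 days into per-parent-folder subdirectories,
# then compress each staged folder into a zip named after it.
$source      = 'C:\inetpub\logs\LogFiles'
$destination = 'D:\TestLogZip'
$cutoff      = (Get-Date).AddDays(-7)
$staging     = Join-Path $env:TEMP 'LogZipStaging'

Get-ChildItem -Path $source -Recurse -File |
    Where-Object { $_.LastWriteTime -lt $cutoff } |
    ForEach-Object {
        # Recreate the parent folder (e.g. W3SVC1) under the staging root
        $target = Join-Path $staging $_.Directory.Name
        if (-not (Test-Path $target)) { New-Item $target -ItemType Directory | Out-Null }
        Copy-Item $_.FullName -Destination $target
    }

# One zip per staged parent folder, named after that folder
Get-ChildItem $staging -Directory | ForEach-Object {
    Compress-Archive -Path $_.FullName -DestinationPath (Join-Path $destination "$($_.Name).zip")
}
Remove-Item $staging -Recurse -Force
```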

August 24, 2017 at 11:25 am

You can use .NET's System.IO.Compression.FileSystem to create a zip file from a directory, which keeps the structure intact.

# Copy or move all of your logs to a temporary folder, e.g. c:\temp\ziptheselogs\

Add-Type -Assembly "System.IO.Compression.FileSystem"
[IO.Compression.ZipFile]::CreateFromDirectory("c:\temp\ziptheselogs", "c:\archive\archivedlogs.zip")

# Delete your temporary folder afterwards, e.g.:
# Remove-Item "c:\temp\ziptheselogs" -Recurse -Force

August 24, 2017 at 11:45 am

Looking at this https://serverfault.com/questions/456095/zipping-only-files-using-powershell
and this https://stackoverflow.com/questions/17829785/delete-files-older-than-15-days-using-powershell

I came up with this. This should zip all files in a folder older than 7 days and then remove them.
$limit = (get-date).AddDays(-7)
$srcdir = "c:\test"
$zipFilename = "test.zip"
$zipFilepath = "c:\temp\"
$zipFile = "$zipFilepath$zipFilename"

#Prepare zip file
if(-not (test-path($zipFile))) {
set-content $zipFile ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
(dir $zipFile).IsReadOnly = $false
}

$shellApplication = new-object -com shell.application
$zipPackage = $shellApplication.NameSpace($zipFile)
$files = Get-ChildItem -Path $srcdir | where{! $_.PSIsContainer -and $_.CreationTime -lt $limit}

foreach($file in $files) {
$zipPackage.CopyHere($file.FullName)
#using this method, sometimes files can be 'skipped'
#this 'while' loop checks each file is added before moving to the next
while($zipPackage.Items().Item($file.name) -eq $null){
Start-sleep -seconds 1
}
}

Get-ChildItem -Path $srcdir -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $limit } | Remove-Item -Force

August 25, 2017 at 10:33 am

Thanks Curtis, that works on a basic level, but it doesn't have the logic to find files older than 7 days or to name the output zip after the parent dir.

August 25, 2017 at 10:38 am

Thanks Simon, that's very close to what I need, but it doesn't retain the parent folder name and create a zip with that name. I'm currently toying with your version and mine to see if I can cobble them together and use the parent folder name for the resulting zip filename. Much appreciated input!

August 25, 2017 at 10:41 am

Ah, no. The resulting zip file is empty when I use this:

$limit = (get-date).AddDays(-7)
$srcdir = "C:\inetpub\logs\Test\"
$zipFilename = "test.zip"
$zipFilepath = "D:\TestLogZip\"
$zipFile = "$zipFilepath$zipFilename"

#Prepare zip file
if(-not (test-path($zipFile))) {
set-content $zipFile ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
(dir $zipFile).IsReadOnly = $false
}

$shellApplication = new-object -com shell.application
$zipPackage = $shellApplication.NameSpace($zipFile)
$files = Get-ChildItem -Path $srcdir | where{! $_.PSIsContainer -and $_.CreationTime -lt $limit}

foreach($file in $files) {
$zipPackage.CopyHere($file.FullName)
#using this method, sometimes files can be 'skipped'
#this 'while' loop checks each file is added before moving to the next
while($zipPackage.Items().Item($file.name) -eq $null){
Start-sleep -seconds 1
}
}

Get-ChildItem -Path $srcdir -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $limit } | Remove-Item -Force

August 25, 2017 at 10:55 am

you have

$srcdir = "C:\inetpub\logs\Test\"

try

$srcdir = "C:\inetpub\logs\Test"

I will look at getting the folder name as the zip name.

August 25, 2017 at 10:59 am

screenshot

Sadly that's not working either 🙁

August 25, 2017 at 11:04 am

OK, try this. It should create a zip file with the source directory name and today's date as its name.

$a = Get-Date -format "ddMMyyyy"
$limit = (get-date).AddDays(-7)
$srcdir = "c:\test"
$name=[System.IO.Path]::GetFileName($srcdir)
$zipFilename = $name + $a + ".zip"
$zipFilepath = "c:\temp\"
$zipFile = "$zipFilepath$zipFilename"

#Prepare zip file
if(-not (test-path($zipFile))) {
set-content $zipFile ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
(dir $zipFile).IsReadOnly = $false
}

$shellApplication = new-object -com shell.application
$zipPackage = $shellApplication.NameSpace($zipFile)
$files = Get-ChildItem -Path $srcdir | where{! $_.PSIsContainer -and $_.CreationTime -lt $limit}

foreach($file in $files) {
$zipPackage.CopyHere($file.FullName)
#using this method, sometimes files can be 'skipped'
#this 'while' loop checks each file is added before moving to the next
while($zipPackage.Items().Item($file.name) -eq $null){
Start-sleep -seconds 1
}
}

Get-ChildItem -Path $srcdir -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $limit } | Remove-Item -Force

August 25, 2017 at 11:16 am

Thanks, it still creates a corrupt zip unfortunately. I've tried adding/removing the slashes at the ends of both srcdir and zipFilepath, but that doesn't seem to help 🙁

August 25, 2017 at 11:28 am

Strange, it works fine on my machine. I have created and restored the resulting zip several times. Do you get any errors?
Just in case it makes any difference, I am running on Windows 7 (I know.......)

PS M:\> $PSVersionTable.PSVersion

Major Minor Build Revision
----- ----- ----- --------
5 0 10586 117

I end up with a zip file called test25082017.zip that I can open and restore all of the content from.

August 25, 2017 at 11:31 am

That's odd. I'm using this on Windows Server 2012:

PS C:\windows\system32> $PSVersionTable.PSVersion

Major Minor Build Revision
----- ----- ----- --------
4 0 -1 -1

August 25, 2017 at 11:32 am

Just a thought: you are using $srcdir = "C:\inetpub\logs\Test". Do you have permission to that dir? Try moving the contents to a test or temp dir and pointing the script at that.

Also, I ran it from the ISE.

August 25, 2017 at 11:35 am

Thanks, it was something I hadn't thought about, but I still get the same result despite moving srcdir to a dir I have full control over 🙁

August 25, 2017 at 11:46 am

Hmm, I am at a bit of a loss then, as it works fine here 🙁 What exactly do you end up with, i.e. an empty zip file? Another idea is to comment out the delete part and change $limit = (get-date).AddDays(-7) to $limit = (get-date) so it sees today's files, as it may be some strange timestamp thing on your files.

August 25, 2017 at 12:56 pm

The only way I can reproduce the error you get is if $files is empty, i.e. there aren't any files older than 7 days in the directory. After I ran the script again in the ISE window, I typed $files and it listed the files older than 7 days, and $files.count told me how many.
PS M:\> $files

Directory: C:\test

LastWriteTime    Length Name
-------------    ------ ----
01/08/2017 15:12 434 01-12-2017results.csv
14/08/2017 15:40 664 14-40-2017results.csv
08/08/2017 13:15 194 1lines.csv
08/08/2017 12:53 194 2lines.csv
16/08/2017 08:29 441 basetext.txt
16/08/2017 08:32 1520 converted.xml
04/08/2017 09:59 10 decripted.txt
17/08/2017 09:43 4981040 help.txt
14/08/2017 15:40 387385 MSRC_CVEs2017-Jul.csv
14/08/2017 15:40 1618222 MSRC_CVEs2017-Jul.html
10/08/2017 12:15 66 name.csv
02/08/2017 14:39 16856 power.txt
04/08/2017 09:57 62 secure.txt
17/08/2017 12:33 10 servernames.txt
10/08/2017 12:16 20 test2.csv
14/08/2017 13:01 10782 testupdates.xlsx
04/08/2017 10:35 18 tocopy.txt
08/08/2017 13:25 214 withdate.csv

PS M:\> $files.count
18

PS M:\>

If I then run it again (without restoring the files) I get this when testing $files:
PS M:\> C:\Powershell Scripts\folercleanup.ps1

PS M:\> $files

PS M:\> $files.count
0

PS M:\>

Which results in an empty zip file like you have. Can you check that $files has something in it?

August 25, 2017 at 1:09 pm

Screenshot

Yes it does contain files in the last 7 days.

This is weird; it must be a difference between the PS versions and possibly the OS?

Thanks again for your help!

August 25, 2017 at 1:18 pm

Maybe someone else here has an idea, as I am out of them and don't have a 2012 server to test on. Maybe you could try it on another machine or operating system. Anyway, good luck, and let me know how you get on 🙂

August 25, 2017 at 1:49 pm

Based on your original post, you already have that logic.

"I have a PS script which does a great job of finding log files more than 7 days old and spitting them out to another disk"

August 25, 2017 at 2:35 pm

Yeah, it's naming the zip file with the name of the parent dir that's the issue.

My current script finds all the files in the folders older than 7 days and then spits them out as one zip file per txt file, named the same as the txt file. That's useless unless you can trace the logs back to the IIS worker process they came from (the name of the W3SVC folder is crucial for this in a multi-tenanted IIS environment), so this is the hurdle I'm currently facing.
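In rough terms, the grouping I'm after looks like this (just a sketch to illustrate the bucketing, not working archive code):

```powershell
# Sketch: bucket log files older than 7 days by their parent folder name
# (W3SVC1, W3SVC2, ...) so each bucket can become one zip named after the folder.
$cutoff = (Get-Date).AddDays(-7)

Get-ChildItem 'C:\inetpub\logs\LogFiles' -Recurse -File |
    Where-Object { $_.LastWriteTime -lt $cutoff } |
    Group-Object { $_.Directory.Name } |
    ForEach-Object {
        # $_.Name is the parent folder (the zip name I want);
        # $_.Group holds the FileInfo objects that belong in that zip.
        '{0}.zip would contain {1} file(s)' -f $_.Name, $_.Count
    }
```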

August 29, 2017 at 12:09 pm

August 29, 2017 at 1:49 pm

Hi Sam,

How is this?


function New-LogZip {
    [CmdletBinding()]
    param (
        # Root path of logs
        [Parameter(Mandatory = $true)]
        [string]
        $Path,

        [Parameter(Mandatory = $true)]
        [string]
        $Destination
    )

    # Find logs over 7 days old
    $Logs = Get-ChildItem -Path $Path -File -Recurse | Where-Object {
        $_.LastWriteTimeUtc -lt [DateTime]::UtcNow.AddDays(-7)
    }
    # Cache parent and paths
    $ParentFolder = @{}
    $Logs | ForEach-Object {
        $ParentName = $_.Directory.Name
        Write-Verbose $ParentName
        if(-not $ParentFolder.ContainsKey($ParentName)){
            $ParentFolder[$ParentName] = @($_.FullName)
        }
        else {
            $ParentFolder[$ParentName] += $_.FullName
        }
    }

    $WorkingPath = Join-Path -Path $env:TEMP -ChildPath (New-Guid).Guid
    New-Item -Path $WorkingPath -ItemType Directory -ErrorAction Stop | Out-Null

    try {
        # Create temp location with parent
        $ParentFolder.GetEnumerator() | ForEach-Object {
            $NewPath = Join-Path -Path $WorkingPath -ChildPath $_.Key
            New-Item -Path $NewPath -ItemType Directory | Out-Null
            $_.Value | ForEach-Object {
                Copy-Item -Path $_ -Destination $NewPath
            }

            # Archive parentfolder
            $ZipLocation = Join-Path -Path $Destination -ChildPath "$($_.Key).zip"
            Compress-Archive -Path $NewPath -DestinationPath $ZipLocation -ErrorAction Stop
            Get-Item -Path $ZipLocation
        }

    } finally {
        Remove-Item -Path $WorkingPath -Recurse -Force
    }
}
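Called with the paths mentioned earlier in the thread, the invocation would be something along these lines (paths assumed from your first post):

```powershell
# Hypothetical invocation, using the source and destination paths
# Sam mentioned earlier in the thread.
New-LogZip -Path 'C:\inetpub\logs\LogFiles' -Destination 'D:\TestLogZip' -Verbose
```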

August 29, 2017 at 2:10 pm

We have a winner (I think!)

Ran this against a small chunk of log files and it seemed to work.
Now I've aimed it at the entire LogFiles dir and it's chomping through them currently.

I'll let you know if it fails.

Thanks Max!