Question on a script that verifies file integrity in a folder

This topic contains 6 replies, has 4 voices, and was last updated by Syl 2 weeks, 1 day ago.

  • #97744

    Syl
    Participant

    Hi,

    For security purposes, I made two simple scripts that get the file hash of every file in a given folder and store the results in an XML file. The main line of the first script is:

    Get-ChildItem -Path $FoldersPath -Recurse -File | Get-FileHash -Algorithm SHA1 | Export-Clixml -Path "$pathtoXML\$productName.xml"

    It loads all files from the supplied parent folder and its subfolders, creates a SHA1 hash of each, and exports the result as XML to a secured folder.

    The second script loads the XML and verifies the hash of every file in the selected folder. The main line being:

    Import-Clixml -Path "$pathtoXML\$productName.xml" | Where-Object { $_.Hash -ne (Get-FileHash -Path $_.Path -Algorithm $_.Algorithm).Hash } | Get-Item

    With these two tools we can verify that no files were tampered with, but... they don't verify that no files were added to or deleted from that folder. I'm trying to find a simple way to solve that, but I'm a bit stuck. Should I hash the parent folder and put that in a separate XML, and then, when we do the verification, check the parent folder hash first before verifying the individual file hashes? Is there a better solution?

    Thanks

  • #97750

    Aurimas N.
    Participant

    Hi,
    How about checking the root folder's LastWriteTime property, storing it somewhere, and comparing the value on the next script run?
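
    A rough sketch of what I mean (the metadata file name here is just an example):

    # Store the folder's LastWriteTime when you create the hash file
    (Get-Item -Path $FoldersPath).LastWriteTime | Export-Clixml -Path "$pathtoXML\$productName-lastwrite.xml"

    # On the next run, compare the stored value with the current one
    $stored = Import-Clixml -Path "$pathtoXML\$productName-lastwrite.xml"
    $current = (Get-Item -Path $FoldersPath).LastWriteTime
    if ($current -ne $stored) { Write-Warning "Root folder changed since the last check." }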

  • #97753

    Syl
    Participant

    It's an idea, but not enough in our case. We have to prove that the files in there are the ones we copied and hashed (one of the WebTrust 2.0 requirements). An attribute like LastWriteTime is easy to fake.

  • #97764

    Joel Sallow
    Participant

    The file hash objects that you're exporting and importing also store the path of the file that was hashed, under their .Path property.

    So put them in a list and compare the two lists with Compare-Object:

    $CurrentFileList = Get-ChildItem -Path $FoldersPath | Select-Object -ExpandProperty FullName
    $StoredFileList = Import-CliXml -Path (Join-Path $pathtoXML -ChildPath "$productName.xml") | Select-Object -ExpandProperty Path
    Compare-Object -ReferenceObject $StoredFileList -DifferenceObject $CurrentFileList
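
    In the output, the SideIndicator column tells you which side each difference came from: => marks a path that exists on disk but not in the stored XML (an added file), and <= marks a stored path that no longer exists on disk (a deleted file).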
  • #97785

    Syl
    Participant

    Brilliant, thanks!

    • #97821

      Gnart
      Participant

      Don't know if it would help your cause, but if you have a baseline number of files and folders for each run, a quick check for additions or deletions is to count the items during your process and then compare the count against the baseline or history.

      $NumberOf = Get-ChildItem ...
      $NumberOf.Count # has the count of files and folders from the Get-ChildItem run
  • #97851

    Syl
    Participant

    Hi,

    That method would not be precise enough for our needs, since someone could fool such a check by deleting a valid file and creating a dummy to keep the count the same. The hashing would detect it only if the dummy file was created with the same filename.

    But Joel's suggestion worked well. I adapted it a bit (made it recursive and excluded folders, since those are not in the hash XML file) and it works perfectly 🙂
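
    A sketch of that adaptation (same variable names as above; -File keeps directories out of the listing):

    $CurrentFileList = Get-ChildItem -Path $FoldersPath -Recurse -File | Select-Object -ExpandProperty FullName
    $StoredFileList = Import-Clixml -Path (Join-Path $pathtoXML -ChildPath "$productName.xml") | Select-Object -ExpandProperty Path
    Compare-Object -ReferenceObject $StoredFileList -DifferenceObject $CurrentFileList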
