Logging to one CSV from many separate scripts

This topic contains 4 replies, has 3 voices, and was last updated 1 month ago.

  • #178035

    Participant
    Topics: 1
    Replies: 2
    Points: 15
    Rank: Member

    I have a somewhat unique problem I'm trying to solve, so here goes.

    We have an initiative to move many of our web servers off domain. I know, the fact that they weren't set up in a DMZ is horrible, but here we are, many years after they were initially set up, fixing the glitch. I need a solution that moves files from off-domain shares to on-domain shares. Before you ask why we aren't using something like Graylog: our logs are not kept in the Event Viewer, so I needed something built to move files of any kind from a source to a destination. We're using service accounts with access locked down to only the directories they copy to and from, and the scheduled task runs as another service account as well. All that being said, here is the solution:

    # Decrypt the stored credential for the off-domain service account
    $UserName = 'ServiceAccount'
    $Password = Get-Content C:\Cred.txt | ConvertTo-SecureString
    $ServiceCred = New-Object System.Management.Automation.PSCredential($UserName, $Password)
    $SourceIP = 'x.x.x.x'
    $SourceShare = 'SourceShareName'
    $DestPath = 'C:\Logs\SourceShareName'
    
    # Map the off-domain share using the service account credential
    New-PSDrive -Name $SourceShare -PSProvider FileSystem -Root "\\$SourceIP\$SourceShare" -Credential $ServiceCred
    
    # Copy anything modified in the last 14 days to the on-domain destination
    $Files = Get-ChildItem -Path "\\$SourceIP\$SourceShare\*" | Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-14) }
    Copy-Item $Files -Destination $DestPath -Recurse -Verbose
    
    # Clean up anything on the source older than 30 days
    $DaysBack = -30
    $CurrentDate = Get-Date
    $DateToDelete = $CurrentDate.AddDays($DaysBack)
    
    Get-ChildItem -Path "\\$SourceIP\$SourceShare\*" | Where-Object { $_.LastWriteTime -lt $DateToDelete } | Remove-Item -Recurse -Verbose

    This runs as a scheduled task on a dedicated log collection server and runs every minute.  The solution works fine, but I'm trying to add a specific kind of logging.  I need a CSV that gets a new line appended each time the script runs, with a timestamp and a list of what was copied, so that when a dev looks at it they know exactly when the script last ran.  Logging what was copied, not just that the script ran, gives the devs a solid source of truth they can trust.  The scheduled task might report success, but I need a log file with proof of what ran, when, and what was copied.  I wouldn't be against a separate log for each script, but I'm not sure which direction to go in with this.
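
    To make that concrete, a row in the log might look something like this (the column names are just a first guess on my part):

    "TimeStamp","SourceShare","FilesCopied"
    "2019-01-01 00:01:00","SourceShareName","app1.log;app2.log"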

    I've considered a few different options, but I wanted to ask here first to see what the most logical solution might be.

  • #178059

    Participant
    Topics: 1
    Replies: 1552
    Points: 2,700
    Helping Hand
    Rank: Community Hero

    I've considered a few different options, but I wanted to ask here first to see what the most logical solution might be.

    You could give us the options to choose from, if you already have some in mind.

    I'd think the options are limited with the requirements you gave. 😉 (Export-Csv -Append)
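
    Off the top of my head, something along these lines; the path and property names here are only examples:

    # Append one row per run - adjust the properties to whatever you need to prove
    [PSCustomObject]@{
        RunTime = Get-Date
        Copied  = 'file1.log;file2.log'
    } | Export-Csv -Path 'C:\Logs\RunLog.csv' -Append -NoTypeInformation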

    • #178065

      Participant
      Topics: 1
      Replies: 2
      Points: 15
      Rank: Member

      Sorry, wrote this in a hurry this morning.

      Export-Csv -Append will work, but I guess my question is more about whether this is the best path forward.  We will potentially have over 100 tasks running every minute to cull these logs.  With that many scripts trying to write to the same file, I'm wondering whether lines will get clobbered by other processes accessing the log file.  Because of that, I'm not sure whether I should go with an individual file for each script or one file that all of these scripts write to.

  • #178071

    Participant
    Topics: 8
    Replies: 1213
    Points: 756
    Helping Hand
    Rank: Major Contributor

    There are multiple ways to handle log roll-ups. What you do not want is multiple threads trying to manipulate the same file; that is what database engines are for. It depends on how important the logging is, but there are a couple of ways to do it:

    • Log locally on the individual servers. I would log as XML so that you can do date queries without having to do conversions (CSV is all strings). The script can append to the XML, so locally you would still maintain a rolling log. To consolidate, you can schedule a task on a server that can reach each of the individual servers and loop through them, connecting to C$ or a share you set up on each one to collect all the data and append it to a parent CSV.
    • Log remotely to a share on a parent server. You'll want to create unique file names like COMPUTER_DATESTRING.log and then have a scheduled task append all of the CSV files into a single CSV (rough sketch below).
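
    A rough sketch of the second approach, assuming all of the servers write into one share on the parent; the server name, paths, and property names below are only placeholders:

    # On each individual server: append to a uniquely named CSV on the parent share
    $LogName = '{0}_{1}.csv' -f $env:COMPUTERNAME, (Get-Date -Format 'yyyyMMdd')
    [PSCustomObject]@{
        TimeStamp   = Get-Date
        Server      = $env:COMPUTERNAME
        FilesCopied = ($Files.Name -join ';')   # $Files from the copy script above
    } | Export-Csv -Path "\\LogServer\Logs\$LogName" -Append -NoTypeInformation

    # On the parent server: a scheduled task rolls everything up into a single CSV
    Get-ChildItem -Path '\\LogServer\Logs\*.csv' |
        ForEach-Object { Import-Csv -Path $_.FullName } |
        Export-Csv -Path '\\LogServer\Logs\Combined\AllServers.csv' -NoTypeInformation
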
  • #178074

    Participant
    Topics: 1
    Replies: 2
    Points: 15
    Rank: Member

    I've decided on a solution, based on some of the feedback given here.  I'm going to move forward with a separate log in each share location to mitigate issues with accessing one file from hundreds of scripts at the same time.  Thank you all for the input!
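
    In case it helps anyone who finds this later, the logging block I'm planning to bolt onto the end of each copy script looks roughly like this (the file name is just an example):

    # Append one row per run to a log that lives alongside that script's destination
    $LogFile = Join-Path $DestPath 'CopyLog.csv'   # e.g. C:\Logs\SourceShareName\CopyLog.csv
    [PSCustomObject]@{
        TimeStamp   = (Get-Date).ToString('yyyy-MM-dd HH:mm:ss')
        SourceShare = $SourceShare
        FileCount   = @($Files).Count
        FilesCopied = ($Files.Name -join ';')
    } | Export-Csv -Path $LogFile -Append -NoTypeInformation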
