
August 6, 2013 at 1:22 am


I wonder if anyone can help me? I've got to write a script that monitors a folder for files containing error messages and sends an email if one is found. I can do the email and search parts, but I've come up against a snag: I was going to run the script every 10 minutes or so (an arbitrary figure), but I only want to report the errors found since the last time the script ran, so that I don't keep reporting the same old ones. My plan was to write a date to a file (called something like lastrun.txt), compare the modified dates of the files I'm searching against the contents of lastrun.txt, and only search the files that are newer. Does the concept make sense? Is it the best way to do it?
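For what it's worth, a minimal sketch of that lastrun.txt idea in PowerShell (the paths, the `*.log` filter, and the `ERROR` search string are all placeholders, not anything specific to my setup):

```powershell
# Sketch of the lastrun.txt approach; paths and patterns are examples only
$lastRunFile = 'C:\logs\lastrun.txt'
$lastRun = if (Test-Path $lastRunFile) {
    Get-Date (Get-Content $lastRunFile -Raw)
} else {
    [datetime]::MinValue   # first ever run: scan everything
}

# Only search files modified since the last run
Get-ChildItem 'C:\logs' -Filter '*.log' |
    Where-Object { $_.LastWriteTime -gt $lastRun } |
    Select-String -Pattern 'ERROR' |
    ForEach-Object { "$($_.Path): $($_.Line)" }

# Record this run's timestamp for next time (round-trip date format)
(Get-Date).ToString('o') | Set-Content $lastRunFile
```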


Matt 🙂

August 6, 2013 at 3:39 am

That probably won't work, because if the record is updated with a non-error, your modified date will still be newer than lastrun. Do the event records have a datetime? If so, you might want to read the records and, if they are newer than lastrun, send the alert.

August 7, 2013 at 2:11 am

Hi Mark,

thanks for your post. Sorry, I probably didn't explain this properly: a user runs a process in an application, and this creates a new file every time the process is run (if there is an error it will state that in the file, and if it's successful it will have a success message). So if they run it for the first time ever at 10 am, a file (or several files) will be created that I can scan for errors. If they run it again at 11:25 am, the same will happen. What I want to do is, for example, scan the location at 10:30 and then again at 11:30. Unfortunately I have no way of linking the running of the script to the execution of the process in the application. I hope that explains it a bit better; I'm struggling to elaborate lucidly this week!


Matt 🙂

August 7, 2013 at 4:39 am

I think what Martin was trying to get across was that all the new files will be processed whether they contain an error or not. If you are OK with that, and you have written your script to take it into account, then all you are looking for is a way to process only the logs that have appeared since your script last ran. There are a couple of ways to handle that.

You could add a simple line to move each processed log file into an archive/processed folder once your script has finished with it. That way, the only files in the folder when you next ran your script would be the new ones. This may not be an option if the user's program expects to find the files in the original folder.
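A rough sketch of the move-to-archive idea (the folder names and the placeholder scan step are made up for illustration):

```powershell
# Hypothetical paths: move each log into an archive folder once processed
$source  = 'C:\logs'
$archive = 'C:\logs\processed'
if (-not (Test-Path $archive)) {
    New-Item -ItemType Directory -Path $archive | Out-Null
}

Get-ChildItem $source -Filter '*.log' | ForEach-Object {
    # ... scan $_ for the error string and send the email here ...
    Move-Item $_.FullName -Destination $archive
}
```

Because the processed files leave the source folder, the next run only ever sees new ones, and no timestamp bookkeeping is needed.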

Creating the lastrun file would be one way around it, as would creating anything else with a timestamp on it. You could also consider writing an information item to the Windows event log (Jeffrey Snover's idea), which would mean you could see whenever your script had run through Event Viewer. For the lastrun.txt comparison, something along the lines of:

Get-ChildItem C:\logs | Where-Object { $_.LastWriteTime -gt (Get-Item C:\logs\lastrun.txt).LastWriteTime }
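And a sketch of the event-log variant, if you prefer that route. The source name 'ErrorScan' and event ID are invented for the example, and registering a new event source needs to be done once from an elevated session:

```powershell
# One-time setup (run elevated): register a custom event source
New-EventLog -LogName Application -Source 'ErrorScan'

# At the end of each run, record that the script ran
Write-EventLog -LogName Application -Source 'ErrorScan' `
    -EntryType Information -EventId 1000 -Message 'Error scan completed'

# On the next run, find when the script last wrote that event
# and only look at files modified since then
$lastRun = (Get-EventLog -LogName Application -Source 'ErrorScan' -Newest 1).TimeGenerated
Get-ChildItem C:\logs | Where-Object { $_.LastWriteTime -gt $lastRun }
```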

You would need to make your own decision as to which one you prefer. The size and quantity of the files, how often they are created, and whether the software needs to keep seeing them (because it appends to the file, etc.) all make a difference to which is better.

August 7, 2013 at 6:39 am


Thanks for clarifying! I do want to process all the new files, and no, sadly, I can't move the processed files, as they need to remain in the default directory (they have to stay available to the application).

Thanks for your help, I like the idea of writing to the Windows event log so I think I'll go with that!

Thanks again,

Matt 🙂