Help with building logging framework with stream manipulation

This topic contains 4 replies, has 2 voices, and was last updated by The PowerShell Bear 6 months, 3 weeks ago.

  • Author
  • #65407

    The PowerShell Bear

For the automated tasks that my team and I create, we use a logging mechanism that I've been working on improving. It's a pair of advanced functions: one sets up a log, and the other adds entries to it. The log is a CSV file with three fields: status (Information, Warning, Fail, Verbose, or Debug), time of entry, and the details of the event. The script writer adds an Add-Log call with the details and, if it's not Informational, the status, and the function handles the rest.
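To give a concrete picture, here's a minimal sketch of that pattern. The function and parameter names (Initialize-Log, Add-Log, -Status, -Details) are assumptions matching my description above, not our actual code:

```powershell
# Minimal sketch of the two-function logging pattern described above.
# Names ($script:LogPath, Add-Log, etc.) are illustrative.
function Initialize-Log {
    param([Parameter(Mandatory)][string]$Path)
    $script:LogPath = $Path
    if (-not (Test-Path $Path)) {
        'Status,Time,Details' | Set-Content -Path $Path
    }
}

function Add-Log {
    param(
        [Parameter(Mandatory)][string]$Details,
        [ValidateSet('Information','Warning','Fail','Verbose','Debug')]
        [string]$Status = 'Information'
    )
    # Timestamp is taken at the moment of the call, which is why
    # after-the-fact stream capture loses accuracy.
    [pscustomobject]@{
        Status  = $Status
        Time    = (Get-Date).ToString('o')
        Details = $Details
    } | Export-Csv -Path $script:LogPath -Append -NoTypeInformation
}
```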

The more I look at improving our processes, trying to implement things like Pester unit tests (which naturally means splitting functionality into smaller, testable functions), the more I realize this is not a scalable solution. If nothing else, if I put Add-Log calls in the functions and then run them without importing the logging commands and setting up the logs, they throw errors, which makes unit testing hard. Leaving the calls out, on the other hand, makes logging progress that much harder, because then I have to perform gymnastics with stream manipulation, capturing that output on the back end and writing it to the log, at which point the timestamps won't be accurate and the different streams likely won't line up correctly in the log.

    So, a lot of issues with doing it that way.

What I was thinking was to manipulate the streams directly and redirect each of the five non-output streams (Information, Warning, Error, Verbose, Debug) to something that handles the log entry appropriately depending on the source stream. I could do this with proxy functions that wrap and shadow the five corresponding Write- cmdlets; as long as those proxies were loaded in each script, the script writers could just use the Write- cmdlets as normal. That would work, but it would entirely miss one thing I'd love this to do: capture all stream output, not just the output originating from the Write- cmdlets I've shadowed.
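For reference, one of those proxies might look something like this (a sketch only; Add-Log is the logging function described above, and the module-qualified call forwards to the real cmdlet so the proxy doesn't recurse into itself):

```powershell
# Illustrative proxy: shadows Write-Warning so each message also lands
# in the log, then forwards to the real cmdlet.
function Write-Warning {
    [CmdletBinding()]
    param([Parameter(Mandatory, ValueFromPipeline)][string]$Message)
    process {
        Add-Log -Status Warning -Details $Message
        # Module-qualified name bypasses this function and reaches the
        # built-in cmdlet, avoiding infinite recursion.
        Microsoft.PowerShell.Utility\Write-Warning -Message $Message
    }
}
```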

I've thought about writing a wrapper script that calls the other scripts and bakes in the stream redirection, but that feels messy and fraught with issues.
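In its simplest form, the wrapper idea would be something like the following (illustrative only, and it inherits the timestamp problem I mentioned, since entries are stamped at capture time rather than emission time):

```powershell
# Merge all streams into the success stream, then dispatch each record
# to the log based on its type. SomeTask.ps1 is a placeholder.
& .\SomeTask.ps1 *>&1 | ForEach-Object {
    switch ($_.GetType().Name) {
        'WarningRecord'     { Add-Log -Status Warning -Details $_.Message }
        'ErrorRecord'       { Add-Log -Status Fail    -Details $_.ToString() }
        'VerboseRecord'     { Add-Log -Status Verbose -Details $_.Message }
        'DebugRecord'       { Add-Log -Status Debug   -Details $_.Message }
        'InformationRecord' { Add-Log -Details $_.MessageData }
        default             { Add-Log -Details ($_ | Out-String).Trim() }
    }
}
```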

Does anyone have any thoughts on how best to proceed with this? I'm going to continue doing some digging, but so far I've not seen a good way to do what I'm looking for.

  • #65455

    Don Jones

    I dunno. In a Pester test I might mock Add-Log, actually, to remove the dependency.
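That is, something along these lines (a sketch with Pester v4-style syntax; Get-Widget is a made-up function standing in for whatever is under test):

```powershell
# Mock Add-Log so the function under test has no live logging dependency.
Describe 'Get-Widget' {
    Mock Add-Log { }   # swallow all log calls during the test

    It 'logs a warning when the widget is missing' {
        Get-Widget -Name 'missing'
        # Verify the code attempted to log a Warning entry.
        Assert-MockCalled Add-Log -ParameterFilter { $Status -eq 'Warning' }
    }
}
```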

    I'm not sure I'd get into mucking with proxy functions on top of the Write- commands, though. For one, your proxy commands are still a dependency, so you'd be back to square one in that regard. And I don't think using PowerShell to subvert PowerShell's own native design pattern is going to end in anything but tears.

    You have two options that stick with PowerShell's native patterns.

    One is to write a custom host application, and run your scripts in that instead of the native console. It's the host that decides what to do with the six pipelines, so you could certainly implement a host that redirected them to a configurable log of some kind. And if you write that as open-source, a lot of people are going to be your new friends, because a lot of people ask about this all the freaking time.

    Another is to stick with what you're doing. Yes, your scripts have dependencies – but most scripts do. It's fine. You can mock the dependencies in a Pester test (that's literally what Mock is for). Your scripts can easily implement some bootstrapping to install dependency modules from a local NuGet repo, too, using Install-Module.
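That bootstrapping can be as small as a guard at the top of each script (module and repository names here are placeholders):

```powershell
# Install the logging module from an internal repo if it isn't present,
# then load it. 'TeamLogging' and 'InternalRepo' are example names.
if (-not (Get-Module -ListAvailable -Name TeamLogging)) {
    Install-Module -Name TeamLogging -Repository InternalRepo `
        -Scope CurrentUser -Force
}
Import-Module TeamLogging
```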

I personally question the scalability of CSV log files, but if that's working for your situation, go fer it. Obviously, the "native" Windows approach would be to establish and use an event log, and there are lots of tools out there for scraping those, querying them, etc. And, it's native functionality, so the necessary commands are core to PowerShell. You could also log to a database, and that's also core, native functionality that's always present. You might end up with a wee bit of copied-and-pasted logging code if you wanted to avoid external dependencies, but honestly I think learning to deal with external script dependencies is probably a core competency these days.
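The event-log route is a few lines of Windows PowerShell (source and message here are examples; New-EventLog requires elevation and only has to run once per machine):

```powershell
# Register a custom event source, write to it, and read it back.
New-EventLog -LogName Application -Source 'TeamAutomation' `
    -ErrorAction SilentlyContinue
Write-EventLog -LogName Application -Source 'TeamAutomation' `
    -EntryType Warning -EventId 1001 -Message 'Disk space below threshold'
Get-EventLog -LogName Application -Source 'TeamAutomation' -Newest 5
```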

  • #65484

    The PowerShell Bear

    So, the CSV logging is kind of legacy, and we're recognizing the limitations of it already, but it's what everything and everyone currently expects. I've been planning on expanding the logging function to log to both a CSV as well as to the event log, so that we can grow while still maintaining the way most people currently expect it to work.

I'm still in the very early stages of learning Pester (literally, I've bought your book, and putting it next on my reading list is as far as I've gotten), but I recognize that learning it is going to be a necessary core competency going forward.

I am intrigued by the idea of a replacement host program, though. Just out of curiosity...if you were going to do that, would C# be a good language choice for writing it? I've just finished reading a book on C# and have some basic proficiency with it, so I'm looking for good projects to help solidify my knowledge. I have no idea how I'd implement it, but everything starts somewhere.

  • #65487

    Don Jones

    C# would likely be the best language, as it's essentially trivial to host the shell from .NET. There's a lot more wiring-up to make a host behave like the console (e.g., the core engine, which is what you host, doesn't define all the default variables – the host does, because those preference variables control host behavior, not the engine behavior). But it's very do-able. Doug Finke, Keith Hill, those "dev" MVPs would be a resource.

  • #65488

    The PowerShell Bear

    I'll check it out!

    I have a new project to work on. Thank you!
