Is it okay to have a "living" CSV file on a share that constantly gets imported locally on various machines and then exported, or is there a better way to build a central report when scripts are run locally on each machine via an SCCM package (keeping in mind that remoting is not an option)?
Basically I'm just looking for the best way to collect data into a single report when the scripts run on each machine (possibly at the same time) via GPO or SCCM.
A single shared file is never going to be a good idea for this; with concurrent writers you're going to lose data at some point. You either need each computer to write to its own file (so there's no chance of some other system writing to it at the same time), or use a proper DB server.
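As a minimal sketch of the one-file-per-machine idea: each client writes a uniquely named file keyed on its computer name, so no two machines ever touch the same file. The share path and property names here are hypothetical placeholders.

```powershell
# Hypothetical share path; adjust to your environment.
$share = '\\fileserver\Reports$\Inbox'

# Example payload; replace with whatever your script actually gathers.
$data = [pscustomobject]@{
    ComputerName = $env:COMPUTERNAME
    Timestamp    = Get-Date
    FreeSpaceGB  = [math]::Round((Get-PSDrive C).Free / 1GB, 2)
}

# One file per machine: the computer name in the filename guarantees
# no two clients ever write to the same file at the same time.
$data | Export-Clixml -Path (Join-Path $share "$($env:COMPUTERNAME)_Inventory.xml")
```

Export-Clixml keeps the object structure intact, so a server-side script can rehydrate it later with Import-Clixml and merge everything into one report.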
Is using a DB a normal thing for PowerShell reporting? We often run PowerShell scripts to gather various information and then basically want to know what the script did on each machine, in a report. I'm not sure I want to give every machine the ability to write to the DB.
I typically take a hybrid, queue-based approach. A script gathers configuration details and writes a file to a network share, say an XML named as Dave alluded to (e.g. COMPUTER123_NetworkConfig.xml). Another script, run as a scheduled task, processes each XML; from there you can build a final PSObject and save it as XML, write it to a DB, or whatever you want to do.
This approach means I don't have to worry about clients writing to a database (no ODBC drivers, etc.); they simply create a file on a network share. Any credentials needed to write to the database live only on the server that processes the files, and/or running the scheduled task as the appropriate account keeps elevated permissions in a single place.
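The server-side processing step above might look something like this sketch: drain the inbox of client XMLs, append each one to a consolidated report, and move processed files aside so the inbox acts as a queue. All paths and folder names are assumptions for illustration.

```powershell
# Hypothetical paths; adjust to your environment.
$inbox     = '\\fileserver\Reports$\Inbox'
$processed = '\\fileserver\Reports$\Processed'
$report    = 'C:\Reports\Inventory.csv'

Get-ChildItem -Path $inbox -Filter '*.xml' | ForEach-Object {
    # Rehydrate the object the client exported and append it to the report.
    Import-Clixml -Path $_.FullName |
        Export-Csv -Path $report -Append -NoTypeInformation

    # Move the file out of the inbox so it is only processed once.
    Move-Item -Path $_.FullName -Destination $processed
}
```

Because only this one scheduled task touches the report (or the database), there's exactly one writer, which is what avoids the concurrent-write problem from the original question.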