This topic contains 2 replies and has 2 voices.
August 25, 2017 at 11:11 pm #78112
I have been researching and testing options to improve the collection time. In this scenario I'm working with Windows Event Forwarding (WEF), and for some groups of events the performance is not as fast as I expected. Even though I'm only trying to export the last hour of events, the time to complete the process is too long. I know there are a lot of events (thousands or more) to export, but I haven't found a way to reduce the time. The main consumption is collecting the filtered data into the variable.
This is the part of the code with the highest time, memory, and CPU consumption:
$xml = @'
<QueryList>
  <Query Id="0" Path="ForwardedEvents">
    <!-- Adjust Path to your target log; <= must be escaped inside the XML -->
    <Select Path="ForwardedEvents">
      *[System[TimeCreated[timediff(@SystemTime) &lt;= 3900000]]]
    </Select>
  </Query>
</QueryList>
'@

Write-Host "Collect Events" -ForegroundColor Yellow
$events = Get-WinEvent -FilterXml $xml |
    Select-Object ID, LevelDisplayName, LogName, MachineName, Message,
                  ProviderName, RecordID, TaskDisplayName, TimeCreated

I have also been testing this code exporting the data to a file instead of a variable, because of the memory consumption, but the result is still not as fast as I want. Any additional way to improve collection of event log data (thousands or millions of events) would be helpful. Regards and thanks
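One way to cut both time and memory here is to skip the XPath `timediff` filter and the intermediate variable entirely: `Get-WinEvent -FilterHashtable` with `StartTime` lets the event log service do the time filtering, and streaming straight to `Export-Csv` avoids accumulating thousands of objects in `$events`. A minimal sketch; the log name, output path, and 65-minute window are illustrative assumptions:

```powershell
# Build a server-side filter instead of the timediff XPath.
# ForwardedEvents is assumed as the WEF target log; adjust as needed.
$filter = @{
    LogName   = 'ForwardedEvents'
    StartTime = (Get-Date).AddMinutes(-65)   # roughly 3,900,000 ms
}

# Stream matched events directly to CSV; nothing accumulates in memory
# beyond the pipeline buffer.
Get-WinEvent -FilterHashtable $filter |
    Select-Object Id, LevelDisplayName, LogName, MachineName, Message,
                  ProviderName, RecordId, TaskDisplayName, TimeCreated |
    Export-Csv -Path 'C:\Temp\events-last-hour.csv' -NoTypeInformation
```

Because `Select-Object` and `Export-Csv` process one event at a time, peak memory stays roughly flat regardless of how many events match.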
August 26, 2017 at 8:19 pm #78127
I was tasked with something similar to what you are facing.
Are the servers being targeted on the same LAN as your source machine? For me the biggest performance killer was targeting machines across a WAN.
Also, can you filter your events a bit more? Do you need all logs, or just the Application log, for example? Do you need just errors, or verbose information as well?
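As a sketch of what "filtering a bit more" can look like, a hashtable filter can narrow by log, severity, event ID, and time window all at once, so only matching events ever cross the pipeline. The event IDs and log name below are purely illustrative:

```powershell
# Narrow the query server-side: one log, specific IDs, errors and above,
# last hour only. 4624/4625 (logon success/failure) are just examples.
$filter = @{
    LogName   = 'Security'
    Id        = 4624, 4625
    StartTime = (Get-Date).AddHours(-1)
}

# -MaxEvents caps the result set while you are tuning the filter.
Get-WinEvent -FilterHashtable $filter -MaxEvents 1000
```

The tighter the filter, the less work the event log service and the pipeline have to do, which usually matters far more than which cmdlet you pick.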
I used Get-EventLog instead of Get-WinEvent, but that's just my preference.
Remoting could also be affecting you. Are you running it against one machine or multiple? If the answer is multiple, then you might want to use Invoke-Command instead of -ComputerName.
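The idea with Invoke-Command is to fan the query out so each server filters its own log locally and returns only the matching events, rather than pulling raw data over the wire one machine at a time. A sketch, with placeholder server names:

```powershell
# Each target runs the filter locally and in parallel; only matching
# events come back over the remoting channel. Names are placeholders.
$servers = 'DC01', 'DC02', 'DC03'

$events = Invoke-Command -ComputerName $servers -ScriptBlock {
    Get-WinEvent -FilterHashtable @{
        LogName   = 'Security'
        StartTime = (Get-Date).AddHours(-1)
    }
}
```

Invoke-Command runs against multiple computers concurrently (32 at a time by default), which is usually much faster than looping over -ComputerName serially.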
After I gathered my relevant data I pushed it into a SQL database using the SQL cmdlets; this makes the data much easier to query at a later date. Text files are notoriously bad for querying logs.
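For the SQL step, one option is Write-SqlTableData from the SqlServer module, which accepts pipeline input and can create the table for you. A sketch, assuming the module is installed; the instance, database, and table names are illustrative:

```powershell
# Push collected events into a SQL table for later querying.
# -Force creates the table if it does not already exist.
Import-Module SqlServer

$events |
    Select-Object Id, TimeCreated, MachineName, ProviderName, Message |
    Write-SqlTableData -ServerInstance 'localhost' `
                       -DatabaseName 'EventStore' `
                       -SchemaName 'dbo' `
                       -TableName 'SecurityEvents' `
                       -Force
```

Once the events are in a table, an indexed TimeCreated column makes "last hour" style queries near-instant compared to scanning text files.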
Anyway I could be talking a load of rubbish.
Let me know!
August 27, 2017 at 12:29 am #78129
Thanks for your response, Jack.
Let me tell you that this scenario is being tested on one VM (32 GB of RAM). I have WEF (Windows Event Forwarding) enabled on the server, and I am using subscriptions to collect all Security events from my domain controllers. We are talking about thousands to millions of events. With PowerShell I am running some scripts to collect the last hour of Security events and import them into SQL Server (the SQL database is on the same server), but for some event log categories collecting the events takes more than an hour. The filters I am using are in the code.
The sample code that I pasted is this:
$xml = @'
<QueryList>
  <Query Id="0" Path="ForwardedEvents">
    <!-- Adjust Path to your target log; <= must be escaped inside the XML -->
    <Select Path="ForwardedEvents">
      *[System[TimeCreated[timediff(@SystemTime) &lt;= 3900000]]]
    </Select>
  </Query>
</QueryList>
'@

$events = Get-WinEvent -FilterXml $xml |
    Select-Object ID, LevelDisplayName, LogName, MachineName, Message,
                  ProviderName, RecordID, TaskDisplayName, TimeCreated

As you can see, this method is pretty good for small and medium amounts of events. Memory consumption becomes a problem when the number of events is bigger; for that reason I use a file instead of a variable, because the variable uses memory. I have already confirmed that the problem is in the event log data collection; after that, everything works perfectly. I hope this makes my situation clear. Regards and thanks
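Since the bottleneck is confirmed to be the collection itself, one approach worth trying is exporting the matching records with wevtutil, which writes a native .evtx file without materializing a .NET object per event, and parsing the file afterwards. A sketch, reusing the same timediff XPath; the log name and output path are assumptions:

```powershell
# Export matching records as raw .evtx - no per-event object creation,
# so it is typically much faster for very large result sets.
$query = '*[System[TimeCreated[timediff(@SystemTime) <= 3900000]]]'
wevtutil epl ForwardedEvents 'C:\Temp\last-hour.evtx' "/q:$query" /ow:true

# Parse the exported file later, when convenient:
# Get-WinEvent -Path 'C:\Temp\last-hour.evtx' -Oldest
```

This splits the job in two: a fast native export during the time-critical window, and the slower object-by-object processing (Select-Object, SQL import) done afterwards against the local file.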
The topic ‘Export a big amount of data from event log by powershell’ is closed to new replies.