
February 9, 2016 at 5:17 am

I have a program that dumps a log file that I need to parse.

2016-02-08 18:58:33  INFO: [EDCAE484] id: 3, time: 2016-02-08 18:58:31, lat: 41.05376, lon: -74.28774, speed: 0.0, course: 0.0
2016-02-08 18:58:45  INFO: [A62ED546] connected
2016-02-08 18:58:45 DEBUG: [A62ED546: 5055  00.192.67.18] HEX: 485454502f312e3120323030204f4b0d0a0d0a
2016-02-08 18:58:45  INFO: [A62ED546] disconnected
2016-02-08 18:58:45  INFO: [A62ED546] id: 3, time: 2016-02-08 18:58:42, lat: 41.05376, lon: -74.28774, speed: 0.0, course: 0.0
2016-02-08 18:58:56  INFO: [93D08D5C] connected
2016-02-08 18:58:56 DEBUG: [93D08D5C: 5055  00.192.67.18] HEX: 485454502f312e3120323030204f4b0d0a0d0a
2016-02-08 18:58:56  INFO: [93D08D5C] disconnected
2016-02-08 18:58:56  INFO: [93D08D5C] id: 3, time: 2016-02-08 18:58:53, lat: 41.05376, lon: -74.28774, speed: 0.0, course: 0.0
2016-02-08 18:58:58  INFO: [0ABBA09E] connected
2016-02-08 18:58:58 DEBUG: [0ABBA09E: 5055  00.192.67.18] HEX: 485454502f312e3120323030204f4b0d0a0d0a
2016-02-08 18:58:58  INFO: [0ABBA09E] disconnected
2016-02-08 18:58:58  INFO: [0ABBA09E] id: 3, time: 2016-02-08 18:58:57, lat: 41.05376, lon: -74.28774, speed: 0.0, course: 0.0

I need to grab the lines that contain the id and dump the date, time, lat, and lon to a text file. I only need the last unique value.

Help??

February 9, 2016 at 6:11 am

What have you tried?

February 9, 2016 at 6:24 am

$t = Get-Content 'C:\Program Files (x86)\Traccar\logs\tracker-server.log' 
$t[0]

for instance. There are hundreds of lines in the text document.

The problem is I don't know how to tell PowerShell where to look, or how to grab the latest unique entry. I appreciate your time!

February 9, 2016 at 6:43 am

You might have to use a regex or match a string in your script.
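
Something along these lines might get you started (the sample line is just copied from your log; the pattern is only a rough sketch):

# -match with a capture group pulls the id out of a line; $Matches holds the result
$line = '2016-02-08 18:58:45  INFO: [A62ED546] id: 3, time: 2016-02-08 18:58:42, lat: 41.05376, lon: -74.28774, speed: 0.0, course: 0.0'
if ($line -match 'id:\s*(\d+)') {
    $Matches[1]   # -> 3
}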

February 9, 2016 at 7:08 am

gc .\log.txt | ? {$_ -match 'id:'} | select -Last 1

February 9, 2016 at 7:17 am

Dan, thanks a lot. The only thing is there are multiple ids; in the example I posted there is only id 3, but there will be multiple ids, e.g. id 1, id 2.

How do I grab those?

February 9, 2016 at 9:14 am

Get-Content may use large amounts of memory depending on the number of lines and/or the -ReadCount value.
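
As a rough illustration of the -ReadCount side of that (batching lines instead of emitting one string per line; .\log.txt is just a placeholder path), though I still prefer the StreamReader approach below:

# -ReadCount 1000 sends batches of 1000 lines down the pipeline instead of one line at a time
# (-ReadCount 0 would load the whole file as a single array)
Get-Content -Path .\log.txt -ReadCount 1000 | ForEach-Object {
    $_ | Where-Object { $_ -match 'id:' }
}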

# Get Log
$log = Get-ChildItem -Path .\log.txt

# Parse log for matching ids
$col = {@()}.Invoke()
$log | foreach {
    $file = $_.OpenText()
    while ($file.EndOfStream -eq $false) {
        $line = $file.ReadLine()
        if ($line -match 'id') { $col.Add($line) }
    }
}

# Get unique ids
$col2 = {@()}.Invoke()
$col | Select-Object -Unique | foreach {
    $split = $_ -split ','
    $id = $split[0] -match "id: (?'id'\d*$)"
    $dump = [pscustomobject]@{
        Id   = $Matches.id
        Date = ($split[1] -split '\s')[2]
        Time = ($split[1] -split '\s')[3]
        Lat  = ($split[2] -split ':')[1]
        Long = ($split[3] -split ':')[1]
    }
    $col2.Add($dump)
} # End Foreach

# Get last value for each id group
$col3 = {@()}.Invoke()
$col2 | Group-Object -Property id | Sort-Object -Property Name | foreach {
    $last = $_.Group | Select-Object -Last 1
    $col3.Add($last)
}
$col3 | Export-Csv -NoTypeInformation -Path .\last_unique_id.csv

February 9, 2016 at 9:21 am

Wow, thank you so much. I really appreciate it. Works great!

February 9, 2016 at 9:25 am

Excellent, glad I could help.

February 9, 2016 at 1:50 pm

Exception calling "OpenText" with "0" argument(s): "The process cannot access the file 'C:\Program Files (x86)\Traccar\logs\tracker-server.log' because it is being used 
by another process."
At C:\Users\jwall\Documents\JWallCreations\PullLog.ps1:6 char:17
+ $log | foreach {$file = $_.OpenText()
+                 ~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
    + FullyQualifiedErrorId : IOException
 

It will only let it run once, then I get this. Argh! I need to be able to cycle through this log every minute, or rather, I would like to.

February 9, 2016 at 1:59 pm

I should be able to force it, but -Force doesn't make it work either. I can manually copy the file, but PowerShell won't make a copy.

February 9, 2016 at 2:51 pm

The error stems from the fact that you never do a $file.Close(). You have to clean up after yourself before you try the next open.
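
A minimal sketch of what I mean, reusing the same $log and $col variables from your script; try/finally just guarantees the reader gets released even if a read throws:

$log | foreach {
    $file = $_.OpenText()
    try {
        while ($file.EndOfStream -eq $false) {
            $line = $file.ReadLine()
            if ($line -match 'id') { $col.Add($line) }
        }
    }
    finally {
        # Close the StreamReader so the next run (or another process) can open the file
        $file.Close()
    }
}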

February 9, 2016 at 6:20 pm

Just for fun, here's how I would do it.

$hash=@{}
Get-Content C:\temp\tracker-server.log | 
Select-String "] id: (\d+), " | 
ForEach-Object {
    $hash["$($_.Matches.Groups[1].Value)"] = $_.line
}
$hash.Values | ForEach-Object {
    $result = $_ -match "id:\s(?'id'\d+),\stime:\s(?'timestamp'\d+-\d+-\d+\s\d+:\d+:\d+),\slat:\s(?'lat'-?\d+\.\d+),\slon:\s(?'lon'-?\d+\.\d+)"
    [pscustomobject]@{
        Id = $Matches.id
        Timestamp = $Matches.timestamp
        Lat = $Matches.lat
        Long = $Matches.lon   
    }
} | Export-Csv C:\Temp\export.csv -NoTypeInformation

February 10, 2016 at 4:44 am

Thank you, Curtis. This method doesn't leave the file open!

February 10, 2016 at 6:20 am

I was not able to reproduce your error. I successfully looped over a single file and over multiple files. Also, while the loops were running, I was able to open the text file(s) manually, so I guess I am overlooking something. I created the loops using the following while loop and by piping an array (1..10). I also changed the $log variable to "Get-ChildItem -Path . -Filter log*".

while ($true) {$count++; "Loop#: $count"; & "C:\path\to\pullLog.ps1"}

PSVersion 3.0
(.NET) CLRVersion 4.0.30319.34209

February 10, 2016 at 6:27 am

I will see; the file is being built by http://www.traccar.com. It's an open source project I am looking to query from.

February 10, 2016 at 12:38 pm

Hi Random Commandline,

I was impressed with your code, and I'm wondering if you could help me understand it.
If I do this:
$a = {@()}
and then look at the type name, it's
System.Management.Automation.ScriptBlock

When I saw @() I assumed it would be an array.

Would you mind explaining the following code in a bit more detail so I can learn it?

# Parse log for matching ids
$col = {@()}.Invoke() – What is this invoking?
$log | foreach {
    $file = $_.OpenText()
    while ($file.EndOfStream -eq $false) {
        $line = $file.ReadLine()
        if ($line -match 'id') { $col.Add($line) }
    }
} – How is this working?

Thanks! 🙂

February 10, 2016 at 1:44 pm

OK, I understand how Curtis' hash table works.
If $hash['key'] = value is used to add to a hashtable, it overwrites the key's value (if the key already exists). If $hash.Add('key','value') is used, an error occurs (if the key already exists).

Exception calling "Add" with "2" argument(s):
"Item has already been added. Key in dictionary: 'key' Key being added: 'key'"
At line:1 char:1
+ $hash.Add('key','value')
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : ArgumentException
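
A quick throwaway demo of both behaviours (the key and values are made up):

$hash = @{}
$hash['3'] = 'first line for id 3'
$hash['3'] = 'latest line for id 3'    # index assignment silently overwrites
$hash['3']                             # -> latest line for id 3
$hash.Add('3','another line')          # throws, because the key already exists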

Graham, this is what I like to do if I am parsing log files. In the past, I have parsed IIS log files that were megabytes in size and 100,000+ lines long. Using Get-Content was too slow and used a lot of system memory.

This is an example of how fast adding to a collection is compared to an array.

# Array and collection speed example: col = 210 milliseconds; array = 4.6 seconds
$arraytest = Measure-Command {$test = @(); 1..10000 | foreach {$test += $_}}
$coltest = Measure-Command {$test = {@()}.Invoke(); 1..10000 | foreach {$test.Add($_)}}
"Array Complete in {0}.{1} seconds" -f $arraytest.Seconds,$arraytest.Milliseconds
"Collection Complete in {0}.{1} seconds" -f $coltest.Seconds,$coltest.Milliseconds

# Arrays have a fixed size and cannot be added to without using the += assignment
# operator. Using that operator creates a new array and overwrites the previous one.

# Converts the array into a collection
$col = {@()}.Invoke()

# Use .NET System.IO.StreamReader to open a file and read each line
# Add matching lines to collection
$log | foreach {
    $file = $_.OpenText()
    while ($file.EndOfStream -eq $false) {
        $line = $file.ReadLine()
        if ($line -match 'id') { $col.Add($line) }
    }
}

February 10, 2016 at 1:52 pm

Thanks, Random Commandline. So {@()} is a collection? Does that differ much from a hash table?

February 10, 2016 at 6:24 pm

The advantage of the hash approach is that it weeds out the unwanted records as it parses the log file. In the end there are only the last lines, one per unique id, to parse into PSCustomObjects instead of all of the other records you don't really want anyway. You find the ones you want and parse just those.
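
A stripped-down sketch of that "last one wins" behaviour, with made-up ids and shortened lines standing in for the real log:

$hash = @{}
@(
    'INFO: [AAAA] id: 1, time: 2016-02-08 18:58:31, lat: 41.05376, lon: -74.28774'
    'INFO: [BBBB] id: 3, time: 2016-02-08 18:58:42, lat: 41.05376, lon: -74.28774'
    'INFO: [CCCC] id: 1, time: 2016-02-08 18:58:53, lat: 41.05377, lon: -74.28775'
) | Select-String "id: (\d+), " | ForEach-Object {
    $hash[$_.Matches.Groups[1].Value] = $_.Line    # a later line replaces an earlier one for the same id
}
$hash.Count    # 2 - only the most recent line for each id is left to parse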