Hello Team,
Can someone assist me with pattern matching and parsing this type of log in Chronicle?
"version account-id interface-id srcaddr dstaddr dstport srcport protocol packets bytes end start action log-status vpc-id subnet-id instance-id tcp-flags pkt-srcaddr pkt-dstaddr region"
Hi,
In general, I suggest using the official grok repo to search for a specific pattern: logstash/patterns/grok-patterns at v1.4.2 · elastic/logstash · GitHub
In this specific case, since all values are separated by spaces, you can use the following snippet to extract all the data (complete the snippet with the remaining fields):
filter {
  mutate {
    replace => {
      "version" => ""
      "account_id" => ""
      ...
    }
  }
  grok {
    match => {
      "message" => [
        "(?P<version>\\S+) (?P<account_id>\\S+) ...."
      ]
    }
    on_error => "grok_extraction_failure"
    overwrite => ["version", "account_id", ....]
  }
}
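For reference, here is a fuller sketch of the grok match for the field order in your header line. The field names are assumptions derived from that header (hyphens replaced with underscores); verify they match the variable names declared at the top of your parser, and note that in this custom format end comes before start:

```
filter {
  grok {
    match => {
      "message" => [
        "(?P<version>\\S+) (?P<account_id>\\S+) (?P<interface_id>\\S+) (?P<srcaddr>\\S+) (?P<dstaddr>\\S+) (?P<dstport>\\S+) (?P<srcport>\\S+) (?P<protocol>\\S+) (?P<packets>\\S+) (?P<bytes>\\S+) (?P<end>\\S+) (?P<start>\\S+) (?P<action>\\S+) (?P<log_status>\\S+) (?P<vpc_id>\\S+) (?P<subnet_id>\\S+) (?P<instance_id>\\S+) (?P<tcp_flags>\\S+) (?P<pkt_srcaddr>\\S+) (?P<pkt_dstaddr>\\S+) (?P<region>\\S+)"
      ]
    }
    on_error => "grok_extraction_failure"
  }
}
```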
In addition, I suggest checking that every variable is correctly mapped, to avoid errors during parsing.
Hi,
I tried the above method and matched all the patterns as you mentioned, but I am still not able to parse the fields.
It shows start_time not found:
"generic::internal: pipeline failed: filter date (4) failed: start_time not found"
Please can you share the code used in the parser UI and a log example (anonymized if necessary)?
This is the log:
"version account-id interface-id srcaddr dstaddr dstport srcport protocol packets bytes end start action log-status vpc-id subnet-id instance-id tcp-flags pkt-srcaddr pkt-dstaddr region"
It's pretty hard to help if we need to completely recreate your setup from your words. That being said, have you defined start_time at the top? What references do you have to start_time after the grok? Are you maybe referencing start_time incorrectly? Have you used statedump to check whether the grok statement worked as expected?
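For example, assuming your grok stage extracted the raw fields, something along these lines (names are assumptions; adjust to your parser) would both dump the parser state after the grok and map the format's start field into start_time before the date filter runs:

```
filter {
  # after the grok stage, dump the parser state to confirm which fields exist
  statedump { label => "after_grok" }

  # this log format names its timestamp field "start" (note it comes after "end"),
  # while the date filter expects "start_time"
  mutate {
    rename => { "start" => "start_time" }
    on_error => "start_not_found"
  }
  date {
    match => ["start_time", "UNIX"]
    on_error => "date_conversion_failure"
  }
}
```

If statedump shows that "start" was never extracted, the problem is in the grok match itself rather than in the date filter.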
I have defined start_time at the top. The field start_time is working perfectly fine for all the other logs except this type of log.
Are you comfortable sharing the parser in a DM?