How to send logs from a Kinesis Stream to Google Chronicle via a Lambda function, with log segregation

Hi All,

We are trying to send logs from AWS Kinesis Streams to Google Chronicle via a Lambda function.

The Kinesis streams (5 streams) contain logs from multiple sources (GuardDuty, Windows Event Logs, etc.).

We are trying to segregate the logs by log type and send them to Chronicle SIEM via a webhook.

Can anyone help us with this (ideally with supporting documents and links)?

Solved
1 ACCEPTED SOLUTION

Hi @hzmndt ,
We have logs from many sources coming into this stream, which is why we opted for a Lambda function: we can segregate the logs with an if/else condition and send them to Google Chronicle via a webhook.

Thanks


5 REPLIES

Hi Everyone,
Could anyone please help us here?
We are trying to pull logs from a Kinesis Stream and send them to Chronicle with a Lambda function. The problem is that this stream carries logs from multiple sources, and we want to use a webhook to send each log to Chronicle with the correct log type.
Thank you in advance.

@mikewilusz @dnehoda @Rene_Figueroa @jstoner 

Lambdas are fairly flexible since they're just code snippets that run, and you can make them do just about anything you want. They're similar to Cloud Functions on Google Cloud.

So given that, what you describe would be possible. It sounds like logs from many different sources are the input to the function; in your code you'd interrogate each log line and use some type of matching to determine the log type to use for SecOps. Very basic pseudo-code would be something like:

log_line = ...  # <INBOUND LOG STREAM> (decoded payload of one Kinesis record)
if 'oktaUserAgentExtended' in log_line:
    log_type = 'OKTA'        # send to SecOps with the "OKTA" log type
elif 'Microsoft-Windows-Security-Auditing' in log_line:
    log_type = 'WINEVTLOG'   # send to SecOps with the "WINEVTLOG" log type
else:
    log_type = None          # send with a generic log type, or drop
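Fleshing that out, here is a rough sketch of what a Kinesis-triggered Lambda handler could look like. This is only illustrative: `CHRONICLE_WEBHOOK_URL` is a placeholder (you'd use your own Chronicle webhook feed endpoint), the matching substrings are examples, and the payload shape your webhook expects may differ.

```python
import base64
import json
import urllib.request

# Placeholder; replace with your own Chronicle webhook feed URL.
CHRONICLE_WEBHOOK_URL = "https://example.com/chronicle-webhook"

def classify(log_line):
    """Pick a log type based on substrings in the raw log line."""
    if "oktaUserAgentExtended" in log_line:
        return "OKTA"
    if "Microsoft-Windows-Security-Auditing" in log_line:
        return "WINEVTLOG"
    return None  # unknown source: drop (or map to a generic type)

def lambda_handler(event, context):
    for record in event["Records"]:
        # Kinesis record payloads arrive base64-encoded.
        log_line = base64.b64decode(record["kinesis"]["data"]).decode("utf-8")
        log_type = classify(log_line)
        if log_type is None:
            continue  # skip logs we can't classify
        req = urllib.request.Request(
            CHRONICLE_WEBHOOK_URL,
            data=json.dumps({"log_type": log_type, "entry": log_line}).encode(),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)
```

In practice you'd likely batch entries per log type rather than POST one at a time, and add error handling/retries, but the routing logic stays the same if/elif matching as above.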

 

Hi @mikewilusz  ,
Thank you for the reply.
Do you know or have any sample lambda function to send logs to Chronicle from a Kinesis Stream or AWS S3 or SQS etc..? It would be really helpful if you can share that code.

Aravind Sreekumar

Based on this diagram:

https://github.com/aws-samples/decrypt-das-aws-rds/blob/main/Images/DAS-workflow.JPG

From Kinesis, you can configure AWS services such as Amazon Data Firehose and AWS Lambda to consume the stream and store the data.
Instead of using AWS Lambda, have you considered the options below?

So technically, you can get the database audit logs to Chronicle via either of the following:
1. Amazon Data Firehose (supported by Chronicle via feed -> https://cloud.google.com/chronicle/docs/administration/feed-management)
2. AWS S3 bucket (supported by Chronicle via feed -> https://cloud.google.com/chronicle/docs/administration/feed-management)


Hi @hzmndt ,
We have logs from many sources coming into this stream, which is why we opted for a Lambda function: we can segregate the logs with an if/else condition and send them to Google Chronicle via a webhook.

Thanks