I have a question about the search page result limit in Chronicle. Even if I change the time interval, I only ever see 10K events in the results. Is there any way to increase the limit, or is there a hard cap on results per search?
I don't think you can increase it. I assume it's a hard limit on the backend to avoid sending too much data.
Out of interest, what is your use case?
I am collecting Google Cloud Platform logs via Cloud Storage, and when I check logs over different time intervals I only see 10K events. So I'm not sure how many logs I've actually collected; the latest log timestamps are, for instance, 16:47 or 17:46, and I can't see any logs arriving in Chronicle at, for instance, 5:52 or 5:59 am/pm.
At that point, I decided to try ingesting logs with native ingestion, but Chronicle's add feed page doesn't show me that option, so I can't ingest Google Cloud Platform logs in real time. If you can suggest anything to check, I'd appreciate it.
I'm not sure I'm totally understanding what ingestion options you've tried. When you say you're ingesting from Cloud Storage, do you mean using the feed? Then, when you said native ingestion, did you mean the process described on this page?
Are you simply trying to make sure that all of your logs were successfully ingested into Chronicle?
My aim is to ingest Google Cloud Platform logs in real time.
To achieve that, I first used the Cloud Storage method to collect the logs, but I ran into two problems. First, logs are ingested by Chronicle about an hour late. Second, listed results are capped at 10K events even when I change the interval.
Because of the one-hour ingestion delay, I wanted to try native ingestion (I hope it will collect logs in real time). However, native ingestion doesn't appear under Chronicle --> Settings --> Feeds --> Add New --> Add Feed.
Ah okay, gotcha. Chronicle's native GCP ingestion isn't configured from the feed page. You'll need to check out the link above and configure it from the GCP side.
You've got two options:
1) GCP integration
2) Manually send logs to the Ingestion API
The Ingestion API works great for us. Here are the docs:
https://cloud.google.com/chronicle/docs/reference/ingestion-api
For option 2, you can set up log sinks that write to Pub/Sub, and then use something like the following to read from Pub/Sub and write to the Chronicle Ingestion API:
https://github.com/chronicle/ingestion-scripts/blob/main/pubsub/main.py
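If it helps, that script boils down to: pull a message from Pub/Sub, wrap it in a `batchCreate` request body, and POST it to the Ingestion API. Here's a rough sketch of just the payload-building step; the log type (`GCP_CLOUDAUDIT`), the endpoint URL, and API-key auth shown here are illustrative assumptions, so check the docs above for your environment:

```python
# Sketch of the Pub/Sub -> Chronicle Ingestion API step.
# Assumptions (verify against the ingestion API docs): v2
# unstructuredlogentries endpoint, API-key auth, GCP_CLOUDAUDIT log type.
INGESTION_URL = (
    "https://malachiteingestion-pa.googleapis.com"
    "/v2/unstructuredlogentries:batchCreate"
)

def build_batch(customer_id, log_type, raw_logs):
    """Build the request body for unstructuredlogentries:batchCreate."""
    return {
        "customer_id": customer_id,
        "log_type": log_type,
        "entries": [{"log_text": line} for line in raw_logs],
    }

def on_pubsub_message(message_data, customer_id):
    """Pub/Sub callback: wrap one message for sending to Chronicle.

    A real subscriber would then POST it, e.g. with the requests library:
        requests.post(INGESTION_URL, json=body, params={"key": API_KEY})
    """
    body = build_batch(customer_id, "GCP_CLOUDAUDIT", [message_data])
    return body
```

The linked script does essentially this inside a Cloud Function trigger, batching entries before sending.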
When comparing timing, be conscious that logs will show up in raw log search before UDM; sometimes it takes a while for logs to be parsed into UDM. We haven't looked at optimizing parsers for speed, but I'm sure that's a thread you can pull if you feel really strongly about it.
I'm a total newbie, so I don't feel strongly about anything yet.
Thanks for your response, I'll dig into it and do some more research. Best wishes.
Feel free to drop me a DM if you get stuck
Thank you