What is the ideal detection logic to find large data transfers out of a Cloud SQL instance?
The security_result.description field provides the query that was run, but that is not enough to find the actual bytes sent or received.
Also, if you go into the raw log, expand it out, and click Manage Parser, you will be able to apply statedumps in your parser config to understand why parsing breaks (see the sketch below).
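As a minimal sketch of that approach, you can drop a statedump filter into the parser config to dump the internal state at a given point; the json step, source field, and label value here are purely illustrative and will differ in your actual parser:

filter {
  # Illustrative parsing step; your parser's real filters will differ
  json {
    source => "message"
    on_error => "not_json"
  }
  # Dump the parser's internal state right after this step to see what was (or wasn't) extracted
  statedump {
    label => "after_json"
  }
}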
Also, here https://cloud.google.com/sql/docs/mysql/logging - you can see in Cloud Logging what's going on as well.
How about looking into network data from that instance? With SQL query logging you might only be able to nail down queries that are expected to return a large number of results. A rough detection sketch based on network telemetry follows below.
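Assuming you have VPC Flow Logs or similar network telemetry ingested into Chronicle with network.sent_bytes populated, a YARA-L rule along these lines could sum outbound bytes per source IP; the instance IP, the 1-hour window, and the 1 GiB threshold are placeholders you would tune to your environment:

rule cloud_sql_large_outbound_transfer {
  meta:
    description = "Sketch: flags large outbound byte volume from a Cloud SQL instance IP"
    severity = "Medium"

  events:
    $net.metadata.event_type = "NETWORK_CONNECTION"
    // Placeholder: the private IP of the Cloud SQL instance
    $net.principal.ip = "10.0.0.5"
    $net.principal.ip = $src_ip

  match:
    $src_ip over 1h

  outcome:
    $total_sent_bytes = sum($net.network.sent_bytes)

  condition:
    // > 1 GiB sent in an hour; adjust to your baseline
    $net and $total_sent_bytes > 1073741824
}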
I tried to run a query using pgAdmin and save the results as CSV, and it was not logged.
The SQL logs in Cloud Logging show the query, but they appear as raw logs and are not parsed by the GCP_CLOUDSQL parser in Chronicle SIEM.
In that case you need to get the parser fixed first. Raise a support case if you have access and share sample logs with them. Parser updates tend to land faster with Google support involved, in my experience.
There are some filtering options for GCP:
--GCP Cloud Audit
OR log_id("cloudaudit.googleapis.com/activity")
OR log_id("cloudaudit.googleapis.com/system_event")
OR log_id("cloudaudit.googleapis.com/policy")
OR log_id("cloudaudit.googleapis.com/data_access")
OR log_id("cloudaudit.googleapis.com/access_transparency")
--GCP Cloud SQL
OR log_id("cloudsql.googleapis.com/mysql-general.log")
OR log_id("cloudsql.googleapis.com/mysql.err")
OR log_id("cloudsql.googleapis.com/postgres.log")
OR log_id("cloudsql.googleapis.com/sqlagent.out")
OR log_id("cloudsql.googleapis.com/sqlserver.err")