I have one use case. We receive an HTTP request, extract the data from it, and write each record to a database (audit DB) for audit purposes, along with some additional fields such as customer ID. We then read the data from the audit DB, generate a report, and send it to an external system. After reading from the audit DB we also store the reports in cloud storage for archival. I want to store the audit records in cloud storage directly instead of in a database (audit DB such as Cloud SQL or Bigtable). Is that the right solution?
I think the main decider here will be whether you need to query inside the data you store at all. Cloud Storage could be useful if you are just dumping a lot of records and then aggregating them for some bulk reporting. If, however, you need to query the stored data in some way, that could be a challenge on Cloud Storage, as there is no easy way to search inside an object other than downloading it and then searching it, or integrating with BigQuery as suggested in Aaron's answer below.
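For the "dump and aggregate" pattern, a minimal sketch of writing a batch of audit records as a single newline-delimited JSON object in Cloud Storage might look like this (the bucket name, object layout, and record fields are assumptions, not details from your question):

```python
from google.cloud import storage
import json

# Hypothetical bucket name -- adjust to your environment.
BUCKET_NAME = "my-audit-archive"

def write_audit_batch(records, object_name):
    """Serialize a batch of audit records as newline-delimited JSON
    and upload it as one object in Cloud Storage."""
    client = storage.Client()
    bucket = client.bucket(BUCKET_NAME)
    blob = bucket.blob(object_name)
    payload = "\n".join(json.dumps(record) for record in records)
    blob.upload_from_string(payload, content_type="application/json")

# Example: one object per request batch, keyed by date and customer.
write_audit_batch(
    [{"customer_id": "c-123", "action": "GET /orders", "status": 200}],
    "audit/2024/05/01/c-123-170000.json",
)
```

Writing one object per batch (rather than per record) keeps the object count manageable and lines up well with downstream bulk processing.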
Another option to consider for simple structured data like an audit log is Firestore, a completely serverless document database that scales easily and has the added advantage that you can query the stored records: https://cloud.google.com/firestore
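As a rough sketch of that option with the Firestore Python client, writing one document per audit record and later querying by customer could look like the following; the collection name, field names, and customer ID are hypothetical:

```python
from google.cloud import firestore

db = firestore.Client()
audit = db.collection("audit_records")  # hypothetical collection name

# Write one audit record per document.
audit.add({
    "customer_id": "c-123",
    "action": "GET /orders",
    "status": 200,
    "received_at": firestore.SERVER_TIMESTAMP,
})

# Later: pull back a single customer's records when building the report.
for doc in audit.where("customer_id", "==", "c-123").stream():
    print(doc.to_dict())
```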
Depending on how the log files are structured, you could consider using BigQuery against the data in the bucket. This works via external tables.
https://cloud.google.com/bigquery/docs/external-data-cloud-storage
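As a sketch of that approach with the BigQuery Python client, you could define an external table over the bucket and query the files in place; the project, dataset, bucket path, and newline-delimited JSON format are assumptions about how the audit files are written:

```python
from google.cloud import bigquery

client = bigquery.Client()
# Hypothetical project, dataset, and table names -- adjust to your setup.
table_id = "my-project.audit_dataset.audit_external"

# Point an external table at the audit objects in the bucket.
external_config = bigquery.ExternalConfig("NEWLINE_DELIMITED_JSON")
external_config.source_uris = ["gs://my-audit-archive/audit/*.json"]
external_config.autodetect = True  # let BigQuery infer the schema

table = bigquery.Table(table_id)
table.external_data_configuration = external_config
client.create_table(table, exists_ok=True)

# Query the bucket data directly, without loading it into BigQuery storage.
query = """
    SELECT customer_id, COUNT(*) AS requests
    FROM `my-project.audit_dataset.audit_external`
    GROUP BY customer_id
"""
for row in client.query(query).result():
    print(row.customer_id, row.requests)
```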