When I read about loading data into a data warehouse, people usually mention saving the events to cloud storage first and then loading them from cloud storage into the warehouse. However, in my app, I would like to send the events (JSON objects) directly from the website to my API server and then directly into the data warehouse.
Is that good practice with BigQuery? Note that the events are JSON objects that have a fixed schema.
Hi @codya-mubarak,
Welcome to Google Cloud Community!
In your case, sending events from your website through your API server straight into BigQuery via streaming inserts can be a faster and more cost-effective approach than staging them in cloud storage first. A few things are essential, though: handle per-row insert errors and retries on the server, keep credentials on the API server only (never in the browser), and batch multiple events per request so you stay within BigQuery's streaming quotas instead of issuing one insert per event.
For more information, you can check out the official documentation for the BigQuery API and streaming inserts.
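As a rough illustration of what the API server side could look like, here is a minimal sketch using the Python BigQuery client. The table ID `my-project.analytics.events` and the fields in `EXPECTED_FIELDS` are hypothetical placeholders for your fixed schema, and the sketch assumes Application Default Credentials are configured on the server.

```python
# Minimal sketch of server-side streaming inserts into BigQuery.
# TABLE_ID and EXPECTED_FIELDS are hypothetical -- replace them with
# your own project/dataset/table and your table's actual schema.

TABLE_ID = "my-project.analytics.events"                  # hypothetical table
EXPECTED_FIELDS = {"event_name", "user_id", "timestamp"}  # hypothetical schema


def build_rows(events):
    """Shape incoming JSON events into rows matching the fixed schema.

    Dropping unknown keys up front means one stray field sent by the
    website cannot fail the whole insert batch.
    """
    return [{k: e[k] for k in EXPECTED_FIELDS if k in e} for e in events]


def insert_events(events):
    # Imported inside the function so the module loads even where the
    # client library is not installed (pip install google-cloud-bigquery).
    from google.cloud import bigquery

    client = bigquery.Client()
    # insert_rows_json streams rows via tabledata.insertAll; per-row
    # failures are returned in the result rather than raised.
    errors = client.insert_rows_json(TABLE_ID, build_rows(events))
    if errors:
        # Log and retry failed rows as appropriate for your app.
        raise RuntimeError(f"BigQuery streaming insert errors: {errors}")
```

In a real deployment you would call `insert_events` with a buffered batch of events rather than once per incoming request; for new high-volume pipelines, also consider the newer BigQuery Storage Write API, which Google recommends over the legacy streaming inserts.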
I hope the above information is helpful.