Hi everyone,
I’m analyzing our GA4 data in BigQuery to understand daily user behavior. However, I’ve noticed that costs are much higher with GA4’s streaming export to BigQuery (as opposed to the daily batch export), likely due to the additional streaming-insert charges.
Our service generates around 50 million user events daily, which exceeds GA4’s daily batch export limit of one million events, so we had to use the streaming export. Our current BigQuery cost breakdown is roughly streaming : loading : analyzing = 20 : 10 : 1.
Are there any best practices or alternative methods to reduce the cost of streaming and loading data into BigQuery? I couldn’t find specific documentation on this, so any advice would be appreciated!
Thank you!
P.S. Translated with GPT.
Hi @Qorh,
Welcome to Google Cloud Community!
Streaming a huge volume of data into BigQuery can be costly. Here are some suggestions and best practices that may help reduce the cost of streaming and loading data into BigQuery:
- Filter the export. In the GA4 BigQuery link settings you can exclude events you don’t need, which directly shrinks the streamed volume and the storage it lands in.
- Use the daily (batch) export where your event volume allows, since BigQuery load jobs carry no loading charge; only the streaming export is billed per GB ingested.
- Keep analysis costs down by selecting only the columns you need and restricting the `_TABLE_SUFFIX` range, because on-demand queries are billed by bytes scanned (see the sketch after this list).
- Materialize recurring aggregates, such as daily active-user counts, into a small summary table so dashboards and repeated queries scan far less data.
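To illustrate the query-side points, here is a minimal Python sketch using the google-cloud-bigquery client. The project and dataset names are placeholders (a GA4 export dataset is normally named analytics_<property_id>), and COUNT(DISTINCT user_pseudo_id) is only a rough stand-in for GA4’s active-user definition. The habit it shows is dry-running a query first, so you can see the bytes that would be billed before spending anything:

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses your default project and credentials

# Placeholder names: swap in your own project and GA4 export dataset.
QUERY = """
SELECT
  event_date,
  COUNT(DISTINCT user_pseudo_id) AS active_users
FROM `my-project.analytics_123456789.events_*`
WHERE _TABLE_SUFFIX BETWEEN '20241101' AND '20241107'
GROUP BY event_date
ORDER BY event_date
"""

# A dry run validates the query and reports the bytes it would scan
# without executing it, so it costs nothing.
dry_cfg = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
dry_job = client.query(QUERY, job_config=dry_cfg)
print(f"Would scan {dry_job.total_bytes_processed / 1024**3:.2f} GiB")

# Run it for real only once the estimate looks acceptable.
for row in client.query(QUERY).result():
    print(row.event_date, row.active_users)
```

Selecting only event_date and user_pseudo_id (rather than SELECT *) is what keeps the scan small here, and the `_TABLE_SUFFIX` filter limits it to one week of daily tables.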
I hope the above information is helpful.
Hi @marckevin,
Thanks for your kind reply. I filtered the export down to only the events my BigQuery queries actually need to compute active user counts, and that cut the cost to about a tenth.
However, a problem appeared in the GA4 dashboard.
Since I changed the BigQuery link to filter events, there have been huge drops in the GA4 DAU aggregation, as in the attached screenshot.
I searched the official documentation for this anomaly but couldn’t find anything about it. Can enabling event filtering on the BigQuery link trigger this?
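One way to narrow this down is to compute DAU directly from the raw export tables and compare it against the dashboard for the same dates: if the export-side numbers are stable across the date the filter changed, the BigQuery link is unlikely to be the cause. A rough sketch, with the dataset name and date window as placeholders (and the same caveat that a distinct user_pseudo_id count only approximates GA4’s active-user metric):

```python
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder dataset; finalized daily tables (events_YYYYMMDD) are
# enough for a historical comparison. The streaming export also writes
# events_intraday_* tables, but only for the current day.
SQL = """
SELECT
  event_date,
  COUNT(DISTINCT user_pseudo_id) AS dau
FROM `my-project.analytics_123456789.events_*`
WHERE _TABLE_SUFFIX BETWEEN '20241107' AND '20241121'  -- window around the change
GROUP BY event_date
ORDER BY event_date
"""

for row in client.query(SQL).result():
    # Compare each day's count against the DAU the GA4 dashboard shows.
    print(row.event_date, row.dau)
```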
I just noticed it was not a BigQuery problem after all. But then how could it happen? After Nov 14, the dashboard returned to normal.
Are there any known internal GA4 issues?