
Datastream analysis in BigQuery

Hello, good morning,

 
I was searching for information about Datastream and BigQuery and did not find anything that answers my question.
 
My question is: I created a stream from Datastream to BigQuery, and my source database was 130 GB. But when I looked at the BigQuery Analysis usage, it showed that 29 TB had been processed. Why did this happen? Nothing else in this project uses BigQuery.
 
Thanks,
Solved
1 ACCEPTED SOLUTION

Good morning, my problem was in the staleness configuration: with a low max_staleness, BQ always runs a merge query to apply pending changes before each read, and that query is billed as analysis. Thanks,
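
In case it helps others: a minimal sketch of raising the table's staleness window, assuming the google-cloud-bigquery Python client; the project, dataset, and table names are placeholders for your Datastream destination table:

    from google.cloud import bigquery

    client = bigquery.Client()

    # A larger max_staleness lets BigQuery apply pending Datastream CDC
    # changes in background jobs instead of merging them on every read.
    ddl = """
    ALTER TABLE `my_project.my_dataset.my_table`
    SET OPTIONS (max_staleness = INTERVAL 15 MINUTE);
    """
    client.query(ddl).result()  # runs the DDL and waits for completion

If I remember right, Datastream also lets you set a default staleness limit when you configure the BigQuery destination for the stream.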


4 REPLIES

Hi @marcelocastro 

Welcome back to the Google Cloud Community.

Any configuration change leads to the creation of a new revision. Subsequent revisions will also automatically get this configuration setting unless you make explicit updates to change it.

For Cloud Run services, you can set memory limits using the Google Cloud console, the gcloud command line, or a YAML file when you create a new service or deploy a new revision.

This reference might help you:
https://cloud.google.com/run/docs/configuring/memory-limits
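
For reference, the gcloud variant is a one-liner; the service name and region below are placeholders:

    # Hypothetical service name and region; adjust to your deployment.
    gcloud run services update my-service --memory 1Gi --region us-central1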


Hi Aris, I don't understand how configuration changes on Cloud Run relate to the question:

"Why does Datastream usage cause excessive BigQuery analysis charges?"

Good morning, my problem was in the staleness configuration, because BQ always runs a merge query before applying the inserts. Thanks,

Hi @marcelocastro, thanks for the hint! I was having a similar issue, and BigQuery analysis usage indeed went down once I increased the max_staleness setting.
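
For anyone hitting the same symptom, one way to confirm it is to rank recent jobs by bytes billed; if staleness is the culprit, the CDC merge statements should appear near the top. A sketch, assuming the google-cloud-bigquery Python client and a dataset in the US multi-region (change region-us to match your location):

    from google.cloud import bigquery

    client = bigquery.Client()

    # Rank the last day's jobs by bytes billed; Datastream's merge
    # statements dominate this list when max_staleness is too low.
    sql = """
    SELECT job_id, total_bytes_billed, LEFT(query, 80) AS query_head
    FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
    WHERE creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
    ORDER BY total_bytes_billed DESC
    LIMIT 20;
    """
    for row in client.query(sql):  # iterating waits for and fetches results
        print(row.job_id, row.total_bytes_billed, row.query_head)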