Cloud Tasks and BigQuery

Hey everyone,

What's the ideal setting for queuing tasks with Cloud Tasks when writing a (very large) number of dataframes to BigQuery?

I'm using the defaults, but I still get a "403 Exceeded rate limits: too many table update operations for this table" error, and also a "403 Quota exceeded: Your table exceeded quota for imports or query appends per table" error.


I'm not sure what to do...


Thank you


Hello,

The errors suggest you are hitting the following BigQuery limits: table operations per day (1,500 operations per table), and probably also the maximum rate of table metadata update operations (5 operations per 10 seconds per table).

To overcome these limits, it is recommended to use the BigQuery Storage Write API or the BigQuery streaming API, since streamed rows are not counted against the per-table load-job quota. Alternatively, batch many dataframes into a single load job so you stay under the daily operations limit.
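One way to stay under the 1,500 operations-per-table-per-day limit is to combine many small dataframes into a few large load jobs instead of issuing one job per dataframe. Here is a minimal sketch: the `chunked` helper is plain Python, while the commented usage below it assumes the `google-cloud-bigquery` and `pandas` libraries, valid credentials, and a hypothetical `dataframes` list and table id.

```python
from itertools import islice

def chunked(items, size):
    """Yield successive lists of at most `size` items from `items`."""
    it = iter(items)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

# Hypothetical usage (requires google-cloud-bigquery, pandas, and credentials):
#
# import pandas as pd
# from google.cloud import bigquery
#
# client = bigquery.Client()
# for batch in chunked(dataframes, 50):
#     combined = pd.concat(batch)  # one frame per batch
#     # One load job per batch of 50, instead of 50 jobs against the table
#     client.load_table_from_dataframe(
#         combined, "my-project.my_dataset.my_table"
#     ).result()
```

With batches of 50, writing 10,000 dataframes costs 200 load jobs per day instead of 10,000, well within the per-table limit. The batch size, project, dataset, and table names above are placeholders to adapt to your setup.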

If your use case permits, please also have a look at using BigQuery stored procedures. You can find some examples here.