
Bigquery Load Job google.api_core.exceptions.Forbidden: 403 PUT Error

Hi, I am using the GCP BigQuery Python module to load a pandas DataFrame into a BigQuery table. The job works when I put a limit on the DataFrame, but as the number of rows in the DataFrame increases, the load job raises an exception like this:


job = bigquery_client.load_table_from_dataframe(
File "/Users/candassahin/data/venv/lib/python3.9/site-packages/google/cloud/bigquery/client.py", line 2707, in load_table_from_dataframe
return self.load_table_from_file(
File "/Users/candassahin/data/venv/lib/python3.9/site-packages/google/cloud/bigquery/client.py", line 2460, in load_table_from_file
raise exceptions.from_http_response(exc.response)
google.api_core.exceptions.Forbidden: 403 PUT: The request is missing a valid API key.


I could not see how the amount of data relates to the error "The request is missing a valid API key." Does the load job have a size limit, or is something else going on?
Thank you!


It is possibly hitting a limit on one of your quotas; it may also be a factor if you are still using the BigQuery sandbox (BigQuery without billing enabled), so you might want to check that. Either way, I would suggest creating a case [1] so engineers can look into this and try to replicate it; they can also confirm whether you are hitting a BigQuery quota and request an increase for you.
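In the meantime, if a quota or payload size is the issue, a common workaround is to upload the DataFrame in smaller chunks. A minimal sketch, assuming an already-authenticated `bigquery_client` and a destination `table_id` like `"project.dataset.table"` (both hypothetical names here; the 50,000-row chunk size is an arbitrary assumption to tune for your data):

```python
def chunk_ranges(n_rows, chunk_size):
    """Yield (start, stop) index pairs that cover n_rows rows in chunk_size pieces."""
    for start in range(0, n_rows, chunk_size):
        yield start, min(start + chunk_size, n_rows)

# Load each slice separately instead of one large load job:
# for start, stop in chunk_ranges(len(df), 50_000):
#     job = bigquery_client.load_table_from_dataframe(df.iloc[start:stop], table_id)
#     job.result()  # wait for each chunk to finish before sending the next
```

Waiting on `job.result()` between chunks keeps only one load job in flight, which makes it easier to see exactly which chunk size (if any) triggers the 403.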

[1]https://cloud.google.com/contact