Hi, I am using the GCP BigQuery Python client library to load a pandas DataFrame into a BigQuery table. The job works when I put a limit on the DataFrame, but as the number of rows grows, the load job raises an exception like this:
    job = bigquery_client.load_table_from_dataframe(
      File "/Users/candassahin/data/venv/lib/python3.9/site-packages/google/cloud/bigquery/client.py", line 2707, in load_table_from_dataframe
        return self.load_table_from_file(
      File "/Users/candassahin/data/venv/lib/python3.9/site-packages/google/cloud/bigquery/client.py", line 2460, in load_table_from_file
        raise exceptions.from_http_response(exc.response)
    google.api_core.exceptions.Forbidden: 403 PUT: The request is missing a valid API key.
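For reference, the code is roughly the following (a simplified sketch; the project, table, and file names are placeholders, not my real ones):

    import pandas as pd
    from google.cloud import bigquery

    # Placeholder project/table names for illustration only.
    bigquery_client = bigquery.Client(project="my-project")
    table_id = "my-project.my_dataset.my_table"

    # Placeholder for how the DataFrame is actually built.
    df = pd.read_csv("data.csv")

    # Succeeds with a small DataFrame, fails with the 403 above
    # once the row count grows.
    job = bigquery_client.load_table_from_dataframe(df, table_id)
    job.result()  # wait for the load job to complete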
I can't see how the amount of data could be related to "The request is missing a valid API key." Does the load job have some size limit, or is something else going on?
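To illustrate what I mean by putting a limit on the dataset, a sketch of the experiment (same placeholder names as above; the exact row count where it starts failing is approximate):

    # Loading only a slice of the DataFrame succeeds...
    small_job = bigquery_client.load_table_from_dataframe(df.head(10_000), table_id)
    small_job.result()

    # ...while loading the full DataFrame raises
    # google.api_core.exceptions.Forbidden: 403 PUT: The request is missing a valid API key.
    full_job = bigquery_client.load_table_from_dataframe(df, table_id)
    full_job.result()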
Thank you!