While attempting to export data from a BigQuery table to Cloud Storage, I ran into an error. The message says the operation cannot be performed on a nested schema, field event_params. How can this issue be resolved?
code:
from google.cloud import bigquery

# bucket_name, project, dataset_id, and table_id are defined earlier in the script.
client = bigquery.Client()

destination_uri = "gs://{}/{}".format(bucket_name, "shakespeare.json")
dataset_ref = bigquery.DatasetReference(project, dataset_id)
table_ref = dataset_ref.table(table_id)
print(destination_uri, table_ref)

extract_job = client.extract_table(
    table_ref,
    destination_uri,
    # Location must match that of the source table.
    location="us",
)  # API request
extract_job.result()  # Waits for job to complete.
error message:
Traceback (most recent call last):
File "/home/tangbo508/python-files/test.py", line 25, in <module>
extract_job.result() # Waits for job to complete.
File "/usr/local/lib/python3.9/dist-packages/google/cloud/bigquery/job/base.py", line 922, in result
return super(_AsyncJob, self).result(timeout=timeout, **kwargs)
File "/usr/local/lib/python3.9/dist-packages/google/api_core/future/polling.py", line 261, in result
raise self._exception
google.api_core.exceptions.BadRequest: 400 Operation cannot be performed on a nested schema. Field: event_params
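For context, extract_table writes CSV by default, and CSV cannot represent nested or repeated columns such as event_params, even when the destination URI ends in .json. A minimal sketch of one likely fix, assuming the goal is newline-delimited JSON output and that bucket_name, project, dataset_id, and table_id are defined as in the snippet above:
possible fix (sketch):
from google.cloud import bigquery

client = bigquery.Client()

destination_uri = "gs://{}/{}".format(bucket_name, "shakespeare.json")
table_ref = bigquery.DatasetReference(project, dataset_id).table(table_id)

# Ask the extract job for newline-delimited JSON, which supports
# nested and repeated fields, instead of the default CSV format.
job_config = bigquery.job.ExtractJobConfig(
    destination_format=bigquery.DestinationFormat.NEWLINE_DELIMITED_JSON
)

extract_job = client.extract_table(
    table_ref,
    destination_uri,
    job_config=job_config,
    # Location must match that of the source table.
    location="us",
)
extract_job.result()  # Waits for the job to complete.
Avro or Parquet destination formats also support nested schemas, and another option would be to flatten event_params with a query (e.g. UNNEST) before exporting to CSV.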