The error message you're encountering indicates that you're trying to export a BigQuery table to Cloud Storage, but the table has a nested, repeated field called "event_params", which is causing the issue. BigQuery extract jobs can't write nested or repeated fields to CSV (the default export format), which is why the export fails.
To resolve this, you can flatten the nested schema so that you're working with a flat structure. In standard SQL you do this with UNNEST in a query before exporting the data (the older FLATTEN function exists only in legacy SQL). Here's an example of how to modify your code to run the flattening as a query:
from google.cloud import bigquery

client = bigquery.Client()

project_id = "your_project_id"
bucket_name = "your_bucket_name"
dataset_id = "your_dataset_id"
table_id = "your_table_id"
destination_uri = f"gs://{bucket_name}/shakespeare.json"

# Build a query to flatten the nested schema (see the UNNEST sketch below)
query = f"""
SELECT * FROM `{project_id}.{dataset_id}.{table_id}`
"""
job_config = bigquery.QueryJobConfig(destination=destination_uri, write_disposition="WRITE_TRUNCATE")
query_job = client.query(query, location="US", job_config=job_config)
query_job.result()  # Wait for the query job to complete
print(f"Table data exported to {destination_uri}")
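The sample query above simply selects every column; to actually flatten the repeated event_params field you would normally UNNEST it. Assuming the table follows the GA4/Firebase export layout (event_params as an array of key/value structs), the flattening query might look roughly like this, with illustrative column names:

# Sketch of a flattening query; column names are illustrative and assume
# event_params is an ARRAY of STRUCT<key, value> as in the GA4/Firebase export
query = f"""
SELECT
  t.event_name,
  t.event_timestamp,
  param.key AS param_key,
  param.value.string_value AS param_string_value
FROM `{project_id}.{dataset_id}.{table_id}` AS t,
UNNEST(t.event_params) AS param
"""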
By using a query that flattens the nested schema, you should be able to export the data to Cloud Storage without encountering the "nested schema" error.
That approach won't work. According to the QueryJobConfig documentation, the destination should be set to a Table, a TableReference, or a fully qualified table ID, not a gs:// URI.
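For reference, a minimal sketch of that flow would be to write the query result into an intermediate table and then run a separate extract job to Cloud Storage; the intermediate table name and the JSON export format below are assumptions, not part of the original code:

from google.cloud import bigquery

client = bigquery.Client()

query = "SELECT ..."  # the flattening query from above
flat_table_id = "your_project_id.your_dataset_id.your_table_id_flat"  # hypothetical intermediate table
destination_uri = "gs://your_bucket_name/shakespeare.json"

# 1. Run the query into a destination table (Table, TableReference, or fully qualified table ID)
job_config = bigquery.QueryJobConfig(destination=flat_table_id, write_disposition="WRITE_TRUNCATE")
client.query(query, location="US", job_config=job_config).result()

# 2. Extract the now-flat table to Cloud Storage as newline-delimited JSON
extract_config = bigquery.ExtractJobConfig(
    destination_format=bigquery.DestinationFormat.NEWLINE_DELIMITED_JSON
)
client.extract_table(flat_table_id, destination_uri, location="US", job_config=extract_config).result()

print(f"Table data exported to {destination_uri}")

Dropping the intermediate table afterwards (or giving it an expiration) avoids leaving the flattened copy around.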