I am trying to retrieve DDL objects from BigQuery, but the script fails on line 18 at ddl_set = query_job.result(). I have assigned the necessary roles (listed below) to the service account, yet I receive the following error:
google.api_core.exceptions.Forbidden: 403 Access Denied: Table bq:{my_dataset}.INFORMATION_SCHEMA.TABLES: User does not have permission to query table bq:{my_dataset}.INFORMATION_SCHEMA.TABLES, or perhaps it does not exist in location US.
import snowflake.connector
from google.cloud import bigquery
from google.oauth2 import service_account

def get_execute_ddls():
    creds = service_account.Credentials.from_service_account_file("/Users/stekavade/Desktop/bq_creds.json")
    client = bigquery.Client(credentials=creds)
    query = """
    select schema_name from INFORMATION_SCHEMA.SCHEMATA
    """
    query_job = client.query(query)
    rows = query_job.result()
    for row in rows:
        schema = row.schema_name
        print("Gathering ddl for tables in schema {}".format(schema))
        query = """
        SELECT table_name,replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(ddl,'`',''),'INT64','INT'),'FLOAT64','FLOAT'),'BOOL','BOOLEAN'),'STRUCT','VARIANT'),'PARTITION BY','CLUSTER BY ('),';',');'),'CREATE TABLE ','CREATE TABLE if not exists '), "table INT,", '"table" INT,'),'_"table" INT,','_table INT,'),'ARRAY<STRING>','ARRAY'),'from','"from"'),'_"from"','_from'),'"from"_','from_'),'DATE(_PARTITIONTIME)','date(loaded_at)'),' OPTIONS(',', //'),'));',');'),'_at);','_at));'),'start ','"start" '),'_"start"','_start'),'order ','"order" '),'<',', //'),'_"order"','_order') as ddl
        FROM `bq`.{}.INFORMATION_SCHEMA.TABLES where table_type='BASE TABLE'
        """
        ddl_query = query.format(schema)
        query_job = client.query(ddl_query)
        ddl_set = query_job.result()  # line 18: this is the call that raises 403 Access Denied
        for row in ddl_set:
            table_name = row.table_name
            ddl = row.ddl
            print("Running ddl for table {} in Snowflake".format(table_name))
            use_schema = "use schema {}.{}".format("bq_db", schema)
            with snowflake.connector.connect(
                user='<snowflake_username>',
                password='<snowflake_password>',
                account='<snowflake_prod>',
                warehouse='snwoflake_data_ingest',
                database='bq_db',
                role='bq_ingest_rl'
            ) as conn:
                conn.cursor().execute(use_schema)
                conn.cursor().execute(ddl)
                print("Table {} created in bq_db.{} schema".format(table_name, schema))
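The error points at the table reference in the second query, which qualifies the view with a literal `bq` prefix rather than a real project ID. A minimal sketch of building that per-schema query with an explicit project ID instead (the name my-project is a placeholder assumption, not the poster's actual project):

```python
# Sketch only: "my-project" is a hypothetical project ID; substitute your own.
DDL_QUERY_TEMPLATE = (
    "SELECT table_name, ddl "
    "FROM `{project}`.{schema}.INFORMATION_SCHEMA.TABLES "
    "WHERE table_type = 'BASE TABLE'"
)

def build_ddl_query(project, schema):
    """Fill in the project ID and dataset so BigQuery can resolve the view."""
    return DDL_QUERY_TEMPLATE.format(project=project, schema=schema)

print(build_ddl_query("my-project", "my_dataset"))
# SELECT table_name, ddl FROM `my-project`.my_dataset.INFORMATION_SCHEMA.TABLES WHERE table_type = 'BASE TABLE'
```

With the project spelled out, a "does not exist" error narrows down to permissions or location rather than name resolution.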
What happens if you run the failing SQL statement in the BigQuery Studio console? Can we pare the puzzle down to the exact statement that is failing and what it looks like? Maybe try running it using the "bq" command-line tool with the credentials file. Let's remove some of the unknowns, like the Python code.
Hi @dr41,
Welcome and thank you for reaching out to our community.
I understand you are having trouble accessing your INFORMATION_SCHEMA. As the error message suggests, you may need to look into the specific permissions (beyond the roles themselves) and the location required to access INFORMATION_SCHEMA.
To query the INFORMATION_SCHEMA.TABLES view, you need the following Identity and Access Management (IAM) permissions:
bigquery.tables.get
bigquery.tables.list
bigquery.routines.get
bigquery.routines.list
Note: You must specify the location to query the INFORMATION_SCHEMA views. Querying the INFORMATION_SCHEMA views fails with the following error if the location of the query execution doesn't match the location of the dataset or regional qualifier used:
Table myproject: region-us.INFORMATION_SCHEMA.[VIEW] not found in location US
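One way to avoid the location mismatch is to use a region qualifier in the view reference itself. A small sketch, assuming the dataset lives in the US multi-region (adjust the region to wherever your data actually is):

```python
def region_qualified_view(project, region, view):
    """Build a region-qualified INFORMATION_SCHEMA reference, e.g.
    `myproject`.`region-us`.INFORMATION_SCHEMA.TABLES, so the query
    runs in the dataset's location."""
    return "`{}`.`region-{}`.INFORMATION_SCHEMA.{}".format(project, region, view)

print(region_qualified_view("myproject", "us", "TABLES"))
# `myproject`.`region-us`.INFORMATION_SCHEMA.TABLES
```

Alternatively, the Python client's client.query() accepts a location argument (e.g. location="US") that pins the query job to the right location without changing the SQL.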
I hope this is of assistance to you.
@kolban The code runs without issue after removing the "bq" qualifier from the query on line 14.