I want to run a job through a PySpark BigQuery stored procedure, but the database I want to access is only reachable through a specific subnet. Is there any way to pass a subnet to my stored procedure?
Hi @geekyGuy,
Welcome to Google Cloud Community!
You cannot directly pass a subnet through a BigQuery stored procedure. However, you can configure a Virtual Private Cloud (VPC) to establish network connectivity to the database: define the subnet within the VPC and make sure it is reachable from your PySpark environment. You also need to ensure that the appropriate IAM roles and permissions are assigned to the service account used by the PySpark job, so it is authorized to access BigQuery data and any other resources it needs in your project.
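For context, a stored procedure for Apache Spark runs through a BigQuery connection, so networking and permissions are configured around the procedure (on the connection's service account and your VPC) rather than passed into the procedure itself. Below is a minimal sketch of how such a procedure references its connection; every identifier in it (my_project, my_dataset, us, my-spark-conn, gs://my-bucket/main.py) is a placeholder you would replace with your own values:

```sql
-- Minimal sketch: a PySpark stored procedure bound to a Spark connection.
-- All identifiers (project, dataset, connection, bucket) are placeholders.
CREATE OR REPLACE PROCEDURE `my_project.my_dataset.run_spark_job`()
WITH CONNECTION `my_project.us.my-spark-conn`   -- Spark connection; its service account runs the job
OPTIONS (
  engine = 'SPARK',
  runtime_version = '1.1',
  main_file_uri = 'gs://my-bucket/main.py'      -- your PySpark entry point in Cloud Storage
)
LANGUAGE PYTHON;
```

The connection's service account (shown on the connection details page in BigQuery) is the identity the Spark job runs as, so that is the account to grant IAM roles to (for example, roles/bigquery.dataEditor for BigQuery access) and, if applicable, the one to allow in your VPC firewall rules.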
I hope this information is helpful.