
Assigning a Dataproc cluster to a PySpark BigQuery stored procedure

Is it possible to assign a Dataproc cluster (standard or serverless) to the Spark connection in BigQuery? I want to change the network details of the cluster, which I am not able to do when using a PySpark stored procedure.
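For context, the procedure is declared roughly as below. The Spark connection is the only place the execution environment is referenced, and there is no field for a cluster or subnet (the dataset, connection, and table names here are placeholders):

-- Minimal sketch of a PySpark stored procedure in BigQuery;
-- all names below are placeholders.
CREATE OR REPLACE PROCEDURE my_dataset.my_pyspark_proc()
WITH CONNECTION `my-project.us.my-spark-connection`
OPTIONS (engine='SPARK', runtime_version='1.1')
LANGUAGE PYTHON AS R"""
from pyspark.sql import SparkSession

# The Spark session comes from the managed environment; no cluster
# or network settings can be passed in here.
spark = SparkSession.builder.appName("demo").getOrCreate()
df = spark.read.format("bigquery").option("table", "my_dataset.my_table").load()
df.show()
"""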


Hi @geekyGuy,

Welcome to Google Cloud Community!

Assigning your own Dataproc cluster to the Spark connection, or customizing the cluster configuration such as network settings, is currently not supported when executing PySpark stored procedures in BigQuery; the Spark environment is fully managed for you.

Alternatively, you can submit a feature request for this through the Issue Tracker. While I can't provide a timeline for when it might be implemented, I recommend keeping an eye on the tracker and checking the release notes and documentation for the latest updates.
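In the meantime, if controlling the network is the immediate requirement, one workaround is to run the same PySpark code directly as a Dataproc Serverless batch, where the subnet can be set explicitly at submission time. A minimal sketch, assuming a hypothetical script location and subnet name:

# Submit the PySpark script as a Dataproc Serverless batch on a
# specific subnet (bucket, project, and subnet names are placeholders).
gcloud dataproc batches submit pyspark gs://my-bucket/my_job.py \
    --project=my-project \
    --region=us-central1 \
    --subnet=my-subnet

Note that the job then runs outside of BigQuery, so any results would need to be written back with the Spark BigQuery connector rather than returned by the stored procedure.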

I hope the above information is helpful.