Is it possible to assign a Dataproc cluster (server-based or serverless) to the Spark connection in BigQuery? I want to change the network details of the cluster, which I am not able to change when using a PySpark stored procedure.
Hi @geekyGuy,
Welcome to Google Cloud Community!
Assigning your own Dataproc cluster (server-based or serverless) to the Spark connection is currently not supported: PySpark stored procedures in BigQuery run on Spark compute that BigQuery manages for you, so cluster-level configuration such as network settings cannot be customized when executing the procedure.
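For reference, the procedure's WITH CONNECTION clause only selects which Spark connection resource the procedure authenticates and bills through, and the OPTIONS clause exposes settings like the engine and runtime version, not the network of the managed Spark environment. Here is a minimal sketch of the DDL, using placeholder names (my-project, my_dataset, my-spark-connection):

-- Minimal Spark stored procedure DDL (placeholder project/dataset/connection names).
-- WITH CONNECTION picks the Spark connection resource; OPTIONS covers the engine
-- and runtime version, but there is no option for network settings or for
-- pointing the procedure at an existing Dataproc cluster.
CREATE OR REPLACE PROCEDURE `my-project.my_dataset.my_pyspark_proc`()
WITH CONNECTION `my-project.us.my-spark-connection`
OPTIONS (engine = 'SPARK', runtime_version = '1.1')
LANGUAGE PYTHON AS R"""
from pyspark.sql import SparkSession

# The Spark session here runs on BigQuery-managed compute.
spark = SparkSession.builder.appName("demo").getOrCreate()
spark.range(10).show()
"""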
Alternatively, you can submit a feature request for this on the public issue tracker. While I can’t provide a timeline for when this might be implemented, I recommend keeping an eye on the tracker and checking the release notes and documentation for the latest updates.
I hope the above information is helpful.