
How to load BigQuery stored procedure result to a spark dataframe

Is there any way in PySpark to call BigQuery stored procedures and store their results in a DataFrame?


Hi @HarshaVardhan1

Welcome, and we appreciate your interest in learning more about Google Cloud Platform services.

You may explore the article Work with stored procedures for Apache Spark, which may guide you to a resolution for your use case.
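As a starting point, one common pattern (a sketch, not the only approach) is to CALL the stored procedure with the BigQuery Python client and then read the materialized results with the spark-bigquery connector. This assumes the procedure writes its output to a regular BigQuery table; every project, dataset, and table name below is a hypothetical placeholder:

```python
# A minimal sketch, assuming the procedure `my_dataset.my_proc` writes its
# output to the regular table `my_dataset.proc_results`.

def call_statement(project: str, dataset: str, proc: str) -> str:
    # Build the SQL statement that invokes a BigQuery stored procedure.
    return f"CALL `{project}.{dataset}.{proc}`()"

def load_proc_results(spark, project: str, dataset: str, table: str):
    # Read the table the procedure materialized, via the spark-bigquery connector.
    return (
        spark.read.format("bigquery")
        .option("table", f"{project}.{dataset}.{table}")
        .load()
    )

# Usage (requires GCP credentials, the google-cloud-bigquery package, and the
# spark-bigquery connector on the Spark classpath):
#
#   from google.cloud import bigquery
#   from pyspark.sql import SparkSession
#
#   client = bigquery.Client(project="my-project")
#   client.query(call_statement("my-project", "my_dataset", "my_proc")).result()
#
#   spark = SparkSession.builder.getOrCreate()
#   df = load_proc_results(spark, "my-project", "my_dataset", "proc_results")
```

Keeping the CALL on the BigQuery client side and the read on the Spark side avoids the connector's limitation that its SQL pushdown expects a SELECT rather than a script statement.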

Hope this helps. 
