How do you orchestrate and schedule Dataproc serverless batch jobs?

RC1
Bronze 4

I have multiple PySpark jobs with dependencies between them, and I want to orchestrate and schedule them. How can this be done with Dataproc serverless batch jobs?

1 REPLY

Currently there is no way to schedule Dataproc serverless (batch) PySpark jobs directly. What you can do instead is create a Dataproc workflow template and trigger it from Cloud Scheduler; the documentation includes an example that runs a Spark Pi job this way (see "Use Cloud Scheduler to run a workflow").
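As a rough sketch of that approach: a workflow template lets you chain PySpark steps with dependencies (via `--start-after`), and Cloud Scheduler can instantiate the template on a cron schedule through the Dataproc REST API. All project, bucket, file, and service-account names below are placeholder assumptions; note that workflow templates run on a (managed, ephemeral) cluster rather than on serverless batches.

```shell
# Sketch only: every name here is a placeholder, not from the original thread.
PROJECT=my-project
REGION=us-central1
TEMPLATE=pyspark-pipeline

# 1. Create a workflow template backed by a managed (ephemeral) cluster.
gcloud dataproc workflow-templates create $TEMPLATE --region=$REGION
gcloud dataproc workflow-templates set-managed-cluster $TEMPLATE \
  --region=$REGION \
  --cluster-name=pyspark-pipeline-cluster \
  --single-node

# 2. Add PySpark steps; --start-after expresses the dependency graph.
gcloud dataproc workflow-templates add-job pyspark \
  gs://my-bucket/jobs/extract.py \
  --step-id=extract --workflow-template=$TEMPLATE --region=$REGION
gcloud dataproc workflow-templates add-job pyspark \
  gs://my-bucket/jobs/transform.py \
  --step-id=transform --start-after=extract \
  --workflow-template=$TEMPLATE --region=$REGION

# 3. Schedule it: Cloud Scheduler POSTs to the template's instantiate endpoint.
gcloud scheduler jobs create http run-pyspark-pipeline \
  --location=$REGION \
  --schedule="0 2 * * *" \
  --uri="https://dataproc.googleapis.com/v1/projects/${PROJECT}/regions/${REGION}/workflowTemplates/${TEMPLATE}:instantiate?alt=json" \
  --http-method=POST \
  --oauth-service-account-email=scheduler-sa@${PROJECT}.iam.gserviceaccount.com
```

The service account passed to Cloud Scheduler needs permission to instantiate Dataproc workflow templates (for example the `roles/dataproc.editor` role).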

You can also file a feature request for native scheduling of serverless batches in the Issue Tracker.