I have multiple PySpark jobs with dependencies between them, and I want to orchestrate and schedule them. Is it possible to do this with Dataproc Serverless batch jobs?
Currently there is no built-in way to schedule Dataproc Serverless (batch) PySpark jobs. There is, however, a way to schedule jobs by creating a Dataproc workflow template and triggering it from Cloud Scheduler, as in the example that runs a SparkPi job; see this document.
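If it helps, here is a minimal sketch of how a workflow template with dependent PySpark steps could be defined with the google-cloud-dataproc Python client. The project ID, region, bucket paths, file names, and cluster settings below are placeholders for illustration, not values from your setup:

```python
# Sketch: a Dataproc workflow template whose PySpark steps run in dependency
# order. All names/paths are placeholders.
from google.cloud import dataproc_v1

project_id = "my-project"   # placeholder
region = "us-central1"      # placeholder

client = dataproc_v1.WorkflowTemplateServiceClient(
    client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
)
parent = f"projects/{project_id}/regions/{region}"

template = {
    "id": "pyspark-pipeline",
    "placement": {
        # Workflow templates run on a managed (ephemeral) cluster or an
        # existing cluster selected by label -- not on Serverless batches.
        "managed_cluster": {
            "cluster_name": "pyspark-pipeline-cluster",
            "config": {
                "gce_cluster_config": {"zone_uri": ""},  # auto zone placement
            },
        }
    },
    "jobs": [
        {
            "step_id": "extract",
            "pyspark_job": {"main_python_file_uri": "gs://my-bucket/extract.py"},
        },
        {
            "step_id": "transform",
            "pyspark_job": {"main_python_file_uri": "gs://my-bucket/transform.py"},
            # This step only starts after "extract" succeeds.
            "prerequisite_step_ids": ["extract"],
        },
    ],
}

client.create_workflow_template(parent=parent, template=template)
```

Cloud Scheduler can then call the template's workflowTemplates.instantiate REST endpoint on a cron schedule, which is essentially what the linked document does with its SparkPi example.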
You can also file a feature request at the Issue Tracker.