This is my equivalent command line for Dataproc Serverless:
gcloud dataproc batches submit --project xyz2022 --region asia-south1 pyspark \
  --batch dummy-1 gs://bkt-1/abc.py \
  --version 2.0.40 \
  --container-image "asia-south1-docker.pkg.dev/xyz2022/v1:25" \
  --jars gs://bkt-1/jars/abc42.2.24.jar,gs://bkt-1/jars/abc2.13-0.32.2.jar \
  --py-files gs://bkt/demo.zip,gs://bkt-1/Config.properties \
  --subnet composer-subnet \
  --tags terraform-ssh \
  --service-account batch-job-sa@rnd2022.iam.gserviceaccount.com \
  --history-server-cluster projects/xyz2022/regions/asia-south1/clusters/cluster-4567 \
  --properties spark.app.name=projects/xyz2022/locations/asia-south1/batches/batch-21fcd,spark.driver.cores=4,spark.driver.memory=9600m,spark.dynamicAllocation.executorAllocationRatio=0.3,spark.executor.cores=4,spark.executor.instances=2,spark.executor.memory=9600m,spark.eventLog.dir=gs://spark-dataproc-jobs/history/ \
  --labels goog-dataproc-batch-id=dummy-1,goog-dataproc-batch-uuid=c217ec74-2a4m-40bc-bc18-b6b823a3341c,goog-dataproc-location=asia-south1 \
  -- dummy-1
I want to create the same job from Cloud Composer using the DataprocCreateBatchOperator. Could someone provide an Airflow DAG that creates this batch job?
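For reference, below is my rough sketch of what I think the DAG should look like, assuming the apache-airflow-providers-google package is installed and that the fields in the batch dict map to the gcloud flags the way I expect; dag_id, start_date, and task_id are placeholder values, and I am not sure this mapping is correct:

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.dataproc import DataprocCreateBatchOperator

    PROJECT_ID = "xyz2022"
    REGION = "asia-south1"
    BATCH_ID = "dummy-1"

    # Batch body mirroring the gcloud flags above (Dataproc Batch resource as a dict).
    BATCH_CONFIG = {
        "pyspark_batch": {
            "main_python_file_uri": "gs://bkt-1/abc.py",
            "jar_file_uris": [
                "gs://bkt-1/jars/abc42.2.24.jar",
                "gs://bkt-1/jars/abc2.13-0.32.2.jar",
            ],
            # Mirrors --py-files; a non-Python file like Config.properties
            # might belong under "file_uris" instead.
            "python_file_uris": [
                "gs://bkt/demo.zip",
                "gs://bkt-1/Config.properties",
            ],
            # Mirrors the trailing "-- dummy-1" job argument.
            "args": ["dummy-1"],
        },
        "runtime_config": {
            "version": "2.0.40",
            "container_image": "asia-south1-docker.pkg.dev/xyz2022/v1:25",
            # spark.app.name and the goog-dataproc-* labels from the original
            # command look service-generated, so I have left them out here.
            "properties": {
                "spark.driver.cores": "4",
                "spark.driver.memory": "9600m",
                "spark.dynamicAllocation.executorAllocationRatio": "0.3",
                "spark.executor.cores": "4",
                "spark.executor.instances": "2",
                "spark.executor.memory": "9600m",
                "spark.eventLog.dir": "gs://spark-dataproc-jobs/history/",
            },
        },
        "environment_config": {
            "execution_config": {
                "subnetwork_uri": "composer-subnet",
                "network_tags": ["terraform-ssh"],
                "service_account": "batch-job-sa@rnd2022.iam.gserviceaccount.com",
            },
            "peripherals_config": {
                "spark_history_server_config": {
                    "dataproc_cluster": "projects/xyz2022/regions/asia-south1/clusters/cluster-4567",
                },
            },
        },
    }

    with DAG(
        dag_id="dataproc_serverless_batch",   # placeholder
        start_date=datetime(2023, 1, 1),      # placeholder
        schedule_interval=None,
        catchup=False,
    ) as dag:
        create_batch = DataprocCreateBatchOperator(
            task_id="create_batch",           # placeholder
            project_id=PROJECT_ID,
            region=REGION,
            batch_id=BATCH_ID,
            batch=BATCH_CONFIG,
        )

Is this the right way to translate each flag, or do some of these settings (for example the history server cluster or the subnet) need to be configured differently in the operator?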