Hi everyone, I'm working on a POC that executes Python scripts stored in Cloud Storage. I first want to test in Cloud Shell and then create a Dataflow job for the same.
I tried several commands, shown below, but no luck. Can anyone help me correct them?
--Cloud shell
$ python -m gs://gluemigration/scripts/Pyspark.py --region us-central1 --runner DataflowRunner --project temp-1 --usePublicIps=false --no_use_public_ips
--Dataflow job
$ gcloud dataflow jobs run run_custom_scripts --gcs-location=gs://gluemigration/scripts/Pyspark.py --disable-public-ips --max-workers=1 --region=us-central1 --runner=DataflowRunner --project=temp-1
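For context, my rough understanding is that `python` needs a local file (and `-m` expects a module name, not a `gs://` URI), so in Cloud Shell I would have to copy the script down first, something like the sketch below. The `temp_location` bucket path is just a placeholder I made up, and I haven't verified that `Pyspark.py` accepts these pipeline options:

```shell
# Copy the pipeline script from Cloud Storage to the local Cloud Shell disk,
# since `python` cannot read a gs:// path directly.
gsutil cp gs://gluemigration/scripts/Pyspark.py .

# Run it as an Apache Beam pipeline targeting Dataflow.
# temp_location is a placeholder and would need to be a bucket I control.
python Pyspark.py \
  --runner=DataflowRunner \
  --project=temp-1 \
  --region=us-central1 \
  --temp_location=gs://gluemigration/temp \
  --no_use_public_ips
```

Is that the right general shape, or is there a way to point Dataflow at the script in the bucket directly?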
I know I must be missing something small somewhere. Any help would be greatly appreciated.
Thank you.