Regarding using templates for Cloud Scheduler

Hello, we are working on a data pipeline in Google Dataflow. My complete code lives in Cloud Shell, since that is where I built the pipeline for development purposes. The issue is that the pipeline has multiple external dependencies. What I want to do now is schedule this Cloud Shell script by creating a template out of it. Since my project has multiple dependencies, I suspect I must use a container-based (Flex) template, but I couldn't find a proper tutorial or documentation on this. I tried the approach I found in the GCP documentation, and I got the template built, but the job keeps failing. I need help resolving this issue. Here is the error:

docker: Error response from daemon: failed to create shim task: OCI runtime create failed: runc create failed: unable to start container process: exec: "/opt/google/dataflow/python_template_launcher": stat /opt/google/dataflow/python_template_launcher: no such file or directory: unknown
cloudservice.service: Main process exited, code=exited, status=127/n/a
cloudservice.service: Failed with result 'exit-code'


Hello @isira123,

Welcome to the Google Cloud Community!

The error message "stat /opt/google/dataflow/python_template_launcher: no such file or directory" indicates that the Docker container cannot find the Flex Template launcher binary that starts your Dataflow pipeline.

To resolve this, make sure you're using one of the official base images provided by Google for Flex Templates, as detailed in the Flex Templates documentation.
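For illustration, here is a minimal Dockerfile sketch along those lines. The base image tag, the working directory, and the file names (beam_job.py, requirements.txt) are assumptions; substitute your own:

# A minimal sketch, assuming a Python 3.11 pipeline; pick the base image
# tag that matches your Python version.
FROM gcr.io/dataflow-templates-base/python311-template-launcher-base

# Hypothetical working directory; use wherever your code lives.
ARG WORKDIR=/template
WORKDIR ${WORKDIR}
COPY beam_job.py requirements.txt ${WORKDIR}/

# Tell the launcher where the pipeline entry point and dependencies are.
ENV FLEX_TEMPLATE_PYTHON_PY_FILE="${WORKDIR}/beam_job.py"
ENV FLEX_TEMPLATE_PYTHON_REQUIREMENTS_FILE="${WORKDIR}/requirements.txt"

# Pre-install dependencies into the image so startup doesn't fail on them.
RUN pip install --no-cache-dir -r requirements.txt

# No ENTRYPOINT line: the base image already provides
# /opt/google/dataflow/python_template_launcher

Note there is no custom ENTRYPOINT, which is exactly what causes your error when the base image (and therefore the launcher binary) is missing.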

Additionally, I found a discussion on Stack Overflow about the same issue you're encountering.

Here are some recommended solutions in the post:

1. The default ENTRYPOINT is ["/opt/google/dataflow/python_template_launcher"] and does not need to be explicitly set in your Dockerfile, provided you are using one of the base images mentioned above.

2. Make sure your Dockerfile sets the following environment variables:

ENV FLEX_TEMPLATE_PYTHON_PY_FILE="${path}/beam_job.py"
ENV FLEX_TEMPLATE_PYTHON_REQUIREMENTS_FILE="${path}/requirements.txt"

3. Consider adding the following environment variables to your Dockerfile and leaving them empty as a precaution:

ENV FLEX_TEMPLATE_PYTHON_PY_OPTIONS=""
ENV FLEX_TEMPLATE_PYTHON_EXTRA_PACKAGES=""
ENV FLEX_TEMPLATE_PYTHON_SETUP_FILE=""

4. Avoid setting save_main_session=True in your PipelineOptions unless it's explicitly required.

5. Ensure that pipeline.run() is called in your code (see the sketch after this list).
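To illustrate points 4 and 5, here is a minimal sketch of what the pipeline file might look like; the file contents and transform steps are placeholders, not your actual pipeline:

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def run():
    # save_main_session is deliberately left at its default (False).
    options = PipelineOptions()
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "Create" >> beam.Create(["hello", "world"])
            | "Print" >> beam.Map(print)
        )
        # Exiting the "with" block calls pipeline.run() and waits for the
        # result, so the launcher sees the job actually get submitted.

if __name__ == "__main__":
    run()

Using the "with beam.Pipeline(...)" context manager is an easy way to satisfy point 5, since it calls run() for you; calling pipeline.run() explicitly works just as well.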