
Install Artifact Registry Python package from Dockerfile with Cloud Build

I have a Python package located in my Artifact Registry repository.

My Dataflow Flex Template is packaged within a Docker image with the following command:

gcloud builds submit --tag $CONTAINER_IMAGE .

Since developers are constantly changing the source code of the pipeline, this command is often run from their computers to rebuild the image.
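
For context, a typical rebuild from a developer's machine looks roughly like this (the image path below is just a made-up example, not our real repository):

# hypothetical Artifact Registry image path used for illustration only
export CONTAINER_IMAGE="europe-west9-docker.pkg.dev/sample-project/docker-repo/dataflow-flex-template:latest"
gcloud builds submit --tag $CONTAINER_IMAGE .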

Here is my Dockerfile:

FROM gcr.io/dataflow-templates-base/python311-template-launcher-base

ARG WORKDIR=/template
RUN mkdir -p ${WORKDIR}
WORKDIR ${WORKDIR}

ENV PYTHONPATH=${WORKDIR}
ENV FLEX_TEMPLATE_PYTHON_SETUP_FILE="${WORKDIR}/setup.py"
ENV FLEX_TEMPLATE_PYTHON_PY_FILE="${WORKDIR}/main.py"

# Install the keyring backend so pip can authenticate to Artifact Registry
RUN pip install --no-cache-dir -U pip && \
    pip install --no-cache-dir -U keyrings.google-artifactregistry-auth

# Install my private package from the Artifact Registry Python repository
RUN pip install --no-cache-dir -U --index-url=https://europe-west9-python.pkg.dev/sample-project/python-repo/ mypackage

COPY . ${WORKDIR}/
    
ENTRYPOINT ["/opt/google/dataflow/python_template_launcher"]

I get the following error:

ERROR: No matching distribution found for mypackage
error building image: error building stage: failed to execute command: waiting for process to exit: exit status 1

I suspect the Cloud Build process doesn't have the required access rights, but I'm a bit confused about how to grant them from within a Dockerfile.

The post I found (URL Removed by Staff) mentioned using a Service Account key file read by the Docker process, but I would like to avoid that. Could I use the Service Account impersonation feature instead?
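
To make it concrete, here is roughly what I understand that key-file approach to look like inside the Dockerfile. This is only a sketch: sa-key.json is a hypothetical exported key for a service account with read access to the Python repository, and shipping such a key in the build context is exactly what I want to avoid.

# Sketch of the key-file approach I would like to avoid.
# The keyring backend reads Application Default Credentials, so pointing
# GOOGLE_APPLICATION_CREDENTIALS at the key file makes pip authenticate with it.
COPY sa-key.json /tmp/sa-key.json
ENV GOOGLE_APPLICATION_CREDENTIALS=/tmp/sa-key.json
RUN pip install --no-cache-dir -U --index-url=https://europe-west9-python.pkg.dev/sample-project/python-repo/ mypackage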
