
Custom container in Vertex Workbench

I noticed that you can use a custom container from Container Registry when creating a user-managed notebook, but I couldn't find any documentation on the required configuration/Dockerfile specs for it to work with JupyterLab in a similar fashion to launching a regular Workbench environment (e.g. Python 3). Should I open the default JupyterLab port? Anything else?


  1. Create the initial Dockerfile and run modification commands.
  To start, create a container using one of the available Deep Learning Containers image types. Then use conda, pip, or Jupyter commands to modify the container image for your needs; this is where you add any extra packages when you create your custom container.

# Start from a Deep Learning Containers image and add or upgrade packages with pip.
FROM gcr.io/deeplearning-platform-release/tf-gpu:latest
RUN pip install --upgrade tensorflow

   2. Build and push the container image.
   Build the container image, and then push it somewhere that is accessible to your Compute Engine service account.

# Build the image and push it to Container Registry in your project.
export PROJECT=$(gcloud config list project --format "value(core.project)")
docker build . -f Dockerfile.example -t "gcr.io/${PROJECT}/tf-custom:latest"
docker push "gcr.io/${PROJECT}/tf-custom:latest"
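If the push is rejected with an authentication error, Docker may not be using your gcloud credentials yet. This standard one-time step (not specific to Workbench) registers gcloud as a Docker credential helper:

# One-time setup so docker can push to gcr.io with your gcloud credentials.
gcloud auth configure-docker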

  3. Specify the container when launching: select Custom container and point it at the image you pushed.
    [Screenshot: customcontainer.png — choosing the custom container in the Workbench UI]
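If you prefer the CLI to the console, something like the following should also work. This is a sketch: the instance name and zone are placeholders, and flag availability can vary with your gcloud version.

# Sketch: create a user-managed notebook instance from the custom container.
# "example-instance" and the zone are placeholders; adjust to your project.
gcloud notebooks instances create example-instance \
  --container-repository="gcr.io/${PROJECT}/tf-custom" \
  --container-tag=latest \
  --machine-type=n1-standard-4 \
  --location=us-central1-a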


Thanks. It'd be great if there were some clear instructions for creating an image for a very common need (Python > 3.7), since the current prebuilt containers are all on 3.7 (including the image in your example).
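For reference, one possible approach (a sketch, not an official recipe) is to start from a Deep Learning Containers base, which already ships the JupyterLab setup Workbench expects, and swap in a newer Python with conda. The base image tag and Python version below are example choices, and package compatibility after the upgrade isn't guaranteed:

# Sketch: derive from a Deep Learning Containers base and upgrade Python via conda.
# base-cpu:latest and python=3.9 are example choices; pin what you actually need.
FROM gcr.io/deeplearning-platform-release/base-cpu:latest

# Replace the default Python in the base conda environment.
RUN conda install -y python=3.9 && conda clean -a -y

# Reinstall JupyterLab in case the Python upgrade displaced it.
RUN pip install --no-cache-dir jupyterlab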