Hi, I'm new to the forum. I'm starting to dive into Google Cloud, and when launching a simple Batch job with a Docker container, I'm having trouble with shared memory. When I work locally, I can set it as an argument to `docker run`, but with this script it doesn't seem to work.
Thanks @nomi. I was using the Image streaming option, which seems to be incompatible with container options beyond the ones listed in the documentation (`--shm-size` is one of them). Thanks for your time and your answer 🙂
Hello. I have tested setting the shared memory size to 512m via the container options: I put something like "--shm-size 512m" in the options field, and I can verify it through the `docker` command on the host. Do you have an example job spec showing how you set the shm-size?
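To illustrate, here is a minimal sketch of a Cloud Batch job spec that passes `--shm-size` through the container `options` field, as described above. The project and image names are placeholders, not from the original thread:

```json
{
  "taskGroups": [
    {
      "taskSpec": {
        "runnables": [
          {
            "container": {
              "imageUri": "gcr.io/my-project/my-image",
              "options": "--shm-size 512m"
            }
          }
        ]
      }
    }
  ]
}
```

The `options` string is forwarded to `docker run` on the VM, which is why `docker inspect` on the host can confirm the shared-memory setting took effect.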