I am developing a Spring Boot microservice that is going to have only one use case.
This use case involves choosing a random video file (.mp4) out of a total of 30 videos (10 MB each, so 300 MB in total) and doing some post-processing on it.
I believe I have two alternatives here:
1. I use Cloud Storage to store all the videos; for each request I download one of them at random and remove it once I am done with it.
2. Or I instruct my Dockerfile to copy the full directory containing all the video files into the image, so I don't have to deal with Cloud Storage or download a file for each request (I am not sure how long a 10 MB download would take, but it should be fast, right?).
I don't know which alternative will end up saving me more money in the long run.
The first alternative involves using an extra GCP product and also an extra step that might add more latency (let's say an extra 500 ms for the 10 MB download?).
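For concreteness, the per-request download I have in mind would look roughly like this. It's just a sketch using the google-cloud-storage Java client; the bucket name and the video-NNN.mp4 naming scheme are made up for illustration:

```java
import com.google.cloud.storage.Blob;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;

import java.nio.file.Files;
import java.nio.file.Path;
import java.util.concurrent.ThreadLocalRandom;

public class RandomVideoFetcher {

    // Placeholder names: bucket and "video-001.mp4" .. "video-030.mp4" are assumptions.
    private static final String BUCKET = "my-videos-bucket";

    private final Storage storage = StorageOptions.getDefaultInstance().getService();

    /** Downloads one of the 30 videos at random to a temp file and returns its path. */
    public Path fetchRandomVideo() throws Exception {
        int index = ThreadLocalRandom.current().nextInt(1, 31); // 1..30
        String objectName = String.format("video-%03d.mp4", index);

        Blob blob = storage.get(BUCKET, objectName);
        Path target = Files.createTempFile("video-", ".mp4");
        // On Cloud Run the writable filesystem is in-memory, so this ~10 MB file
        // counts toward the instance's memory while it exists.
        blob.downloadTo(target);

        return target; // caller post-processes the file and deletes it afterwards
    }
}
```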
On the other hand, with the second alternative I will have a bigger image (an extra 300 MB), which probably increases cold start times (correct me if I am wrong here). Also, I believe those 300 MB are not really going to be placed on disk but in memory, which might mean my Cloud Run containers will be consuming more resources than required.
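And this is roughly what the second alternative would look like at request time, assuming the Dockerfile copies the videos into something like /app/videos (the path and directory layout are placeholders I made up, not anything the project defines yet):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.concurrent.ThreadLocalRandom;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class BundledVideoPicker {

    // Assumed location: e.g. a Dockerfile line such as "COPY videos/ /app/videos/"
    // would place the 30 files here; the path is illustrative only.
    private static final Path VIDEO_DIR = Path.of("/app/videos");

    /** Returns the path of one randomly chosen .mp4 already baked into the image. */
    public Path pickRandomVideo() throws IOException {
        try (Stream<Path> files = Files.list(VIDEO_DIR)) {
            List<Path> videos = files
                    .filter(p -> p.toString().endsWith(".mp4"))
                    .collect(Collectors.toList());
            return videos.get(ThreadLocalRandom.current().nextInt(videos.size()));
        }
    }
}
```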
I'd love to hear your thoughts.
Thanks
Hi @xBurnsed,
Welcome to the Google Cloud Community!
Here are some factors to consider when choosing between using Cloud Storage and bundling the videos into your container image:
I would recommend using Cloud Storage to store the video files, as it is designed for exactly that. It is the cheaper option and it will have low latency.
You can also get in touch with Google Cloud Support if you require more help.
Let me know if it helped, thanks!