
Import multiple large image files into cloud storage

I am looking for a way to import multiple 200 MB+ image files into a folder of a storage bucket. My problem is that I don't control the source and have to work with their exports, which I will receive via an API call.

My go-to approach would be a Cloud Function that requests the images and stores them in a bucket. Once all files have been uploaded, I would like a Cloud Run service to analyze the images for objects. The code is in Python, and I don't want to deal with Dataflow or Vertex AI unless there is no other way around it.
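To make the idea concrete, here is a minimal sketch of that Cloud Function, assuming the source API returns a JSON list of download URLs. `SOURCE_API_URL`, `RAW_BUCKET`, and the response shape are placeholders for whatever the real export API provides:

```python
# Sketch of a Cloud Function that pulls exports from the source API and
# streams each file into Cloud Storage. All names below are placeholders.
import requests
from google.cloud import storage

SOURCE_API_URL = "https://example.com/exports"  # hypothetical source API
RAW_BUCKET = "my-raw-images"                    # hypothetical bucket name


def import_images(request):
    bucket = storage.Client().bucket(RAW_BUCKET)

    # Assumption: the API returns a JSON list of {"name": ..., "url": ...}
    exports = requests.get(SOURCE_API_URL, timeout=60).json()

    for item in exports:
        blob = bucket.blob(f"raw/{item['name']}")
        # Stream the download so a 200 MB+ file never has to sit fully in memory
        with requests.get(item["url"], stream=True, timeout=300) as resp:
            resp.raise_for_status()
            blob.upload_from_file(
                resp.raw,
                content_type=resp.headers.get("Content-Type"),
            )

    return "done"
```

The drawback of this sequential loop is exactly the runtime concern below: each file is transferred one after another.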

My concern is the maximum runtime limit of Cloud Functions, since the upload will have to happen over the internet.

To sum it up:

I'm looking for either a way to parallelize the upload of multiple files fetched from a single API (something like the sketch below), or any other ideas that would help me stay under the one-hour limit of Cloud Functions. Buckets are used because multiple teams will be working with the same raw data, and Python is what they know best.
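For the parallelization part, a thread pool is one possible shape for it, since the work is network-bound. This is only a sketch under the same placeholder assumptions as above (a list of name/URL pairs and a known bucket name):

```python
# Sketch: transfer several files concurrently with a thread pool so multiple
# 200 MB downloads/uploads overlap instead of running sequentially.
from concurrent.futures import ThreadPoolExecutor, as_completed

import requests
from google.cloud import storage


def transfer_one(url, blob):
    # Stream one file from the source API straight into its Cloud Storage blob.
    with requests.get(url, stream=True, timeout=300) as resp:
        resp.raise_for_status()
        blob.upload_from_file(resp.raw)


def transfer_all(exports, bucket_name="my-raw-images", max_workers=8):
    bucket = storage.Client().bucket(bucket_name)
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {
            pool.submit(transfer_one, item["url"], bucket.blob(f"raw/{item['name']}")): item["name"]
            for item in exports
        }
        for future in as_completed(futures):
            future.result()  # re-raise any per-file transfer error
            print(f"finished {futures[future]}")
```

Whether this keeps the whole batch under the one-hour limit still depends on the total volume and the source API's throughput.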

Thank you
