Hi Everyone, I am Emmanuel Katto. I'm building a Cloud Functions-based application that processes large files stored in Cloud Storage, but I'm running into performance and scalability issues with files larger than 1GB.

My setup: I use the Multer library to handle file uploads and store the files in Cloud Storage. The Cloud Function is written in Node.js and uses the @google-cloud/storage library to interact with Cloud Storage. When processing large files (e.g. 2GB+), the function times out or crashes, causing errors and retries.
Can anyone help me troubleshoot this issue?
Thanks in advance!
Emmanuel
Hello @emmanuelkatto, welcome to the Google Cloud Community.

It's most probably related to OOM (out of memory). When you process a file in Cloud Functions or Cloud Run, the file is typically downloaded into the container's memory first. If you have a container with 1GB of memory and you process files of at most 1GB, it's fine. The problem starts when you want to process data larger than the container's memory. AFAIR, the Cloud Functions memory limit is 32GB (which requires at least 8 vCPUs). Try increasing your memory setting (for testing, you can choose a value bigger than your file) and check how your CF behaves.
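If it helps, the memory limit is a deploy-time setting. A minimal sketch for a 2nd-gen function — the function name, region, and sizes below are placeholders, adjust them to your setup:

```shell
# Redeploy the function with more memory than the largest file you process.
# 2nd-gen Cloud Functions support up to 32GiB of memory (with enough vCPUs);
# larger memory tiers require more CPU, hence the --cpu flag.
gcloud functions deploy my-process-function \
  --gen2 \
  --region=us-central1 \
  --memory=4GiB \
  --cpu=2 \
  --timeout=540s
```

Since you mentioned timeouts as well as crashes, raising `--timeout` alongside `--memory` is worth testing too.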
--
cheers,
DamianS
LinkedIn medium.com Cloudskillsboost