The Cloud Function in question is built to push logs to GCP. The maximum request size is 156 KB.
Under load, the number of invocations rises to around 100/sec and memory usage also appears to hit 1 GB. Many of the requests were rejected with the error below:
Function invocation was interrupted. Error: memory limit exceeded.
Currently, 1 GB of memory is allocated to the Cloud Function.
Note: No Secret Manager or BigQuery reads are being done. The function just reads the incoming data and pushes it to logs.
Is anyone else facing the same issue? Increasing the memory might reduce the error, but simply forwarding requests to logs should not consume this much memory.
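For reference, this is roughly the shape of function being described; a minimal sketch only, assuming a Python HTTP Cloud Function using the functions-framework and the Cloud Logging client library. The entry point name push_logs and the log name forwarded-logs are hypothetical, not the actual deployment.

    # Minimal sketch: forward an incoming JSON payload to Cloud Logging.
    # Assumes dependencies: functions-framework, google-cloud-logging.
    import functions_framework
    from google.cloud import logging as cloud_logging

    # Create the client once at module load so it is reused across
    # invocations rather than re-created on every request.
    client = cloud_logging.Client()
    logger = client.logger("forwarded-logs")  # hypothetical log name

    @functions_framework.http
    def push_logs(request):
        # Read the incoming payload (the request size cap mentioned above is ~156 KB).
        payload = request.get_json(silent=True)
        if payload is None:
            return ("Expected a JSON body", 400)

        # Write the payload as a single structured log entry.
        logger.log_struct(payload, severity="INFO")
        return ("OK", 200)

With a pattern like this, keeping the Logging client at module scope avoids allocating a new client (and its buffers) on each of the ~100 invocations per second, which is one place memory can add up even for a simple log-forwarding function.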
Hello. It's hard to determine, from the posted facts alone, why your Cloud Function is running out of memory when it is only pushing logs. I would recommend contacting GCP Support, as they can see the relevant information about your Cloud Function and advise you on how to proceed.
Thanks, we will raise a ticket