I have a Cloud Function that calls the Google Vision API, and I think because of this it requires more than 512 MB of memory. I tried writing it as a gen1 function, which does allow more than 512 MB of memory, and it works fine. However, as a gen2 function, when I try to increase the memory to, say, 1 GB and deploy, the deployment spinner never stops spinning.
If the max memory in gen2 is 512 MB, perhaps the web UI should only show up to that amount of memory.
Is there some way to get around this memory restriction? Is this perhaps something that can't be done in Cloud Functions gen2?
I figured that if there's a gen2, I should be using it over gen1, since gen1 will presumably be deprecated eventually, so it's surprising this didn't work in gen2.
I found a way to get around this. On the Cloud Functions panel, on the line for my function, at the far right, I clicked the ... menu and went to the corresponding Cloud Run service for this function. There, I saw a warning complaining that less than 0.5 vCPUs were allocated to my function. I raised it to 1 vCPU, hit the redeploy button there, and presto, the gen2 function deployed with 1 GB of memory.
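For anyone who prefers the command line, the same fix should be possible with gcloud, since a gen2 function is backed by a Cloud Run service (usually with the same name as the function). A sketch, assuming a function named my-function in us-central1 (substitute your own service name and region):

    # Raise the CPU allocation on the underlying Cloud Run service
    # so that a 1 GB memory limit is valid, then redeploy a revision.
    gcloud run services update my-function \
      --region=us-central1 \
      --cpu=1 \
      --memory=1Gi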
This seems like a bug to me: the Cloud Functions side didn't manage that for me, or even give me a clue as to what was wrong.
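It may also be possible to sidestep the issue entirely at deploy time, since gcloud functions deploy accepts both memory and CPU settings for gen2 functions. A sketch, assuming a Node.js HTTP function named my-function (the name, region, runtime, and trigger are placeholders for your own setup):

    # Deploy a gen2 function with 1 GB of memory, explicitly
    # requesting 1 vCPU so the memory/CPU combination is valid.
    gcloud functions deploy my-function \
      --gen2 \
      --region=us-central1 \
      --runtime=nodejs20 \
      --trigger-http \
      --memory=1Gi \
      --cpu=1 \
      --source=.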