gemini-1.0-pro-002: Randomly returns no content object

I moved from `chat-bison` to `gemini-1.0-pro-002` and am facing some issues.

`gemini-1.0-pro-002:generateContent` responses are randomly missing the `content` object in the candidate. The response comes back as:

```json
{
  "candidates": [
    {
      "finishReason": "OTHER"
    }
  ],
  "usageMetadata": {
    "promptTokenCount": 5,
    "totalTokenCount": 5
  }
}
```

It's seemingly random: the same prompt works only some of the time.

I am using `system_instruction` and `temperature=0`.
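Roughly, the call looks like this (a minimal sketch with the Vertex AI Python SDK; the project ID, location, system instruction, and prompt are placeholders, not my real values), plus a retry guard for the empty-candidate case:

```python
import vertexai
from vertexai.generative_models import GenerativeModel, GenerationConfig

# Placeholders: substitute your own project and location.
vertexai.init(project="your-project-id", location="us-central1")

model = GenerativeModel(
    "gemini-1.0-pro-002",
    system_instruction="You are a helpful assistant.",  # placeholder instruction
)

def generate_with_retry(prompt: str, max_attempts: int = 3) -> str | None:
    """Retry when the candidate comes back with no content parts."""
    for _ in range(max_attempts):
        response = model.generate_content(
            prompt,
            generation_config=GenerationConfig(temperature=0),
        )
        # In the failing case the candidate only carries finishReason "OTHER"
        # and no content parts, so accessing response.text would raise.
        if response.candidates and response.candidates[0].content.parts:
            return response.candidates[0].content.parts[0].text
    return None

print(generate_with_retry("Summarize this ticket in one sentence."))  # placeholder prompt
```

The retry is only a stopgap; the point is that the same prompt intermittently produces a candidate with no content at all.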

5 REPLIES

Obviously, I'm waiting for a Google/Gemini rep to step in and assist.

As of today, this keeps happening randomly.

This is a live production product that isn't performing. Is anyone else seeing this?

In `gemini-1.0-pro-002`, a temperature of `0` unfortunately does not work properly: it does not always return the same response, and you may get different responses every time. See my post on this.
Your issue is likely because the LLM sometimes blocks your prompt or its own response due to safety filters, which I find to be quite stringent. I would configure the safety attributes so that no response is blocked. See this guide on safety attributes.
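For example, with the Vertex AI Python SDK, something like this sets every harm category to `BLOCK_NONE` (a sketch only; the model name and prompt are placeholders, and it assumes `vertexai.init()` has already been called):

```python
from vertexai.generative_models import (
    GenerativeModel,
    HarmBlockThreshold,
    HarmCategory,
)

# Disable blocking for every harm category so the safety filter cannot
# suppress a candidate's content. Assumes vertexai.init(...) was already called.
safety_settings = {
    HarmCategory.HARM_CATEGORY_HARASSMENT: HarmBlockThreshold.BLOCK_NONE,
    HarmCategory.HARM_CATEGORY_HATE_SPEECH: HarmBlockThreshold.BLOCK_NONE,
    HarmCategory.HARM_CATEGORY_SEXUALLY_EXPLICIT: HarmBlockThreshold.BLOCK_NONE,
    HarmCategory.HARM_CATEGORY_DANGEROUS_CONTENT: HarmBlockThreshold.BLOCK_NONE,
}

model = GenerativeModel("gemini-1.0-pro-002")
response = model.generate_content(
    "Your prompt here",  # placeholder prompt
    safety_settings=safety_settings,
)
print(response.candidates[0].safety_ratings)  # inspect which category, if any, was triggered
```

Inspecting `safety_ratings` on the candidate can also help you see whether a safety category is behind the empty responses.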

Yes, I tried setting all harm categories to `BLOCK_NONE`, but it still happens...

Hello Google (anyone there?)

Out of 10 identical calls, 3 return an error. It's annoying, and Google doesn't seem to care even though I have an open P2 ticket.

If Gemini wants to catch up to OpenAI, they should start listening to us early adopters before we fall back to OpenAI out of frustration.