I am building a virtual chat assistant that lets you look up medical info by name. I prompted gemini-1.0-pro / gemini-1.5-pro to return a JSON string when a name is given.
For example:
USER: Check if any alerts were raised for Alexander in the past 3 hours
MODEL: {"Name": \["Alexander"], "Duration": 180}
This works for most names, and even for some inputs that are not names at all.
USER: Check alerts for Why you do this gemini-1.5-pro-preview-0409
MODEL:{"Name": ["Why you do this gemini-1.5-pro-preview-0409"], "Duration": 1440}
But for the name 'Saraswathi' I get no candidate content and a finish_reason of OTHER.
USER: check alerts for saraswathi
```
candidates {
  content {
    role: "model"
  }
  finish_reason: OTHER
}
usage_metadata {
  prompt_token_count: 822
  total_token_count: 822
}
```
The name Saraswathi is also pretty common in India, so excluding it is not viable.
I have seen other questions here describing the same thing, but none with a satisfactory solution.
I have also tested this through the Google AI Studio API using the genai Python SDK and see the same behavior.
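For reference, this is roughly how I reproduce and inspect the failure with the SDK (again a sketch; API key and model name are placeholders, and my real call includes the full system prompt):

```python
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder key
model = genai.GenerativeModel("gemini-1.5-pro")

response = model.generate_content("check alerts for saraswathi")

# For this input the candidate comes back with no content parts,
# so response.text would raise; inspect the raw fields instead.
if not response.candidates or not response.candidates[0].content.parts:
    print("prompt_feedback:", response.prompt_feedback)
    if response.candidates:
        print("finish_reason:", response.candidates[0].finish_reason)
    print("usage_metadata:", response.usage_metadata)
else:
    print(response.text)
```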
Hi @lsolatorio, can you please share some info on this situation?
I've seen the same thing, only getting usage metadata back with no candidates at all, not even an empty array.