I think this is a bug, but maybe there's a workaround? I'm using the Python client for Vertex AI against the gemini-1.5-pro-002 model. It had been working fine until today, when I added maxItems to my response schema and I started getting the following error:
{'message': 'Request contains an invalid argument.', '_errors': (<AioRpcError of RPC that terminated with:
status = StatusCode.INVALID_ARGUMENT
details = "Request contains an invalid argument."
debug_error_string = "UNKNOWN:Error received from peer ipv4:74.125.197.95:443 {created_time:"2024-10-22T21:15:33.414391673+00:00", grpc_status:3, grpc_message:"Request contains an invalid argument."}"
>,), '_details': [], '_response': <AioRpcError of RPC that terminated with:
status = StatusCode.INVALID_ARGUMENT
details = "Request contains an invalid argument."
debug_error_string = "UNKNOWN:Error received from peer ipv4:74.125.197.95:443 {created_time:"2024-10-22T21:15:33.414391673+00:00", grpc_status:3, grpc_message:"Request contains an invalid argument."}"
>, '_error_info': None}
I tried many variations of the response schema to try to identify what was causing it, because sometimes it would work. The prompt wasn't related to the issue, so I just used "hello there" as the prompt. This will fail with the above error:
response_schema = {
    "type": "object",
    "properties": {
        "prop1": {
            "type": "array",
            "maxItems": "20",
            "items": {
                "type": "object",
                "properties": {
                    "prop2": {"type": "string"},
                    "date_context": {"type": "string"},
                    "canonical_label": {"type": "string"},
                    "confidence": {"type": "string"}
                }
            }
        }
    }
}
But by removing just one character, it would succeed. I don't know if there are other cases, but I found two: removing the last t in date_context, or removing the last l in canonical_label, would result in it succeeding. So, this succeeds:
response_schema = {
    "type": "object",
    "properties": {
        "prop1": {
            "type": "array",
            "maxItems": "20",
            "items": {
                "type": "object",
                "properties": {
                    "prop2": {"type": "string"},
                    "date_contex": {"type": "string"},
                    "canonical_label": {"type": "string"},
                    "confidence": {"type": "string"}
                }
            }
        }
    }
}
which is very mysterious. Any ideas what's going on? I wasn't sure about the exact requirements, so I tried this with both maxItems and max_items, and with the value as both a string and a number, but the behavior was the same.
Hi @jaojee,
Welcome to Google Cloud Community!
It seems you're facing a challenging issue with the schema definition while using the Vertex AI Python client with the Gemini 1.5 Pro model. The "invalid argument" error can arise from several problems in the schema structure. Here are a few points that may help clarify or resolve the issue:
1. Type of maxItems: The maxItems property should be defined as a numeric type rather than a string. Make sure it's set like this:
"maxItems": 20,
2. Schema Validation: Some models are sensitive to schema definitions, and there may be specific requirements for property names. Since you found that changing the property names (like date_context to date_contex) made a difference, it's worth double-checking if there are reserved keywords or specific naming conventions that the model expects.
3. Update and Compatibility: Ensure that your Python client library and any dependencies are up-to-date. Sometimes, issues arise from version mismatches or deprecated features.
In addition, you can check the documentation and release notes for the latest updates or new features related to Gemini.
I hope the above information is helpful.
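To make point 1 concrete, a small helper can walk a nested response schema and coerce any string-valued maxItems into an integer before the request is sent. This is just a sanity-check sketch in plain Python (the helper name and the sample schema are my own, not part of any SDK):

```python
def coerce_max_items(schema):
    """Recursively convert string-valued maxItems to integers in a nested schema."""
    if isinstance(schema, dict):
        fixed = {}
        for key, value in schema.items():
            if key == "maxItems" and isinstance(value, str):
                fixed[key] = int(value)  # e.g. "20" -> 20
            else:
                fixed[key] = coerce_max_items(value)
        return fixed
    if isinstance(schema, list):
        return [coerce_max_items(item) for item in schema]
    return schema

# Sample schema with maxItems given as a string, as in the original post
schema = {
    "type": "object",
    "properties": {
        "prop1": {
            "type": "array",
            "maxItems": "20",
            "items": {"type": "string"},
        }
    },
}
print(coerce_max_items(schema)["properties"]["prop1"]["maxItems"])  # prints 20
```

That said, since you mention below that you already tried a numeric value, this alone may not resolve it.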
As I mentioned at the end of my post, I had already tried different formats for maxItems, with the same result.
I reported it to Google and they agreed it seemed like a bug. I'm not sure there is a workaround. For my specific case, I tried frequency penalty instead and was able to do what I needed to do. Hopefully this maxItems issue will be resolved soon.
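For anyone else who hits this before it's fixed, one stopgap (a sketch, not an official fix) is to strip maxItems/max_items out of the response schema entirely before sending the request, and state the item limit in the prompt text instead. The helper below is hypothetical, not part of the Vertex AI SDK:

```python
def strip_max_items(schema):
    """Recursively remove maxItems/max_items keys from a nested schema dict."""
    if isinstance(schema, dict):
        return {
            key: strip_max_items(value)
            for key, value in schema.items()
            if key not in ("maxItems", "max_items")
        }
    if isinstance(schema, list):
        return [strip_max_items(item) for item in schema]
    return schema

# The failing schema from the original post, with maxItems removed client-side
response_schema = strip_max_items({
    "type": "object",
    "properties": {
        "prop1": {
            "type": "array",
            "maxItems": "20",
            "items": {"type": "string"},
        }
    },
})
print("maxItems" in response_schema["properties"]["prop1"])  # prints False
```

The model may still occasionally exceed a limit stated only in the prompt, so you'd want to truncate the array client-side if the count matters.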