INVALID_ARGUMENT returned when using maxItems in response schema

I think this is a bug, but maybe there's a workaround. I'm using the Python client for Vertex AI against the gemini-1.5-pro-002 model. It had been working fine until today, when I added maxItems to my response schema and started getting the following error:

message': 'Request contains an invalid argument.', '_errors': (<AioRpcError of RPC that terminated with:
	status = StatusCode.INVALID_ARGUMENT
	details = "Request contains an invalid argument."
	debug_error_string = "UNKNOWN:Error received from peer ipv4:74.125.197.95:443 {created_time:"2024-10-22T21:15:33.414391673+00:00", grpc_status:3, grpc_message:"Request contains an invalid argument."}"
>,), '_details': [], '_response': <AioRpcError of RPC that terminated with:
	status = StatusCode.INVALID_ARGUMENT
	details = "Request contains an invalid argument."
	debug_error_string = "UNKNOWN:Error received from peer ipv4:74.125.197.95:443 {created_time:"2024-10-22T21:15:33.414391673+00:00", grpc_status:3, grpc_message:"Request contains an invalid argument."}"
>, '_error_info': None}

I tried many variations of the response schema to identify what was causing it, because sometimes it would work. The prompt wasn't related to the issue, so I just used "hello there" as the prompt. This schema fails with the above error:

 

response_schema = {
    "type": "object",
    "properties": {
        "prop1": {
            "type": "array",
            "maxItems": "20",
            "items": {
                "type": "object",
                "properties": {
                    "prop2": { "type": "string" },
                    "date_context": {"type": "string"},
                    "canonical_label": { "type": "string" },
                    "confidence": { "type": "string" }
                }
            }
        }
    }
}
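For reference, here's roughly how I'm invoking the model with that schema (a sketch, not my exact code — the project and location arguments and the function name are placeholders):

```python
# Rough sketch of the call that triggers the error. The SDK import is
# deferred into the function so the schema itself can be inspected even
# without the google-cloud-aiplatform package installed.

failing_schema = {
    "type": "object",
    "properties": {
        "prop1": {
            "type": "array",
            "maxItems": "20",
            "items": {
                "type": "object",
                "properties": {
                    "prop2": {"type": "string"},
                    "date_context": {"type": "string"},
                    "canonical_label": {"type": "string"},
                    "confidence": {"type": "string"},
                },
            },
        }
    },
}

def run_repro(project: str, location: str):
    import vertexai
    from vertexai.generative_models import GenerationConfig, GenerativeModel

    vertexai.init(project=project, location=location)
    model = GenerativeModel("gemini-1.5-pro-002")
    # With the schema above, this raises InvalidArgument (gRPC status 3).
    return model.generate_content(
        "hello there",
        generation_config=GenerationConfig(
            response_mime_type="application/json",
            response_schema=failing_schema,
        ),
    )
```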

But removing just one character makes it succeed. I don't know if there are other cases, but I found two: removing the last t in date_context, or removing the last l in canonical_label, results in success. So this succeeds:

response_schema = {
    "type": "object",
    "properties": {
        "prop1": {
            "type": "array",
            "maxItems": "20",
            "items": {
                "type": "object",
                "properties": {
                    "prop2": { "type": "string" },
                    "date_contex": {"type": "string"},
                    "canonical_label": { "type": "string" },
                    "confidence": { "type": "string" }
                }
            }
        }
    }
}

which is very mysterious. Any ideas what's going on? I also wasn't sure about the exact requirements, so I tried both maxItems and max_items, with the value as both a string and a number, but the behavior was the same in every combination.
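For completeness, these are the spellings of the constraint I tried; each one, combined with the property names above, produced the same INVALID_ARGUMENT:

```python
# Every key-spelling / value-type combination of the array-size constraint
# I tried; all behaved identically with the failing property names present.
variants = [
    {"maxItems": "20"},   # camelCase key, string value
    {"maxItems": 20},     # camelCase key, integer value
    {"max_items": "20"},  # snake_case key, string value
    {"max_items": 20},    # snake_case key, integer value
]
```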

 
