Hi,
I have deployed a custom model (from a Docker image) to a Vertex AI endpoint.
When I try to get a prediction in Java with the following code:
import com.google.cloud.aiplatform.v1.EndpointName;
import com.google.cloud.aiplatform.v1.PredictRequest;
import com.google.cloud.aiplatform.v1.PredictResponse;
import com.google.cloud.aiplatform.v1.PredictionServiceClient;
import com.google.protobuf.ListValue;
import com.google.protobuf.Value;
import com.google.protobuf.util.JsonFormat;
import java.io.IOException;
import java.util.List;

private PredictResponse predict(String endpointId, String query, String project, String location) throws IOException {
    try (PredictionServiceClient serviceClient = getPredictionServiceClient()) {
        EndpointName endpointName = EndpointName.of(project, location, endpointId);
        // Parse the JSON query into a list of protobuf Values, one per instance.
        ListValue.Builder listValue = ListValue.newBuilder();
        JsonFormat.parser().merge(query, listValue);
        List<Value> instanceList = listValue.getValuesList();
        PredictRequest request = PredictRequest.newBuilder()
                .setEndpoint(endpointName.toString())
                .addAllInstances(instanceList)
                .build();
        return serviceClient.predict(request);
    }
}
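For completeness, getPredictionServiceClient() is a small helper that is not shown above. A minimal version, assuming application-default credentials and the europe-west4 regional API endpoint used elsewhere in this thread, could look roughly like this (the helper name and settings here are illustrative, not necessarily the exact code in use):

// requires com.google.cloud.aiplatform.v1.PredictionServiceSettings
private PredictionServiceClient getPredictionServiceClient() throws IOException {
    // Point the client at the regional API endpoint that hosts the Vertex AI endpoint.
    PredictionServiceSettings settings = PredictionServiceSettings.newBuilder()
            .setEndpoint("europe-west4-aiplatform.googleapis.com:443")
            .build();
    return PredictionServiceClient.create(settings);
}

The same code path works against another endpoint (see below), so the helper itself is not the suspect; it is shown only to make the snippet self-contained.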
I get an io.grpc.StatusRuntimeException: INTERNAL: RST_STREAM closed stream. HTTP/2 error code: INTERNAL_ERROR.
I see no error in the logs in the Google Cloud Console; it looks like the request was processed without error on the server side.
The strange part is that:
- the same request from the CLI works, so the model + endpoint seem fine;
- the same Java code against another endpoint works, so the code seems correct.
Any idea how to debug/fix this?
Update: if I use the CLI to send the request, there is also an error.
gcloud ai endpoints predict $ENDPOINT_ID --json-request=input.json --region=europe-west4
Returns
Using endpoint [https://europe-west4-prediction-aiplatform.googleapis.com/]
ERROR: gcloud crashed (ContentDecodingError): ('Received response with content-encoding: gzip, gzip, but failed to decode it.', error('Error -3 while decompressing data: incorrect header check'))
I have found an open issue on this topic: https://issuetracker.google.com/issues/234474507
I have posted a workaround that also solves the bug in the Java SDK.
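The concrete workaround is described in the issue tracker linked above. Independently of that, a debugging sketch like the following can help confirm what the backend actually returns, because it reads the raw bytes and Content-Encoding headers without letting any client try to decompress them (PROJECT_ID and ENDPOINT_ID are placeholders, the region is europe-west4 as in the gcloud call, and it reuses the same input.json):

import com.google.auth.oauth2.GoogleCredentials;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class RawPredictDebug {
    public static void main(String[] args) throws Exception {
        // Placeholders: fill in your own project and endpoint ID.
        String url = "https://europe-west4-aiplatform.googleapis.com/v1/projects/PROJECT_ID"
                + "/locations/europe-west4/endpoints/ENDPOINT_ID:predict";

        // Application-default credentials (google-auth-library dependency).
        GoogleCredentials credentials = GoogleCredentials.getApplicationDefault()
                .createScoped(List.of("https://www.googleapis.com/auth/cloud-platform"));
        credentials.refreshIfExpired();
        String token = credentials.getAccessToken().getTokenValue();

        // Same request body as the gcloud call: the {"instances": [...]} JSON from input.json.
        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                .header("Authorization", "Bearer " + token)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(Files.readString(Path.of("input.json"))))
                .build();

        // java.net.http.HttpClient does not decompress responses automatically,
        // so the body and Content-Encoding headers are shown exactly as sent back.
        HttpResponse<byte[]> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofByteArray());

        System.out.println("Status: " + response.statusCode());
        System.out.println("Content-Encoding: " + response.headers().allValues("content-encoding"));
        System.out.println("Body bytes: " + response.body().length);
    }
}

If the printed Content-Encoding matches the "gzip, gzip" value reported by gcloud above, the response encoding, not the Java code, is the problem.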