I'm having trouble with request-response logging on Vertex AI endpoints. Previously, with AI Platform endpoints, I was able to enable request-response logging via gcloud, which would then store the request-response data in a chosen BigQuery table. However, with the newer Vertex AI endpoints, the process requires using the REST API to enable request-response logging.
I followed the official documentation and made a PATCH request with the structure provided. Despite receiving a confirmation that the PATCH request was successful (as it displayed the updated configuration for the specified endpoint), the request-response logs are not appearing in BigQuery.
With the AI Platform endpoints, enabling logging resulted in a corresponding BigQuery insert job for every prediction request. However, with Vertex AI endpoints, no such job is created even after enabling the logging.
Here's the PATCH request I used:
curl -X PATCH \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  "https://$REGION-aiplatform.googleapis.com/v1/projects/$PROJECT_ID/locations/$REGION/endpoints/$ENDPOINT_ID" \
  -d '{
    "predict_request_response_logging_config": {
      "enabled": true,
      "sampling_rate": 1,
      "bigquery_destination": {
        "output_uri": "bq://PROJECT_ID.DATASET_NAME.TABLE_NAME"
      }
    }
  }'
I've also tried the gcloud ai and gcloud beta ai-platform versions update commands, but neither of them finds the endpoint. I have also created the endpoint with the above config via a POST request, which did not work either.
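For completeness, the stored config can be double-checked with a plain GET on the endpoint (the standard endpoints.get call, sketched below with the same variables as above), which returns the endpoint resource including its logging config:

curl -X GET \
  -H "Authorization: Bearer $TOKEN" \
  "https://$REGION-aiplatform.googleapis.com/v1/projects/$PROJECT_ID/locations/$REGION/endpoints/$ENDPOINT_ID"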
Good day @CarlduPlessis,
Welcome to Google Cloud Community!
If you used --disable-container-logging when you deployed your model, try redeploying the model, this time without --disable-container-logging.
It might be possible that disabling container logging caused this issue.

It is also possible that this is due to the schema. Instead of pointing the logging config at a specific table, I suggest specifying just PROJECT_ID.DATASET; the table will then be created automatically, named request_response_logging. If you specify only the project ID, the dataset and the table are both created automatically. For more information, see: https://cloud.google.com/vertex-ai/docs/predictions/online-prediction-logging#enable_disable_logs-dr...

You can check the correct schema using the link you've provided: https://cloud.google.com/vertex-ai/docs/predictions/online-prediction-logging#enabling-and-disabling...

I tried the options above and they worked on my end. After following them, request a prediction and preview the table to check that the requests are logged.
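As a sketch of the dataset-only option (this reuses the variables and fields from your PATCH request; only output_uri changes, dropping the table name so the request_response_logging table gets created for you):

curl -X PATCH \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  "https://$REGION-aiplatform.googleapis.com/v1/projects/$PROJECT_ID/locations/$REGION/endpoints/$ENDPOINT_ID" \
  -d '{
    "predict_request_response_logging_config": {
      "enabled": true,
      "sampling_rate": 1,
      "bigquery_destination": {
        "output_uri": "bq://PROJECT_ID.DATASET_NAME"
      }
    }
  }'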
Hope this helps!
@kvandres I did some more digging, and the issue appears when using the rawPredict method on the endpoint. When I use the :Predict method, the logs come through, but with the :rawPredict method they don't. Below is the curl POST request to get predictions:
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  "https://europe-west4-aiplatform.googleapis.com/v1/projects/$PROJECT_ID/locations/$REGION/endpoints/$VERSION:Predict" \
  -d "@${INPUT_DATA_FILE}"
and the rawPredict request:
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  "https://europe-west4-aiplatform.googleapis.com/v1/projects/$PROJECT_ID/locations/$REGION/endpoints/$VERSION:rawPredict" \
  -d "@${INPUT_DATA_FILE}"
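For context on the two request bodies (an illustrative sketch only; the field names below are made up, and the real payload depends on the model): :Predict expects the instances wrapped in the standard prediction envelope, while :rawPredict passes whatever is in INPUT_DATA_FILE straight through to the serving container.

{"instances": [{"feature_a": 1.0, "feature_b": "x"}]}   (body sent to :Predict)
{"feature_a": 1.0, "feature_b": "x"}                    (body sent to :rawPredict, forwarded to the container as-is)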
If you have any idea how to get the logs working for the rawPredict method, that would help a ton.