Hello!
I have a pipeline that runs batch predictions with the text-multilingual-embedding-002 model over a list of texts. It was working fine until last Friday (2024-07-26). Now I'm getting this error when I try to run it:
Failed to run inference job. Query error: Unsupported remote model endpoint text-multilingual-embedding-002@default
The same error occurs when I use the text-embedding-004 model (i.e., the most recent models). If I run a single prediction with the get_embeddings method, it works fine with both models.
The BatchPrediction method only works if I switch the model to textembedding-gecko@003.
What could be happening?