Batch prediction error - Internal server error 500

from google.cloud import aiplatform

aiplatform.init(project=PROJECT_ID, location="us-east4")
my_model = aiplatform.Model(model_name, credentials=credentials)

# Kick off a batch prediction job against the Model Garden model
batch_prediction_job = my_model.batch_predict(
    job_display_name='my-batch-prediction-job',
    instances_format='csv',
    machine_type='n1-standard-4',
    gcs_source=['gs://test/batch_data.csv'],
    gcs_destination_prefix='gs://test/model_prediction',
    service_account=service_account,
)
 
Error:

_InactiveRpcError: <_InactiveRpcError of RPC that terminated with: status = StatusCode.INTERNAL details = "Unknown ModelSource source_type: MODEL_GARDEN model_garden_source { public_model_name: "publishers/google/models/mistral-7b"
...
InternalServerError: 500 Unknown ModelSource source_type: MODEL_GARDEN model_garden_source { public_model_name: "publishers/google/models/mistral-7b"

Facing the same issue with the Mistral model, were you able to sort it out?

Hi there, were you able to make progress on this issue? 

Hi, I was not able to use the model from Model Garden (I suspect it's the naming convention), but uploading a custom image of the model allowed me to use batch_predict. However, the responses were not as expected.
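For anyone hitting the same wall, the custom-upload route looks roughly like the sketch below. This is only an illustration of the approach, not the exact setup used: the container image URI, artifact path, and bucket names are placeholders.

from google.cloud import aiplatform

aiplatform.init(project=PROJECT_ID, location="us-east4")

# Upload the model with a custom serving container instead of referencing
# the Model Garden entry directly (image URI and artifact path are placeholders).
uploaded_model = aiplatform.Model.upload(
    display_name='mistral-7b-custom',
    serving_container_image_uri='us-docker.pkg.dev/my-project/my-repo/mistral-serve:latest',
    artifact_uri='gs://my-bucket/mistral-7b-artifacts',
    serving_container_predict_route='/predict',
    serving_container_health_route='/health',
)

# Batch predict against the uploaded model, same call shape as the snippet
# at the top of the thread.
batch_prediction_job = uploaded_model.batch_predict(
    job_display_name='my-batch-prediction-job',
    instances_format='jsonl',
    gcs_source=['gs://my-bucket/batch_prompts.jsonl'],
    gcs_destination_prefix='gs://my-bucket/model_prediction',
    machine_type='n1-standard-4',
)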

OK, so you circumvented this issue by using a custom container for batch inference with your model. Good to know you found a way out.
I feel like Vertex AI batch prediction with non-Google foundation models is not well documented. For the issue in this thread, I've raised a ticket with the Google team: https://issuetracker.google.com/issues/337870533. It would help if you could +1 it, thanks!

Can't necessarily say the workaround worked out: I was able to trigger the BatchPredictionJob with that approach, but the job seemed to run forever despite only passing a JSONL file with just two prompts.

You are right, the documentation doesn't seem to be fully in place. I tried the input format given here and a few variations, but it has been more trial and error than anything.
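For reference, the two-prompt JSONL being passed looked roughly like the sketch below. The "prompt" field name and the example prompts are just one of the variations tried, not a confirmed schema for Mistral batch input on Vertex AI.

import json

# Two prompts, written one JSON object per line, then uploaded to the
# gcs_source bucket. Field name "prompt" is an assumption, not a confirmed schema.
records = [
    {"prompt": "Summarize the main differences between L1 and L2 regularization."},
    {"prompt": "Write a short haiku about cloud computing."},
]

with open("batch_prompts.jsonl", "w") as f:
    for record in records:
        f.write(json.dumps(record) + "\n")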

Also, I'm unable to view the ticket for some reason.