Hello community, I have an issue with the Vertex AI text-bison and chat-bison models. I tried asking a question about the contents of a document using the Vertex AI API. With the text-bison model, I'm not receiving any response. With the chat-bison model, I'm receiving the following error response: 'I'm not able to help with that, as I'm only a language model. If you believe this is an error, please send us your feedback.' Can you help me fix this?
Hi @KannanG03,
Thank you for reaching out to the community.
The response message ("I'm not able to help with that, as I'm only a language model") that you are receiving is called a fallback response. It is usually triggered by a poor-quality prompt, a language or location that is not yet supported, or an inquiry that trips a safety filter.
If the model responds to a request with a scripted response like "I'm not able to help with that, as I'm only a language model," it means that either the input or the output is triggering a safety filter. If you feel that a safety filter is being inappropriately triggered, please click Report Inappropriate Responses on the Generative AI Studio Overview page in the Google Cloud console to report the issue.
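If you want to confirm this programmatically, here is a minimal sketch assuming the Python `vertexai` SDK; the project ID, region, and prompt are placeholders, and fields such as `is_blocked` and `safety_attributes` may differ between SDK versions, so treat it as illustrative rather than definitive:

```python
# Minimal sketch: inspect why a PaLM response came back empty or blocked.
# Assumes the Python `vertexai` SDK; project ID and region are placeholders.
import vertexai
from vertexai.language_models import TextGenerationModel

vertexai.init(project="your-project-id", location="us-central1")

model = TextGenerationModel.from_pretrained("text-bison@001")
response = model.predict(
    "Summarize the attached policy document in three sentences.",
    temperature=0.2,
    max_output_tokens=256,
)

print("Text:", response.text or "<empty>")
# If the SDK exposes them, these fields indicate whether a safety filter fired
# and which categories were scored.
print("Blocked by safety filter:", getattr(response, "is_blocked", "n/a"))
print("Safety attributes:", getattr(response, "safety_attributes", "n/a"))
```

If `is_blocked` comes back `True` or `text` is empty, rephrasing the prompt or trimming sensitive passages from the document excerpt is often enough to avoid the fallback response.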
Here are some useful resources for your reference:
I also found some posts from our community and Stack Overflow that are relevant to your concern.
Hope this helps.
Hello,
It seems like you're encountering some issues with using the Vertex AI text-bison and chat-bison models. Let's troubleshoot the problems you're facing:
1. **No Response from text-bison Model:**
- Double-check that your API request is properly formatted, including the input text and any required parameters.
- Ensure that your API token and authentication are set up correctly.
- Review the documentation and examples provided by Vertex AI to make sure you're following the correct usage (a minimal request sketch for both models follows this list).
2. **Error Response from chat-bison Model:**
- The error message suggests that your input might be outside the model's scope. Check if you're providing valid input and following the appropriate conversational context format.
- Make sure you're using the right API endpoint and method for chat-based interactions.
- Confirm that the chat-bison model supports the specific task you're trying to accomplish.
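For reference, here is a minimal sketch of well-formed calls to both models using the Python `vertexai` SDK; the project ID, region, prompts, and document excerpt are placeholders, and the parameter values are only illustrative defaults:

```python
# Minimal sketch of well-formed text-bison and chat-bison calls.
# Placeholders: project ID, region, prompts, and the document excerpt.
import vertexai
from vertexai.language_models import ChatModel, TextGenerationModel

vertexai.init(project="your-project-id", location="us-central1")

# 1. text-bison: single-shot prompt with explicit generation parameters.
text_model = TextGenerationModel.from_pretrained("text-bison@001")
text_response = text_model.predict(
    "Context:\n<paste the relevant document excerpt here>\n\n"
    "Question: What is the document's effective date?",
    temperature=0.2,
    max_output_tokens=512,
    top_p=0.8,
    top_k=40,
)
print(text_response.text)

# 2. chat-bison: a chat session carries the conversational context for you.
chat_model = ChatModel.from_pretrained("chat-bison@001")
chat = chat_model.start_chat(
    context="You answer questions strictly from the document excerpt provided by the user.",
)
chat_response = chat.send_message(
    "Here is the excerpt: <paste excerpt>. What is the effective date?"
)
print(chat_response.text)
```

Keeping the pasted excerpt short and the question specific also helps, since very long or loosely related context is more likely to produce an empty or fallback response.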
If you've reviewed these points and are still encountering issues, consider the following steps:
1. **Check Model Versions:** Ensure you're using the latest version of the models and the corresponding API endpoints (see the version-pinning snippet after this list).
2. **API Documentation:** Thoroughly review the official Vertex AI API documentation for the models you're using. It should provide detailed instructions on how to structure requests, handle responses, and troubleshoot common issues.
3. **Contact Support:** If the problems persist, reach out to the Vertex AI support team or community forums for assistance. They can provide specific guidance based on your use case.
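On point 1, you can pin a model version explicitly rather than relying on the unversioned alias. A small illustrative snippet follows; the version suffixes actually available depend on your region and the current release notes:

```python
# Illustrative only: pin an explicit model version instead of relying on
# whatever the unversioned alias currently resolves to.
from vertexai.language_models import ChatModel, TextGenerationModel

pinned_text_model = TextGenerationModel.from_pretrained("text-bison@001")
pinned_chat_model = ChatModel.from_pretrained("chat-bison@001")

# The unversioned name resolves to the latest stable release of the model.
latest_text_model = TextGenerationModel.from_pretrained("text-bison")
```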
Adjusting your approach per the documentation and guidance above should help resolve the issues you're facing. Hope it helps.
I have written a series of articles on how to implement your own bot using the `chat-bison` model; you might want to give them a look. A bare-bones starting point is also sketched after the list.
1. How to Create Chatbot Using PaLM 2 Model (chat-bison@001)
2. How to Fine-tune the chatbot(chat-bison@001)
3. Google Cloud: Vertex AI — Get Familiar with Terminologies
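If you want a quick starting point before diving into the articles, here is a bare-bones sketch of a `chat-bison@001` bot with a context string and one few-shot example, using the Python `vertexai` SDK; the project ID, region, context, and example texts are all placeholders:

```python
# Bare-bones chat-bison@001 bot: context plus one few-shot example.
# Placeholders: project ID, region, context, and example texts.
import vertexai
from vertexai.language_models import ChatModel, InputOutputTextPair

vertexai.init(project="your-project-id", location="us-central1")

chat_model = ChatModel.from_pretrained("chat-bison@001")
chat = chat_model.start_chat(
    context="You are a helpful assistant that answers questions about company policy documents.",
    examples=[
        InputOutputTextPair(
            input_text="What does the leave policy say about carry-over days?",
            output_text="According to the policy, up to five unused days may be carried into the next year.",
        )
    ],
)

# Simple REPL-style loop; enter an empty line to stop.
while True:
    user_message = input("You: ").strip()
    if not user_message:
        break
    print("Bot:", chat.send_message(user_message, temperature=0.2).text)
```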
Let me know if you have any other questions; I would be happy to help.