
Dialogflow CX - Agent Settings Generative AI

Hi, 

Ask - I am trying to figure out why "Chat Bison" is not an available option in the generative model selection dropdown, within Agent Settings > ML > Generative AI > Generative Model Selection.

I am only able to select Text Bison.

Thank you in advance.


Hi @dmalikian01!

LLMs in Dialogflow are used simply to generate responses, whether through generative fallback, generators, or data stores.

 
 
The text-bison foundation model is optimized for a variety of natural language tasks such as sentiment analysis, entity extraction, and content creation. The types of content that the text-bison model can create include document summaries, answers to questions, and labels that classify content.
 
whereas with chat-bison you would build conversational apps from scratch, like Bard or ChatGPT.
 
Here in Dialogflow CX, the conversation itself is managed by the Dialogflow CX engine, which is why you do not need chat-bison.
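To make the distinction concrete, here is a hypothetical sketch in plain Python (not the actual Vertex AI SDK): a chat model is essentially a stateless text model plus a wrapper that stores session history, and Dialogflow CX already owns that history, so the stateless call is all it needs.

```python
# Hypothetical illustration only -- these names and the "model" are made up
# to show the shape of the two APIs, not real Vertex AI code.

def generate_text(prompt: str) -> str:
    """Stateless call (text-bison style): everything the model sees
    must be passed in the prompt on every call."""
    return f"[model reply to: {prompt!r}]"

class ChatSession:
    """Chat wrapper (chat-bison style): it stores the multi-turn
    history itself and rebuilds the prompt from it each turn."""

    def __init__(self) -> None:
        self.history: list[str] = []

    def send_message(self, message: str) -> str:
        self.history.append(f"User: {message}")
        # The wrapper re-sends its own stored history as context.
        reply = generate_text("\n".join(self.history))
        self.history.append(f"Model: {reply}")
        return reply

# Dialogflow CX already tracks the session state itself, so it only
# needs the stateless call, supplying its own history in the prompt.
session = ChatSession()
session.send_message("Hi there")
session.send_message("Tell me more")
```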
 
Best,
Xavi
 

Hello @dmalikian01 

Great question!

No, it is intentionally not offered: chat-bison is essentially a wrapper around text-bison that implements multi-turn conversations, sessions, and conversation history. And as Xavi said earlier in this thread, Dialogflow CX is a state machine that already handles multi-turn conversations, sessions, and conversation history. So I think the universe might implode if one were to use Dialogflow CX with chat-bison. 😄

In all seriousness, the idea behind the generative features in Dialogflow CX / Vertex AI Conversation is that Dialogflow CX is really good at handling stateful conversations, history, and sessions, and it can call out to text-bison (or code-bison or other LLMs) while managing all of the state itself. For example, most of the calls made from Generative AI Agent and Generative Fallback include the conversation history from Dialogflow CX as part of the prompt to the LLM, and in Generators you can send the conversation history in the prompt by using the $conversation and $last-user-utterance built-in placeholders (in addition to your own session variables, to make the prompt more contextual).

In reality, tracking multi-turn conversations with intertwined calls to LLMs is a complicated dance: in most Vertex AI Conversation use cases it works great, in some edge cases it breaks, and in other specialized cases users might fall back (pun intended) to using chat-bison directly in their apps instead of Dialogflow CX.
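For illustration, the Generator placeholder substitution can be sketched like this. This is a simplified simulation of the behavior, not Dialogflow's actual implementation; $conversation and $last-user-utterance are the real built-in placeholder names, while the function and template here are made up:

```python
# Simplified simulation: how Dialogflow CX might fill the built-in
# $conversation and $last-user-utterance placeholders in a Generator
# prompt before sending it to a stateless model such as text-bison.

def build_generator_prompt(template: str,
                           history: list[tuple[str, str]],
                           last_user_utterance: str) -> str:
    """Fill the placeholders with state that Dialogflow CX tracks."""
    conversation = "\n".join(f"{speaker}: {text}" for speaker, text in history)
    return (template
            .replace("$conversation", conversation)
            .replace("$last-user-utterance", last_user_utterance))

# Hypothetical Generator prompt template using the built-in placeholders.
template = (
    "You are a helpful support agent.\n"
    "Conversation so far:\n$conversation\n"
    "User just said: $last-user-utterance\n"
    "Reply helpfully:"
)

history = [("User", "Hi, my order is late."),
           ("Agent", "Sorry to hear that!")]
prompt = build_generator_prompt(template, history, "Can I get a refund?")
print(prompt)
```

The point of the sketch: the model call itself stays stateless, and all the session state is injected into the prompt by Dialogflow CX at substitution time.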
 
Hope that helps!
Alessia
 

Is it possible to directly use a fine-tuned LLM (trained on our own datasets) in a Dialogflow CX agent?

Hi, fine-tuned/custom models are not currently supported by any of the Generative AI features in Dialogflow CX.

Hi Alessia, 

Are there plans to make Gemini Pro available in Dialogflow CX?

Hey Stephen, I'm not sure what the roadmap looks like for Gemini in Dialogflow. I have asked the PMs and will let you know once I find out.

Thanks, will wait for the update

Regards,
Stephen
Data Architect - Data & Analytics

I see it there to select. It is in Agent Settings > Generative AI; scroll to the bottom of the screen for a dropdown to select the generative model.

Hello Stephen, 

Currently I don't have specific information on whether or when new models will be available for the recent Generative AI features (Generators, Generative Fallback, etc.). Rest assured, we'll continue to bring the best-performing models available to the product.