I'm currently doing the Google Cloud Skills Boost course "Conversational AI on Vertex AI and Dialogflow CX". One of the videos states:
"You can add several generators in one fulfillment, which would be executed sequentially, one taking the output from the other as input. That way you can chain multiple LLMs, which might make it easier for debugging specific steps. One generator could provide information about destinations and the other could format or summarize that destination output to the user. That way you only have one concise response, but you have 2 separate prompts with clear instructions."
The final answer, the one I print, is $request.generative.fallback_location. I can see that the second generator isn't taking the first generator's output as an input parameter:
Both generators give me an output, and the second generator's output should be the [AI] text:
But it's just asking me what text I want to extract. I wonder if I misunderstood something or am doing something wrong?