Hi @alessiasacchi,
There is no option available in Dialogflow CX to directly use a fine-tuned LLM model (trained on our own datasets) in a Dialogflow CX agent. What other approaches would make this possible?
Hi!
To use a fine-tuned model, you will need to create a webhook and call that model from that webhook.
Best,
Xavi
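As a minimal sketch of that approach: the webhook receives the Dialogflow CX WebhookRequest, forwards the user's utterance to the model, and wraps the model's reply in a WebhookResponse. The model call is injected as a callable here so the routing logic stays self-contained; any real endpoint or project names would be placeholders you'd substitute.

```python
# Minimal sketch of a Dialogflow CX webhook handler that forwards the
# user's utterance to a fine-tuned model. The model call is injected as
# a callable so the handler itself needs no cloud credentials to test.

def build_webhook_response(reply_text):
    # Shape of a Dialogflow CX WebhookResponse carrying one text message.
    return {
        "fulfillmentResponse": {
            "messages": [{"text": {"text": [reply_text]}}]
        }
    }

def handle_webhook(request_json, call_model):
    # Dialogflow CX puts the end-user's utterance under "text" and any
    # session parameters under sessionInfo.parameters.
    utterance = request_json.get("text", "")
    params = (request_json.get("sessionInfo") or {}).get("parameters") or {}
    return build_webhook_response(call_model(utterance, params))
```

In production, `call_model` would be the piece that invokes your fine-tuned model, e.g. a Vertex AI endpoint prediction call.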
I slightly disagree with this approach. It's technically doable, but I don't see the advantage of calling an LLM without passing any context from the dialogue. Placeholders like $conversation, $last-user-utterance, $original-query and other built-in placeholders are not available outside of generators, generative fallback, and data store summarization prompts. Those placeholders are replaced with the appropriate values at runtime, and the final text is sent to the LLM. If you're calling a custom model via webhook in Dialogflow CX, you are missing a lot of context.
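One way to partially work around that limitation: the webhook can approximate $conversation itself by accumulating turns in a session parameter that it both reads from the request and writes back in the response. A hedged sketch, where `conversation_history` is a hypothetical parameter name maintained entirely by the webhook, not a Dialogflow CX built-in:

```python
# Sketch: approximating the $conversation placeholder from a webhook by
# round-tripping a history list through sessionInfo.parameters.
# "conversation_history" is an invented parameter name, not a built-in.

def respond_with_history(request_json, reply_text):
    params = (request_json.get("sessionInfo") or {}).get("parameters") or {}
    history = list(params.get("conversation_history", []))
    history.append({"role": "user", "text": request_json.get("text", "")})
    history.append({"role": "agent", "text": reply_text})
    return {
        "fulfillmentResponse": {
            "messages": [{"text": {"text": [reply_text]}}]
        },
        # Writing parameters back makes them available on the next turn.
        "sessionInfo": {
            "parameters": {**params, "conversation_history": history}
        },
    }
```

On the next webhook call, the accumulated history arrives back in `sessionInfo.parameters` and can be prepended to the prompt sent to the custom model.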
Can't these values be passed in from the request, or filled in from predefined parameters?
Could you explain the details of creating a webhook and calling the model? After I create a webhook, I always get this error:
Webhook Error: HTTPSConnectionPool(host='8587947774488608768.us-central1-949136842740.prediction.vertexai.goog', port=443): Max retries exceeded with url: /v1/projects/utility-replica-451919-e3/locations/us-central1/endpoints/8587947774488608768:predict (Caused by NameResolutionError("<urllib3.connection.HTTPSConnection object at 0x3e9524142530>: Failed to resolve '8587947774488608768.us-central1-949136842740.prediction.vertexai.goog' ([Errno -2] Name or service not known)"))
I don't know how to solve it. Thanks.
At the moment, when using a data store agent you can select a generative model from among text-bison@001, text-bison@002, text-bison@001 tuned (conversational), text-bison@001 tuned (informational) and gemini-pro. You can also provide your own prompt for the summarization LLM call. If you need to use a fine-tuned model, you can develop Python code to call the Gemini / PaLM 2 API for multi-turn conversations and use your fine-tuned model from Model Garden.
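A hedged sketch of what that Python code might look like, assuming a tuned text-bison model in Vertex AI. Since text models (unlike chat models) take a single prompt rather than native chat history, the multi-turn context is flattened into the prompt first; the project, location, and tuned-model resource name are placeholders:

```python
# Sketch: calling a fine-tuned PaLM 2 text-bison model for a multi-turn
# conversation. The tuned model name would be the full resource path,
# e.g. "projects/PROJECT/locations/us-central1/models/MODEL_ID".

def format_multiturn_prompt(history, user_message):
    # Flatten prior (author, text) turns into one prompt, since text
    # models have no native message history the way chat models do.
    lines = ["%s: %s" % (author, text) for author, text in history]
    lines.append("user: %s" % user_message)
    lines.append("assistant:")
    return "\n".join(lines)

def query_tuned_text_bison(project, tuned_model_name, history, user_message):
    # Imports kept inside the function so the prompt helper above can be
    # used without the google-cloud-aiplatform package installed.
    import vertexai
    from vertexai.language_models import TextGenerationModel

    vertexai.init(project=project, location="us-central1")
    model = TextGenerationModel.get_tuned_model(tuned_model_name)
    prompt = format_multiturn_prompt(history, user_message)
    return model.predict(prompt, temperature=0.2, max_output_tokens=256).text
```

`query_tuned_text_bison` would be invoked from the webhook, with the history reconstructed from session parameters as discussed above in the thread.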
I'm trying to achieve the below scenario.
I want to train the PaLM 2 text-bison model on a custom dataset (using the fine-tuning approach) and then get the data from the model into Dialogflow. What would be the best approach to connect Dialogflow with the model, since this fine-tuned model does not appear in Generators?
Hi,
To use a fine-tuned model, you will need to create a webhook and call that model from that webhook.