
Trying to use the chat history as context

 

import { VertexAI, HarmCategory, HarmBlockThreshold } from '@google-cloud/vertexai';

// Placeholder values -- replace with your own project and region.
const project = 'your-project-id';
const location = 'us-central1';
const textModel = 'gemini-1.5-pro-preview-0409';

function initVertex() {
    const vertexAI = new VertexAI({ project, location });

    // Turn off all safety blocking for this model instance.
    return vertexAI.preview.getGenerativeModel({
        model: textModel,
        safetySettings: [
            { category: HarmCategory.HARM_CATEGORY_HATE_SPEECH, threshold: HarmBlockThreshold.BLOCK_NONE },
            { category: HarmCategory.HARM_CATEGORY_DANGEROUS_CONTENT, threshold: HarmBlockThreshold.BLOCK_NONE },
            { category: HarmCategory.HARM_CATEGORY_SEXUALLY_EXPLICIT, threshold: HarmBlockThreshold.BLOCK_NONE },
            { category: HarmCategory.HARM_CATEGORY_HARASSMENT, threshold: HarmBlockThreshold.BLOCK_NONE }
        ],
    });
}

export async function startChat(question: string) {
    const model = initVertex();
    // A fresh chat is created on every invocation, so previous history is lost.
    const chat = model.startChat();
    const result = await chat.sendMessage(question);
    return result.response.candidates?.at(0)?.content.parts.at(0)?.text;
}

 

Hi,

I'm using Vertex AI for a project of mine. Basically, I have a Google Cloud Function that gets the generated text from the model. The problem is that each time the function is called a new chat starts, and the history of the previous chat is gone. Is there a way for me to use the chat history as context, or any other way to make use of it? I currently save the chat history in a document.

Thank you


I faced the same issue, and I found the Vertex AI documentation to be lacking. However, I came up with a workaround. I save the chat messages in a database (you can use a JSON file or SQLite on Cloud Storage to save money) with columns for user_id, user_question, and bot_response. Then, each time the Cloud Function runs, it retrieves the previous messages for that user and builds a chat history for the model in the format Vertex expects.
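As a sketch of that storage layer (the table name, column names, and function names here are my own choices, not anything from the Vertex SDK), using SQLite:

```python
import sqlite3

def init_db(path="chat_history.db"):
    """Create the chat history table if it does not exist yet."""
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS chat_history (
               id INTEGER PRIMARY KEY AUTOINCREMENT,
               user_id TEXT NOT NULL,
               user_question TEXT NOT NULL,
               bot_response TEXT NOT NULL
           )"""
    )
    conn.commit()
    return conn

def save_turn(conn, user_id, user_question, bot_response):
    """Persist one question/answer pair for a user."""
    conn.execute(
        "INSERT INTO chat_history (user_id, user_question, bot_response) "
        "VALUES (?, ?, ?)",
        (user_id, user_question, bot_response),
    )
    conn.commit()

def load_messages(conn, user_id, limit=10):
    """Fetch the most recent `limit` turns for a user, oldest first."""
    rows = conn.execute(
        "SELECT user_question, bot_response FROM chat_history "
        "WHERE user_id = ? ORDER BY id DESC LIMIT ?",
        (user_id, limit),
    ).fetchall()
    # Rows come back newest-first; reverse so the history reads in order.
    return [{"user_question": q, "bot_response": r} for q, r in reversed(rows)]
```

The `load_messages` output uses the same `user_question`/`bot_response` keys the history-building code below expects.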

from vertexai.generative_models import GenerativeModel, ChatSession, Content, Part

def generate_llm_history(messages):
    """
    Build the chat history in the format Vertex AI expects.
    :param messages: The messages loaded from the DB
    :return: A list of Content objects alternating user/model roles
    """
    history = []
    for message in messages:
        history.append(Content(role="user", parts=[Part.from_text(message["user_question"])]))
        history.append(Content(role="model", parts=[Part.from_text(message["bot_response"])]))
    return history

Then, you can pass the history to create the session.

chat_session = ChatSession(model=model, history=history)
response = chat_session.send_message(prompt, generation_config=parameters)

The good thing about this is that you can control the history window by retrieving just the last n messages. You can also clear the history if the user wants to.
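Both of those can be done with a couple of small helpers; a minimal sketch, assuming messages are a list of per-turn dicts and are stored in a `chat_history` table keyed by `user_id` (names chosen for illustration):

```python
import sqlite3

def trim_history(messages, max_turns=5):
    """Keep only the most recent `max_turns` question/answer pairs
    before converting them into Content objects for the model."""
    return messages[-max_turns:]

def clear_history(conn, user_id):
    """Delete all stored turns for a user, resetting their conversation."""
    conn.execute("DELETE FROM chat_history WHERE user_id = ?", (user_id,))
    conn.commit()
```

Trimming keeps the prompt (and token cost) bounded no matter how long the conversation gets, while `clear_history` gives the user an explicit "start over".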

Thank you for this. I will try it out.