
Context size

I am using Generative AI Studio to create a structured language prompt. When I pasted in a large block of text, I got an error on submit; when I trimmed the text, it worked. What is the default maximum context size, and is there a configuration setting to increase it? Thanks.

Solved
1 ACCEPTED SOLUTION

Hi @AnilSomani,

The maximum input and output token limits in Generative AI Studio depend on the model you choose; you can refer to this document for more information.

You can adjust the maximum number of output tokens in the prompt settings, but please note that increasing it may result in poorer model performance or predictions.
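If you prefer to set this outside the Studio UI, here is a minimal sketch using the Vertex AI Python SDK. The project ID, region, model name, and parameter values are assumptions, and on older SDK versions the import may live under vertexai.preview.language_models.

```python
import vertexai
from vertexai.language_models import TextGenerationModel

# Assumed project and region; replace with your own.
vertexai.init(project="my-project", location="us-central1")

# text-bison is used here only as an example PaLM text model.
model = TextGenerationModel.from_pretrained("text-bison@001")

response = model.predict(
    "Summarize the following text: ...",
    max_output_tokens=1024,  # adjust within the model's output token limit
    temperature=0.2,
)
print(response.text)
```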

Here are some resources for your reference:

The PaLM API has a maximum input token limit of 8k and a maximum output token limit of 1k. If the input or output exceeds these limits, the safety classifiers will not be applied, which could ultimately lead to poor model performance.

A token is approximately four characters. 100 tokens correspond to roughly 60-80 words.
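Based on that rough heuristic, a quick pre-check before submitting a large prompt could look like the sketch below. The 8,192 figure is an assumption of what the "8k" input limit maps to; confirm the exact value for your model.

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the ~4 characters per token heuristic."""
    return max(1, len(text) // 4)

# Assumed to correspond to the "8k" input limit mentioned above;
# check the model's documented limit for the exact number.
INPUT_TOKEN_LIMIT = 8192

prompt = "..."  # your prompt text
estimated = estimate_tokens(prompt)
if estimated > INPUT_TOKEN_LIMIT:
    print(f"Prompt is likely too long (~{estimated} tokens); trim it before submitting.")
```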

Hope this helps.


2 REPLIES

What about RPM (requests per minute) limits?