
Vertex AI text-bison-32k

Dear support,

I have a Vertex AI project that I use for predictions. Yesterday I gave the model a prompt and it generated the required number of tokens. Today I gave it exactly the same prompt, and the number of generated tokens dropped to about a quarter of what it was before. The temperature is 0, top p is 0, top k is 40, and max output tokens is set to 8190. I would like to know the reason for this shrinkage in the number of generated tokens, even though I didn't change anything related to the prediction parameters.
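For reference, below is a minimal sketch of how I am passing these parameters with the Vertex AI Python SDK. The project ID, location, and prompt text are placeholders; the parameter values are the ones described above.

```python
# Minimal sketch of the prediction call, assuming the Vertex AI Python SDK.
# Project ID, location, and prompt are placeholders.
import vertexai
from vertexai.language_models import TextGenerationModel

vertexai.init(project="your-project-id", location="us-central1")

model = TextGenerationModel.from_pretrained("text-bison-32k")
response = model.predict(
    "Your prompt here",      # the same prompt on both days
    temperature=0,           # deterministic as possible
    top_p=0,
    top_k=40,
    max_output_tokens=8190,  # value reported above
)
print(response.text)
```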
