
A code sample in this response was truncated because it exceeded the maximum allowable output

I've been using Gemini Code Assist for a few weeks now, but in the last 4 or 5 days I've been getting this error while trying to use the AI.
I tried everything: reducing the code, changing accounts, trying different prompts, reinstalling Gemini Code Assist / VS Code (clearing caches as well), and downgrading versions,
but this error just doesn't go away.
And I'm pretty sure my project isn't big enough to cause this error. I've used Gemini Code Assist on bigger, more complex projects and nothing was wrong with it until a few days ago.
I was wondering if I'm doing something wrong or if this is a general bug? Is anyone else having this problem, or is it just me? Is there a fix for it?
And no, I don't want to use agent mode. I want the regular Gemini Code Assist experience, a bug-free one of course 🙂


I have the same problem. It worked well for a day or two, and now it routinely throws this error even when the code block isn't large, just 3-4 lines.

Same here. I'm getting this message constantly when I try to ask something with 2 file attachments.

Hi @ariaebrahimpoor,

Welcome to Google Cloud Community!

This issue is actually listed in the known issues for Gemini Code Assist. The chat responses can get cut off when they involve an updated version of a large open file.

As a workaround, try selecting a smaller section of your code and include a clear instruction in your prompt like “only output the selected code.” That usually helps avoid the truncation.

Was this helpful? If so, please accept this answer as “Solution”. If you need additional assistance, reply here within 2 business days and I’ll be happy to help.

 

This seems to be a bug in the VS Code integration. I saw the truncation error after it had written out 90% of the response. When I wrote a follow-up prompt like the one you suggested, the response from the previous prompt appeared in the chat.

This only started happening to me yesterday at 1 pm. I have had to instruct GCA to only do small fixes. Please see the several other threads on this subject that have been posted in the past 3 days. This has gotten REALLY BAD over the past week. 

My current [irritating] workaround: really small pieces at a time. It really slows me down. Prior to yesterday I could do a month's worth of standard coding in a day with GCA; now that's cut down to maybe 2 days' worth of standard coding in a day.

Here are the responses I need to use to keep GCA following my workaround directives. I keep them in a text file, because I have to use them constantly. 

===========================

messages when choking
----------
You choked again on too large a discussion piece plus the diffs.
 
AGAIN -- and I do get TIRED of repeating this --- because of the current bug in your system -- we have to do tiny pieces.
 
So: 
 
1. No general discussion piece more than 3 paragraphs.
 
2. No diff file added at the end of a discussion piece. Give it its own pieces. 
 
3. No diff file piece larger than 8 changes.
- - - 
You are only writing to your temp file again. Please write to my launcher.py file
- - - -
 
messages while responding
--------
Remember: after a discussion response, the next response should just be the diff file, with no more discussion.
- - -
Excellent. Next small diff, please. And remember: JUST the diff. You've already covered the discussion.
- - -
Excellent. Next small diff, please. And remember: do NOT include a diff file in a discussion response. We need to keep working around your system's bug.
==========================================================

Also: if anyone from Google wants to log onto my system with AnyDesk or somesuch to observe this mess, I'm totally available for that, 6 am till midnight, Pacific time, 7 days a week. This really needs to be solved. GCA in VS Code is the most amazing progging environment I've ever used. And I've been doing this stuff since 1967, including time at MS in the '80s and '90s when Billville was fun.