generative-ai-python

BrokenResponseError: Can not build a coherent char history after a broken streaming response (See the previous Exception fro details). To inspect the last response object, use `chat.last`. To remove the last request/response `Content` objects from the chat call `last_send, last_received = chat.rewind()` and continue without it

Open reloginn opened this issue 1 year ago • 3 comments

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/asahi/Проекты/Python/gemini/gemini/__init__.py", line 5, in main
    run()
  File "/home/asahi/Проекты/Python/gemini/gemini/main.py", line 81, in run
    process_directory(SRC, DST)
  File "/home/asahi/Проекты/Python/gemini/gemini/main.py", line 73, in process_directory
    translate_file(src_path=src_path, dst_path=dst_path)
  File "/home/asahi/Проекты/Python/gemini/gemini/main.py", line 48, in translate_file
    translated = translate(src_path)
                 ^^^^^^^^^^^^^^^^^^^
  File "/home/asahi/Проекты/Python/gemini/gemini/main.py", line 36, in translate
    second_response = chat.send_message(content=MESSAGE_FOR_SECOND_PART + second_part, stream=True)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/asahi/.cache/pypoetry/virtualenvs/gemini-UM_oiH3g-py3.12/lib/python3.12/site-packages/google/generativeai/generative_models.py", line 467, in send_message
    history = self.history[:]
              ^^^^^^^^^^^^
  File "/home/asahi/.cache/pypoetry/virtualenvs/gemini-UM_oiH3g-py3.12/lib/python3.12/site-packages/google/generativeai/generative_models.py", line 686, in history
    raise generation_types.BrokenResponseError(
google.generativeai.types.generation_types.BrokenResponseError: Can not build a coherent char history after a broken streaming response (See the previous Exception fro details). To inspect the last response object, use `chat.last`. To remove the last request/response `Content` objects from the chat call `last_send, last_received = chat.rewind()` and continue without it.

code:

# --snip--
chat = model.start_chat()
PART_LEN = len(content) // 2
first_part = content[:PART_LEN]
second_part = content[PART_LEN:]
first_response = chat.send_message(content=MESSAGE_FOR_FIRST_PART + first_part, stream=True)
for chunk in first_response:
    chunks.append(chunk)
second_response = chat.send_message(content=MESSAGE_FOR_SECOND_PART + second_part, stream=True)
for chunk in second_response:
    chunks.append(chunk)

reloginn avatar Apr 30 '24 15:04 reloginn

Typo fix: https://github.com/google-gemini/generative-ai-python/pull/313

It's trying to tell you that the first stream broke, and it's not sure what to do. Is the first part not raising an error?

MarkDaoust avatar May 02 '24 17:05 MarkDaoust

The first part doesn't raise any errors; only the second part does. Everything worked previously with identical code (I'm automating translation, and I managed to translate about 15 .md files), then this error just started appearing and nothing helps.

reloginn avatar May 03 '24 17:05 reloginn

def translate(path: str) -> str:
    # Read the file as text. (Note: the original `str(io.FileIO(path).read())`
    # converts bytes to their repr, embedding "b'...'" in the prompt.)
    with open(path, encoding="utf-8") as file:
        content = file.read()
    model = genai.GenerativeModel(model_name=MODEL, generation_config=GENERATION_CONFIG)
    chunks = []
    chat = model.start_chat()
    PART_LEN = len(content) // 2
    first_part = content[:PART_LEN]
    second_part = content[PART_LEN:]
    first_response = chat.send_message(content=MESSAGE_FOR_FIRST_PART + first_part, stream=True)
    for chunk in first_response:
        chunks.append(chunk)
    second_response = chat.send_message(content=MESSAGE_FOR_SECOND_PART + second_part, stream=True)
    for chunk in second_response:
        chunks.append(chunk)
    result = "".join(
        part.text
        for chunk in chunks
        for candidate in chunk.candidates
        for part in candidate.content.parts
    )
    return result
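For reference, the flattening comprehension at the end of translate() can be exercised in isolation with stand-in chunk objects (these SimpleNamespace stand-ins are hypothetical and only mirror the candidates/content/parts nesting, not the real SDK response types):

```python
from types import SimpleNamespace as NS

# Two fake streaming chunks with the same nesting the comprehension walks:
# chunk.candidates[i].content.parts[j].text
chunks = [
    NS(candidates=[NS(content=NS(parts=[NS(text="Hello, "), NS(text="wor")]))]),
    NS(candidates=[NS(content=NS(parts=[NS(text="ld")]))]),
]

# Flatten every part's text across all chunks and candidates, in order.
result = "".join(
    part.text
    for chunk in chunks
    for candidate in chunk.candidates
    for part in candidate.content.parts
)
print(result)  # Hello, world
```

This makes it easy to see that a chunk whose candidate has an empty `parts` list (e.g. a blocked or broken response) simply contributes nothing to the joined string.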

The error occurs here:

second_response = chat.send_message(content=MESSAGE_FOR_SECOND_PART + second_part, stream=True)

reloginn avatar May 03 '24 17:05 reloginn
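The error message itself points at the recovery path: inspect `chat.last`, then call `chat.rewind()` to drop the broken turn and continue. The flow can be sketched with a minimal stand-in chat class (this `StandInChat` is hypothetical and only mimics the `last`/`rewind()` surface the error message names; it is not the real google.generativeai `ChatSession`):

```python
class StandInChat:
    """Hypothetical stand-in for a chat session, mimicking only the two
    attributes the error message mentions: `last` (the most recent response)
    and `rewind()` (remove the last request/response pair)."""

    def __init__(self):
        self.history = []  # alternating ("user", text) / ("model", text) turns
        self.last = None

    def send_message(self, content, fail=False):
        self.history.append(("user", content))
        reply = "<broken streaming response>" if fail else f"echo: {content}"
        self.history.append(("model", reply))
        self.last = reply
        if fail:
            raise RuntimeError("stream broke mid-response")
        return reply

    def rewind(self):
        # Drop the last request/response pair so the history is coherent again.
        last_received = self.history.pop()
        last_send = self.history.pop()
        return last_send[1], last_received[1]


chat = StandInChat()
chat.send_message("part one")
try:
    chat.send_message("part two", fail=True)
except RuntimeError:
    # Inspect chat.last if needed, then discard the broken turn and retry.
    sent, received = chat.rewind()

print(len(chat.history))  # 2: only the successful first exchange remains
```

With the real SDK the same try/except around the streaming loop, followed by `chat.rewind()`, lets the script retry the failed part (or skip it) instead of crashing on the next `send_message` call.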