BrokenResponseError: Can not build a coherent char history after a broken streaming response (See the previous Exception fro details). To inspect the last response object, use `chat.last`.To remove the last request/response `Content` objects from the chat call `last_send, last_received = chat.rewind()` and continue without it
code:
# --snip--
chat = model.start_chat()
PART_LEN = len(content) // 2
first_part = content[:PART_LEN]
second_part = content[PART_LEN:]
first_response = chat.send_message(content=MESSAGE_FOR_FIRST_PART + first_part, stream=True)
for chunk in first_response:
    chunks.append(chunk)
second_response = chat.send_message(content=MESSAGE_FOR_SECOND_PART + second_part, stream=True)
for chunk in second_response:
    chunks.append(chunk)
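For reference, the recovery path the error message itself describes would look roughly like this (a sketch only, using the `chat` session from the snippet above):

# Sketch of what the BrokenResponseError message suggests:
# inspect the broken response, then drop it from the chat history.
print(chat.last)                          # the last (broken) response object
last_send, last_received = chat.rewind()  # removes the last request/response Content pair
# the chat can then be continued without that exchange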
Typo fix: https://github.com/google-gemini/generative-ai-python/pull/313
It's trying to tell you that the first stream broke, and it's not sure what to do. Is the first part not raising an error?
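One way to surface that previous exception (a rough sketch, reusing the names from the snippet above) is to wrap the iteration of the first stream, so whatever broke it shows up before the second send_message is ever reached:

try:
    for chunk in first_response:
        chunks.append(chunk)
except Exception as exc:  # whatever interrupted the first stream surfaces here
    print(f"first stream broke: {exc!r}")
    raise

If I remember the API correctly, streaming responses also expose a `resolve()` method that consumes the remaining chunks in one call, if you'd rather not iterate manually.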
The first part doesn't raise any errors; only the second part does. Everything worked before with identical code (I automate translation and managed to translate about 15 .md files), then this error just started appearing and nothing helps.
def translate(path: str) -> str:
    file = io.FileIO(file=path)
    content = str(file.read())
    model = genai.GenerativeModel(model_name=MODEL, generation_config=GENERATION_CONFIG)
    chunks = []
    chat = model.start_chat()
    PART_LEN = len(content) // 2
    first_part = content[:PART_LEN]
    second_part = content[PART_LEN:]
    first_response = chat.send_message(content=MESSAGE_FOR_FIRST_PART + first_part, stream=True)
    for chunk in first_response:
        chunks.append(chunk)
    second_response = chat.send_message(content=MESSAGE_FOR_SECOND_PART + second_part, stream=True)
    for chunk in second_response:
        chunks.append(chunk)
    result = "".join(part.text for chunk in chunks for candidate in chunk.candidates for part in candidate.content.parts)
    return result
The error occurs here:
second_response = chat.send_message(content=MESSAGE_FOR_SECOND_PART + second_part, stream=True)
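Since the exception is only raised once the second send_message tries to extend the chat history, it may help to look at what the first stream actually returned before that point. A rough diagnostic sketch (the attribute names follow the response objects the function above already touches, plus finish_reason and prompt_feedback, which I'm assuming are available in your SDK version):

# Diagnostic sketch: check why the first stream may have ended early.
print(first_response.prompt_feedback)        # e.g. a block reason, if the prompt was rejected
if chunks:
    for candidate in chunks[-1].candidates:  # last chunk received from the first stream
        print(candidate.finish_reason)       # e.g. STOP vs. MAX_TOKENS vs. SAFETY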