gpt-engineer
ChunkedEncodingError: ("Connection broken: InvalidChunkLength(got length b'', 0 bytes read)", InvalidChunkLength(got length b'', 0 bytes read))
I tried to create a bot using the following prompt:
Write a discord bot in python that has a slash command named code that takes a prompt as input and generates files on the disk and sends those files back to the user
I got the following error.
This is an error from the generated code, not from gpt-engineer itself, right? :)
It might be related to the generated project, but the project's workspace folder is empty, so my guess is that an error made it crash before any files were generated.
@BirgerMoell how did you make the traceback so pretty? @AntonOsika it seems to be gpt-engineer's error, since I get the same exception while generating a JS-only project. Here is where my run cut off:
level_loader.js
```javascript
class LevelLoader {
  constructor() {
    this.levels = [
      // Level data in JSON format
    ];
  }

  loadLevel(level) {
    return this.levels[level];
  }
}
```
main.js
```javascript
const game = new Game();
game.start();
```

```
Traceback (most recent call last):
  File "/home/ubuntu/src/gpt-engineer/venv/lib/python3.8/site-packages/urllib3/response.py", line 710, in _error_catcher
    yield
  File "/home/ubuntu/src/gpt-engineer/venv/lib/python3.8/site-packages/urllib3/response.py", line 1077, in read_chunked
    self._update_chunk_length()
  File "/home/ubuntu/src/gpt-engineer/venv/lib/python3.8/site-packages/urllib3/response.py", line 1012, in _update_chunk_length
    raise InvalidChunkLength(self, line) from None
urllib3.exceptions.InvalidChunkLength: InvalidChunkLength(got length b'', 0 bytes read)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/ubuntu/src/gpt-engineer/venv/lib/python3.8/site-packages/requests/models.py", line 816, in generate
    yield from self.raw.stream(chunk_size, decode_content=True)
  File "/home/ubuntu/src/gpt-engineer/venv/lib/python3.8/site-packages/urllib3/response.py", line 937, in stream
    yield from self.read_chunked(amt, decode_content=decode_content)
  File "/home/ubuntu/src/gpt-engineer/venv/lib/python3.8/site-packages/urllib3/response.py", line 1106, in read_chunked
    self._original_response.close()
  File "/usr/lib/python3.8/contextlib.py", line 131, in __exit__
    self.gen.throw(type, value, traceback)
  File "/home/ubuntu/src/gpt-engineer/venv/lib/python3.8/site-packages/urllib3/response.py", line 727, in _error_catcher
    raise ProtocolError(f"Connection broken: {e!r}", e) from e
urllib3.exceptions.ProtocolError: ("Connection broken: InvalidChunkLength(got length b'', 0 bytes read)", InvalidChunkLength(got length b'', 0 bytes read))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/ubuntu/src/gpt-engineer/venv/bin/gpt-engineer", line 8, in <module>
    sys.exit(app())
  File "/home/ubuntu/src/gpt-engineer/gpt_engineer/main.py", line 61, in main
    messages = step(ai, dbs)
  File "/home/ubuntu/src/gpt-engineer/gpt_engineer/steps.py", line 121, in gen_clarified_code
    messages = ai.next(messages, dbs.identity["use_qa"])
  File "/home/ubuntu/src/gpt-engineer/gpt_engineer/ai.py", line 55, in next
    for chunk in response:
  File "/home/ubuntu/src/gpt-engineer/venv/lib/python3.8/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 166, in <genexpr>
    return (
  File "/home/ubuntu/src/gpt-engineer/venv/lib/python3.8/site-packages/openai/api_requestor.py", line 692, in <genexpr>
    return (
  File "/home/ubuntu/src/gpt-engineer/venv/lib/python3.8/site-packages/openai/api_requestor.py", line 115, in parse_stream
    for line in rbody:
  File "/home/ubuntu/src/gpt-engineer/venv/lib/python3.8/site-packages/requests/models.py", line 865, in iter_lines
    for chunk in self.iter_content(
  File "/home/ubuntu/src/gpt-engineer/venv/lib/python3.8/site-packages/requests/models.py", line 818, in generate
    raise ChunkedEncodingError(e)
requests.exceptions.ChunkedEncodingError: ("Connection broken: InvalidChunkLength(got length b'', 0 bytes read)", InvalidChunkLength(got length b'', 0 bytes read))
```
I just got this error. I had already given the program some code, so I believe I know what it was writing when it crashed. The output stopped at
`invoices = relationship("InvoiceMapping", back_populates` with `Traceback (most recent call last):` fused right onto the end of it.
The correct line is:
`invoices = relationship("InvoiceMapping", back_populates="bill_of_lading")`
I get the chunk error at the same spot every time:
`urllib3.exceptions.InvalidChunkLength: InvalidChunkLength(got length b'', 0 bytes read)`
I am experiencing the same issue. I've run it roughly 20 times, and every time it has crashed. I've updated the main_prompt in many different ways as well. It's always at a different point in the code for me.
```
Traceback (most recent call last):
  File "/Users/work/anaconda3/lib/python3.10/site-packages/urllib3/response.py", line 761, in _update_chunk_length
    self.chunk_left = int(line, 16)
ValueError: invalid literal for int() with base 16: b''

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/work/anaconda3/lib/python3.10/site-packages/urllib3/response.py", line 444, in _error_catcher
    yield
  File "/Users/work/anaconda3/lib/python3.10/site-packages/urllib3/response.py", line 828, in read_chunked
    self._update_chunk_length()
  File "/Users/work/anaconda3/lib/python3.10/site-packages/urllib3/response.py", line 765, in _update_chunk_length
    raise InvalidChunkLength(self, line)
urllib3.exceptions.InvalidChunkLength: InvalidChunkLength(got length b'', 0 bytes read)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/work/anaconda3/lib/python3.10/site-packages/requests/models.py", line 816, in generate
    yield from self.raw.stream(chunk_size, decode_content=True)
  File "/Users/work/anaconda3/lib/python3.10/site-packages/urllib3/response.py", line 624, in stream
    for line in self.read_chunked(amt, decode_content=decode_content):
  File "/Users/work/anaconda3/lib/python3.10/site-packages/urllib3/response.py", line 816, in read_chunked
    with self._error_catcher():
  File "/Users/work/anaconda3/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/Users/work/anaconda3/lib/python3.10/site-packages/urllib3/response.py", line 461, in _error_catcher
    raise ProtocolError("Connection broken: %r" % e, e)
urllib3.exceptions.ProtocolError: ("Connection broken: InvalidChunkLength(got length b'', 0 bytes read)", InvalidChunkLength(got length b'', 0 bytes read))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/work/anaconda3/bin/gpt-engineer", line 8, in <module>
  File "/Users/work/anaconda3/lib/python3.10/site-packages/gpt_engineer/main.py", line 60, in main
    messages = step(ai, dbs)
  File "/Users/work/anaconda3/lib/python3.10/site-packages/gpt_engineer/steps.py", line 127, in gen_clarified_code
    messages = ai.next(messages, dbs.preprompts["use_qa"])
  File "/Users/work/anaconda3/lib/python3.10/site-packages/gpt_engineer/ai.py", line 55, in next
    for chunk in response:
  File "/Users/work/anaconda3/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 166, in <genexpr>
  File "/Users/work/anaconda3/lib/python3.10/site-packages/openai/api_requestor.py", line 692, in <genexpr>
  File "/Users/work/anaconda3/lib/python3.10/site-packages/openai/api_requestor.py", line 115, in parse_stream
    for line in rbody:
  File "/Users/work/anaconda3/lib/python3.10/site-packages/requests/models.py", line 865, in iter_lines
    for chunk in self.iter_content(
  File "/Users/work/anaconda3/lib/python3.10/site-packages/requests/models.py", line 818, in generate
    raise ChunkedEncodingError(e)
requests.exceptions.ChunkedEncodingError: ("Connection broken: InvalidChunkLength(got length b'', 0 bytes read)", InvalidChunkLength(got length b'', 0 bytes read))
```
Here is some possible clarification (from ChatGPT) on what's going on with this issue: https://chat.openai.com/share/83b0156d-aa19-45a8-b6d5-d5baf7edfe51
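As I understand it, the gist is: the OpenAI response is streamed with HTTP chunked transfer encoding, where every chunk is prefixed by its size in hex. When the connection is cut mid-stream, urllib3 reads an empty size line and the hex parse fails, which is exactly the `ValueError` / `InvalidChunkLength` pair in the tracebacks above. A tiny illustration of the failing parse (not library code, just the same `int(line, 16)` call):

```python
# What urllib3's _update_chunk_length() effectively does with the chunk-size
# line it reads from the socket; an empty line means the server hung up.
line = b""  # what gets read when the stream is cut off
try:
    chunk_left = int(line, 16)  # hex chunk size, e.g. b"1a2" -> 418
except ValueError as err:
    print(err)  # invalid literal for int() with base 16: b''
```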
@AntonOsika
Closing since I haven't heard more about this issue lately
I think the issue might still persist. The following is the exception I am receiving:
"File "$\anaconda3\envs\gpt-eng\Lib\site-packages\urllib3\response.py", line 710, in _error_catcher yield
File "$\anaconda3\envs\gpt-eng\Lib\site-packages\urllib3\response.py", line 1077, in read_chunked self._update_chunk_length()
File "$\anaconda3\envs\gpt-eng\Lib\site-packages\urllib3\response.py", line 1012, in _update_chunk_length raise InvalidChunkLength(self, line) from None
urllib3.exceptions.InvalidChunkLength: InvalidChunkLength(got length b'', 0 bytes read)
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "$\anaconda3\envs\gpt-eng\Lib\site-packages\requests\models.py", line 816, in generate yield from self.raw.stream(chunk_size, decode_content=True)
File "$\anaconda3\envs\gpt-eng\Lib\site-packages\urllib3\response.py", line 937, in stream yield from self.read_chunked(amt, decode_content=decode_content)
File "$\anaconda3\envs\gpt-eng\Lib\site-packages\urllib3\response.py", line 1065, in read_chunked with self._error_catcher():
File "$\anaconda3\envs\gpt-eng\Lib\contextlib.py", line 155, in exit self.gen.throw(typ, value, traceback)
File "$\anaconda3\envs\gpt-eng\Lib\site-packages\urllib3\response.py", line 727, in _error_catcher raise ProtocolError(f"Connection broken: {e!r}", e) from e
urllib3.exceptions.ProtocolError: ("Connection broken: InvalidChunkLength(got length b'', 0 bytes read)", InvalidChunkLength(got length b'', 0 bytes read))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "
File "
File "$\gpt-engineer\gpt_engineer\main.py", line 101, in
File "$\gpt-engineer\gpt_engineer\main.py", line 91, in main messages = step(ai, dbs) ^^^^^^^^^^^^^
File "$\gpt-engineer\gpt_engineer\steps.py", line 199, in gen_clarified_code messages = ai.next(messages, dbs.preprompts["generate"], step_name=curr_fn()) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "$\gpt-engineer\gpt_engineer\ai.py", line 173, in next response = self.llm(messages, callbacks=callsbacks) # type: ignore ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "$\anaconda3\envs\gpt-eng\Lib\site-packages\langchain\chat_models\base.py", line 551, in call generation = self.generate( ^^^^^^^^^^^^^^
File "$\anaconda3\envs\gpt-eng\Lib\site-packages\langchain\chat_models\base.py", line 309, in generate raise e
File "$\anaconda3\envs\gpt-eng\Lib\site-packages\langchain\chat_models\base.py", line 299, in generate self._generate_with_cache(
File "$\anaconda3\envs\gpt-eng\Lib\site-packages\langchain\chat_models\base.py", line 446, in _generate_with_cache return self._generate( ^^^^^^^^^^^^^^^
File "$\anaconda3\envs\gpt-eng\Lib\site-packages\langchain\chat_models\openai.py", line 333, in _generate for chunk in self._stream(
File "$\anaconda3\envs\gpt-eng\Lib\site-packages\langchain\chat_models\openai.py", line 305, in _stream for chunk in self.completion_with_retry(
File "$\anaconda3\envs\gpt-eng\Lib\site-packages\openai\api_resources\abstract\engine_api_resource.py", line 166, in
File "$\anaconda3\envs\gpt-eng\Lib\site-packages\openai\api_requestor.py", line 692, in
File "$\anaconda3\envs\gpt-eng\Lib\site-packages\openai\api_requestor.py", line 115, in parse_stream for line in rbody:
File "$\anaconda3\envs\gpt-eng\Lib\site-packages\requests\models.py", line 865, in iter_lines for chunk in self.iter_content(
File "$\anaconda3\envs\gpt-eng\Lib\site-packages\requests\models.py", line 818, in generate raise ChunkedEncodingError(e)
requests.exceptions.ChunkedEncodingError: ("Connection broken: InvalidChunkLength(got length b'', 0 bytes read)", InvalidChunkLength(got length b'', 0 bytes read)) "
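For what it's worth, one workaround I may try while this is open is disabling streaming on the langchain side, so the reply comes back as a single body instead of over the chunked stream that keeps breaking. This is only a sketch against the `ChatOpenAI` wrapper visible in the traceback (the model name is just an example), not a confirmed fix and not how gpt-engineer is currently wired up.

```python
# Sketch: a non-streamed chat completion through langchain's ChatOpenAI.
# streaming=False avoids the chunked-stream iteration that fails above;
# max_retries lets the underlying client retry transient request errors.
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

llm = ChatOpenAI(model_name="gpt-4", streaming=False, max_retries=3)
reply = llm([HumanMessage(content="Say hello")])
print(reply.content)
```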
@AntonOsika Can you re-open this please?
I hesitated a little in reopening this, since the latest reported error goes through langchain, which is a dependency that didn't exist at the time of the original issue. Going forward, it would be valuable to have a little more information about the prompt and GPT version used, @bhanupbalusu.
Closed with 5 days of triage