open-interpreter
Open Interpreter Crashes When Trying to Generate Code
Describe the bug
Open Interpreter starts up properly and responds to my text prompts. However, whenever it attempts to generate code, it crashes with the following error:
```
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/interpreter/core/core.py", line 86, in chat
    for _ in self._streaming_chat(message=message, display=display):
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/interpreter/core/core.py", line 113, in _streaming_chat
    yield from terminal_interface(self, message)
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/interpreter/terminal_interface/terminal_interface.py", line 135, in terminal_interface
    for chunk in interpreter.chat(message, display=False, stream=True):
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/interpreter/core/core.py", line 148, in _streaming_chat
    yield from self._respond_and_store()
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/interpreter/core/core.py", line 194, in _respond_and_store
    for chunk in respond(self):
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/interpreter/core/respond.py", line 49, in respond
    for chunk in interpreter.llm.run(messages_for_llm):
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/interpreter/core/llm/llm.py", line 191, in run
    yield from run_function_calling_llm(self, params)
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/interpreter/core/llm/run_function_calling_llm.py", line 66, in run_function_calling_llm
    arguments = parse_partial_json(arguments)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/interpreter/core/llm/utils/parse_partial_json.py", line 8, in parse_partial_json
    return json.loads(s)
           ^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/json/__init__.py", line 339, in loads
    raise TypeError(f'the JSON object must be str, bytes or bytearray, '
TypeError: the JSON object must be str, bytes or bytearray, not NoneType
```
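The final frames show `json.loads` being called with `None`: during streaming, a function-calling chunk can apparently arrive before its `arguments` field is populated. A minimal defensive sketch of where a guard would go (this is an illustration, not the actual upstream fix; the real `parse_partial_json` also repairs truncated JSON, which is omitted here):

```python
import json

def parse_partial_json(s):
    # Guard: streamed tool-call deltas may not carry "arguments" yet,
    # so s can be None. Bail out instead of raising the TypeError above.
    if s is None:
        return None
    try:
        return json.loads(s)
    except json.JSONDecodeError:
        # The real implementation would attempt partial-JSON repair here;
        # this sketch simply reports "not parseable yet".
        return None
```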
Reproduce
Prompt Open Interpreter to generate code
Expected behavior
Code generation
Screenshots
No response
Open Interpreter version
0.2.0
Python version
3.11.5
Operating System name and version
macOS 12.2
Additional context
I'm using the GPT-4-32k model from Azure OpenAI. I don't know whether that is related.
Facing the same issue.
Facing the same issue, too.
Same for me... To add a bit more information:
- I am using Azure OpenAI with a GPT-4 model (1106-Preview)
- tried it with different token sizes
- used API versions 2023-09-01 and 2023-12-01-preview, and several others
- the exception occurs mostly on the first request, but sometimes randomly during a response
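For anyone trying to reproduce the Azure setup above: Open Interpreter routes requests through litellm, so an Azure deployment is typically configured with litellm-style environment variables along these lines (names per litellm's Azure provider convention; all values here are placeholders):

```shell
# Assumed litellm-style Azure OpenAI configuration (placeholder values):
export AZURE_API_KEY="<your-key>"
export AZURE_API_BASE="https://<your-resource>.openai.azure.com"
export AZURE_API_VERSION="2023-12-01-preview"

# Then point Open Interpreter at the Azure deployment:
interpreter --model azure/<your-gpt-4-deployment>
```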
This seems related to specific GPT-4 models: the regular GPT-4 works for me, but gpt-4-1106-preview (aka GPT-4 Turbo) does not.
I believe this has been resolved. Can you please try again with the latest version? @mayowaosibodu
Yes, recent Open Interpreter installs no longer exhibit this problem.