open-interpreter
'Unrecognized request argument supplied: functions' when using azure-openai
Describe the bug
I'm able to set up AZURE_API_BASE, AZURE_API_KEY, and AZURE_API_VERSION, but interpreter still has trouble using Azure OpenAI:
Traceback (most recent call last):
File "/Users/zhh210/installs/anaconda3/bin/interpreter", line 8, in <module>
sys.exit(interpreter.start_terminal_interface())
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/zhh210/installs/anaconda3/lib/python3.11/site-packages/interpreter/core/core.py", line 25, in start_terminal_interface
start_terminal_interface(self)
File "/Users/zhh210/installs/anaconda3/lib/python3.11/site-packages/interpreter/terminal_interface/start_terminal_interface.py", line 684, in start_terminal_interface
interpreter.chat()
File "/Users/zhh210/installs/anaconda3/lib/python3.11/site-packages/interpreter/core/core.py", line 86, in chat
for _ in self._streaming_chat(message=message, display=display):
File "/Users/zhh210/installs/anaconda3/lib/python3.11/site-packages/interpreter/core/core.py", line 113, in _streaming_chat
yield from terminal_interface(self, message)
File "/Users/zhh210/installs/anaconda3/lib/python3.11/site-packages/interpreter/terminal_interface/terminal_interface.py", line 135, in terminal_interface
for chunk in interpreter.chat(message, display=False, stream=True):
File "/Users/zhh210/installs/anaconda3/lib/python3.11/site-packages/interpreter/core/core.py", line 148, in _streaming_chat
yield from self._respond_and_store()
File "/Users/zhh210/installs/anaconda3/lib/python3.11/site-packages/interpreter/core/core.py", line 194, in _respond_and_store
for chunk in respond(self):
File "/Users/zhh210/installs/anaconda3/lib/python3.11/site-packages/interpreter/core/respond.py", line 49, in respond
for chunk in interpreter.llm.run(messages_for_llm):
File "/Users/zhh210/installs/anaconda3/lib/python3.11/site-packages/interpreter/core/llm/llm.py", line 191, in run
yield from run_function_calling_llm(self, params)
File "/Users/zhh210/installs/anaconda3/lib/python3.11/site-packages/interpreter/core/llm/run_function_calling_llm.py", line 44, in run_function_calling_llm
for chunk in llm.completions(**request_params):
File "/Users/zhh210/installs/anaconda3/lib/python3.11/site-packages/interpreter/core/llm/llm.py", line 224, in fixed_litellm_completions
raise first_error
File "/Users/zhh210/installs/anaconda3/lib/python3.11/site-packages/interpreter/core/llm/llm.py", line 205, in fixed_litellm_completions
yield from litellm.completion(**params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/zhh210/installs/anaconda3/lib/python3.11/site-packages/litellm/utils.py", line 2171, in wrapper
raise e
File "/Users/zhh210/installs/anaconda3/lib/python3.11/site-packages/litellm/utils.py", line 2078, in wrapper
result = original_function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/zhh210/installs/anaconda3/lib/python3.11/site-packages/litellm/main.py", line 1808, in completion
raise exception_type(
^^^^^^^^^^^^^^^
File "/Users/zhh210/installs/anaconda3/lib/python3.11/site-packages/litellm/utils.py", line 6713, in exception_type
raise e
File "/Users/zhh210/installs/anaconda3/lib/python3.11/site-packages/litellm/utils.py", line 6610, in exception_type
raise BadRequestError(
litellm.exceptions.BadRequestError: AzureException - Error code: 404 - {'error': {'message': 'Unrecognized request argument supplied: functions', 'type': 'invalid_request_error', 'param': None, 'code': None}}
Reproduce
- Set up env variables
- Run
interpreter --model azure/gpt-35-turbo
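As a sketch, the reproduction looks like this (the endpoint, key, and version values below are placeholders, not real credentials):

```shell
# Placeholder values -- substitute your own Azure OpenAI resource details.
export AZURE_API_BASE="https://your-resource.openai.azure.com/"
export AZURE_API_KEY="your-azure-api-key"
export AZURE_API_VERSION="2023-07-01-preview"

# Then launch Open Interpreter against the Azure deployment:
# interpreter --model azure/gpt-35-turbo
```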
Expected behavior
Output the expected code and ask whether to execute it.
Screenshots
No response
Open Interpreter version
0.2.0
Python version
3.11
Operating System name and version
macOS Darwin
Additional context
No response
I have the same problem. It is likely that your deployment's API version doesn't support function calling. I have made a pull request to enable passing an argument --no-llm_supports_functions, which will make Open Interpreter handle function calls correctly. Until this problem is fixed, you can set it manually to make it work for now:
- Open "/Users/zhh210/installs/anaconda3/lib/python3.11/site-packages/interpreter/core/llm/llm.py"
- Find line 30 and change self.supports_functions = None to self.supports_functions = False
- Alternatively, try "llm.supports_functions: False" in your config.yaml
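Conceptually, setting supports_functions to False keeps the functions argument out of the request, which is what the unsupported deployment rejects. A hypothetical sketch of that idea (strip_function_params and the params dict below are illustrative, not Open Interpreter's actual code):

```python
def strip_function_params(params):
    """Return a copy of the request params without function-calling keys,
    for deployments whose API version rejects them with a 404."""
    unsupported = {"functions", "function_call"}
    return {k: v for k, v in params.items() if k not in unsupported}

# Illustrative request params, similar in shape to what litellm.completion receives.
request_params = {
    "model": "azure/gpt-35-turbo",
    "messages": [{"role": "user", "content": "hello"}],
    "functions": [{"name": "execute", "parameters": {"type": "object"}}],
}

cleaned = strip_function_params(request_params)
# "functions" is removed, so the Azure "Unrecognized request argument
# supplied: functions" error no longer applies.
```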