[Bug]: "Agent encountered an error." when running start.sh on macOS
Is there an existing issue for the same bug?
- [X] I have checked the troubleshooting document at https://opendevin.github.io/OpenDevin/modules/usage/troubleshooting
- [X] I have checked the existing issues.
Describe the bug
I'm getting an "Agent encountered an error." message on my first try. I created the start.sh file and started the Docker engine, and I have also entered the correct configuration settings. (screenshot attached)
Current Version
ghcr.io/opendevin/opendevin:0.5
Installation and Configuration
OpenAI API key,
chmod +x start.sh
./start.sh
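For reference, start.sh here is just a wrapper around the documented docker run command; roughly something like the sketch below (the flags and variable names are my best recollection of the 0.5-era instructions, not an exact copy of my script):

```bash
#!/bin/bash
# Rough sketch of the launcher: pass the OpenAI key through and expose the UI.
# Flag and variable names are assumptions based on the docs, not verbatim.
export LLM_API_KEY="sk-..."                  # OpenAI API key
export WORKSPACE_BASE="$(pwd)/workspace"     # host folder the agent works in

docker run --rm -it \
    -e LLM_API_KEY \
    -e WORKSPACE_MOUNT_PATH="$WORKSPACE_BASE" \
    -v "$WORKSPACE_BASE":/opt/workspace_base \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    ghcr.io/opendevin/opendevin:0.5
```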
Model and Agent
Model: gpt-3.5-turbo, Agent: CodeActAgent
Reproduction Steps
No response
Logs, Errors, Screenshots, and Additional Context
Can you provide the whole error log?
`docker logs <container_id>`
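If it's not obvious which ID to use, something along these lines should capture the full log (the container ID is a placeholder):

```bash
# List running containers started from the OpenDevin image and note the ID
docker ps --filter "ancestor=ghcr.io/opendevin/opendevin:0.5"

# Dump the complete log (stdout and stderr) to a file that can be attached here
docker logs <container_id> > opendevin.log 2>&1
```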
I have the same problem, here is my example:
19:33:29 - PLAN
build a simple hello world
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.
Provider List: https://docs.litellm.ai/docs/providers
19:33:29 - opendevin:ERROR: agent_controller.py:147 - Error in loop
Traceback (most recent call last):
File "/app/.venv/lib/python3.12/site-packages/litellm/llms/azure.py", line 175, in completion
if "gateway.ai.cloudflare.com" in api_base:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: argument of type 'NoneType' is not iterable
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 882, in completion
response = azure_chat_completions.completion(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/litellm/llms/azure.py", line 316, in completion
raise AzureOpenAIError(status_code=500, message=str(e))
litellm.llms.azure.AzureOpenAIError: argument of type 'NoneType' is not iterable
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/app/opendevin/controller/agent_controller.py", line 142, in _run
finished = await self.step(i)
^^^^^^^^^^^^^^^^^^
File "/app/opendevin/controller/agent_controller.py", line 256, in step
action = self.agent.step(self.state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/agenthub/codeact_agent/codeact_agent.py", line 223, in step
response = self.llm.completion(
^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 330, in wrapped_f
return self(f, *args, **kw)
^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 467, in __call__
do = self.iter(retry_state=retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 368, in iter
result = action(retry_state)
^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 390, in <lambda>
self._add_action_func(lambda rs: rs.outcome.result())
^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 449, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 470, in __call__
result = fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/app/opendevin/llm/llm.py", line 188, in wrapper
resp = completion_unwrapped(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 3222, in wrapper
raise e
File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 3116, in wrapper
result = original_function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 2226, in completion
raise exception_type(
^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 9233, in exception_type
raise e
File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 9170, in exception_type
raise APIError(
litellm.exceptions.APIError: AzureException - argument of type 'NoneType' is not iterable
Model: gpt-35-turbo
Messages: [{'role': 'system', 'content': 'A chat between a curious user and an artificial intelligence assista
19:33:29 - opendevin:INFO: agent_controller.py:190 - Setting agent(CodeActAgent) state from AgentState.RUNNING to AgentState.ERROR
19:33:30 - opendevin:INFO: browser_env.py:51 - Browser env started.
(.venv) <myuser>@MACHINENAME GitHub %
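In case it helps narrow this down: the traceback goes through the Azure code path with api_base set to None, so it may be worth checking which LLM-related settings actually reach the container. A quick way to inspect them (the variable-name pattern is a guess):

```bash
# Show LLM/Azure/OpenAI-related environment variables inside the running container
docker exec <container_id> env | grep -i -E 'llm|azure|openai'
```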
> I have the same problem, here is my example:
> File "/app/.venv/lib/python3.12/site-packages/litellm/llms/azure.py", line 175, in completion
>     if "gateway.ai.cloudflare.com" in api_base:
This doesn't seem to be the current litellm version. It should correspond to this: https://github.com/BerriAI/litellm/blob/3ef3a7ba5fd5c036601712e73e506686ac390687/litellm/llms/azure.py#L313
If you are using Cloudflare, this might also be of interest: https://developers.cloudflare.com/workers-ai/configuration/open-ai-compatibility/
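To confirm which litellm release is actually installed in the image, a check along these lines should work (the /app/.venv path is taken from the traceback above; adjust if pip lives elsewhere):

```bash
# Print the litellm version bundled in the OpenDevin container
docker exec <container_id> /app/.venv/bin/pip show litellm
```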
@arasnuri can you try the latest main tag and see if this resolves your issue?
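For reference, switching to the development image is roughly the following (make sure start.sh actually references the :main tag before relaunching; the exact step depends on how your script pins the image):

```bash
# Pull the latest development build and relaunch
docker pull ghcr.io/opendevin/opendevin:main
./start.sh
```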
Going to close this. Please try out the new 0.7 tag or main, and open a new issue if you hit an error again.