fix: Missing provider in litellm Router initialization
Aims to solve the following issue, mentioned in #419 and #588:
```
Traceback (most recent call last):
  File "/Users/yoni/.local/share/virtualenvs/OpenDevin-CrrnA4bA/lib/python3.11/site-packages/uvicorn/protocols/websockets/websockets_impl.py", line 240, in run_asgi
    result = await self.app(self.scope, self.asgi_receive, self.asgi_send)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/yoni/.local/share/virtualenvs/OpenDevin-CrrnA4bA/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 69, in __call__
    return await self.app(scope, receive, send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/yoni/.local/share/virtualenvs/OpenDevin-CrrnA4bA/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/Users/yoni/.local/share/virtualenvs/OpenDevin-CrrnA4bA/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/Users/yoni/.local/share/virtualenvs/OpenDevin-CrrnA4bA/lib/python3.11/site-packages/starlette/middleware/errors.py", line 151, in __call__
    await self.app(scope, receive, send)
  File "/Users/yoni/.local/share/virtualenvs/OpenDevin-CrrnA4bA/lib/python3.11/site-packages/starlette/middleware/cors.py", line 77, in __call__
    await self.app(scope, receive, send)
  File "/Users/yoni/.local/share/virtualenvs/OpenDevin-CrrnA4bA/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/Users/yoni/.local/share/virtualenvs/OpenDevin-CrrnA4bA/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/Users/yoni/.local/share/virtualenvs/OpenDevin-CrrnA4bA/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/Users/yoni/.local/share/virtualenvs/OpenDevin-CrrnA4bA/lib/python3.11/site-packages/starlette/routing.py", line 756, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/Users/yoni/.local/share/virtualenvs/OpenDevin-CrrnA4bA/lib/python3.11/site-packages/starlette/routing.py", line 776, in app
    await route.handle(scope, receive, send)
  File "/Users/yoni/.local/share/virtualenvs/OpenDevin-CrrnA4bA/lib/python3.11/site-packages/starlette/routing.py", line 373, in handle
    await self.app(scope, receive, send)
  File "/Users/yoni/.local/share/virtualenvs/OpenDevin-CrrnA4bA/lib/python3.11/site-packages/starlette/routing.py", line 96, in app
    await wrap_app_handling_exceptions(app, session)(scope, receive, send)
  File "/Users/yoni/.local/share/virtualenvs/OpenDevin-CrrnA4bA/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/Users/yoni/.local/share/virtualenvs/OpenDevin-CrrnA4bA/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/Users/yoni/.local/share/virtualenvs/OpenDevin-CrrnA4bA/lib/python3.11/site-packages/starlette/routing.py", line 94, in app
    await func(session)
  File "/Users/yoni/.local/share/virtualenvs/OpenDevin-CrrnA4bA/lib/python3.11/site-packages/fastapi/routing.py", line 348, in app
    await dependant.call(**values)
  File "opendevin/server/listen.py", line 25, in websocket_endpoint
    await session.start_listening()
  File "/Users/yoni/OpenDevin/opendevin/server/session.py", line 88, in start_listening
    await self.create_controller(data)
  File "/Users/yoni/OpenDevin/opendevin/server/session.py", line 134, in create_controller
    llm = LLM(model=model, api_key=api_key, base_url=api_base)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/yoni/OpenDevin/opendevin/llm/llm.py", line 42, in __init__
    self._router = Router(
                   ^^^^^^^
  File "/Users/yoni/.local/share/virtualenvs/OpenDevin-CrrnA4bA/lib/python3.11/site-packages/litellm/router.py", line 198, in __init__
    self.set_model_list(model_list)
  File "/Users/yoni/.local/share/virtualenvs/OpenDevin-CrrnA4bA/lib/python3.11/site-packages/litellm/router.py", line 2075, in set_model_list
    ) = litellm.get_llm_provider(
        ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/yoni/.local/share/virtualenvs/OpenDevin-CrrnA4bA/lib/python3.11/site-packages/litellm/utils.py", line 5733, in get_llm_provider
    raise e
  File "/Users/yoni/.local/share/virtualenvs/OpenDevin-CrrnA4bA/lib/python3.11/site-packages/litellm/utils.py", line 5720, in get_llm_provider
    raise litellm.exceptions.BadRequestError( # type: ignore
litellm.exceptions.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=llama2
Pass model as E.g. For 'Huggingface' inference endpoints pass in `completion(model='huggingface/starcoder',..)` Learn more: https://docs.litellm.ai/docs/providers
```
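A minimal sketch of the kind of change involved, assuming an Ollama-served model (the `ollama/` prefix and the local `api_base` below are illustrative, not the exact values from this PR): litellm's `Router` needs a provider-qualified model name in `litellm_params`, otherwise `get_llm_provider` raises the `BadRequestError` above.

```python
from litellm import Router

# Hypothetical values for illustration; OpenDevin takes these from its config.
model = "llama2"
api_base = "http://localhost:11434"

router = Router(
    model_list=[
        {
            "model_name": model,  # alias used when calling router.completion()
            "litellm_params": {
                # The "ollama/" prefix is what get_llm_provider() uses to
                # resolve the provider; a bare "llama2" raises BadRequestError.
                "model": f"ollama/{model}",
                "api_base": api_base,
            },
        }
    ]
)
```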
This seems to solve the Router initialization problem. However, when we reach the following code in `codeact_agent.py`:
```python
# Ask the LLM for the next action; generation stops at the agent's
# closing </execute> tag.
response = self.llm.completion(
    messages=self.messages,
    stop=["</execute>"],
    temperature=0.0,  # deterministic output
    seed=42,
)
# Parse the action out of the response and append it to the history.
action_str: str = parse_response(response)
self.messages.append({"role": "assistant", "content": action_str})
```
The value of `messages` at this point is:

```python
[{'role': 'system', 'content': 'You are a helpful assistant. You will be provided access (as root) to a bash shell to...close the shell and end the conversation.\n'},
 {'role': 'user', 'content': 'create a simple html page'}]
```
However, the model seems to answer only the "system" role message, producing the following `action_str`:

```
Great! I'm excited to help you with your tasks. Please provide me with the first task you would like me to assist you with, and I will do my best to help you.
```
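One way to check whether the whole message list actually reaches the provider (a debugging sketch, not part of this PR) is to enable litellm's verbose logging before the completion call:

```python
import litellm

# Dumps the raw request litellm sends to the provider, so we can confirm
# whether the "user" message survives alongside the "system" message.
litellm.set_verbose = True
```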
---

@johnnyaug Thank you for this. I apologize; somehow I missed your PR. You might have been on to something here. Although, doesn't litellm read the provider from env?
At the moment, we've put the use of Router on the back burner. We've encountered other issues, and we've refactored it out of the code for now, at least until we figure out how to use it reliably! If you want to play with making it work, please feel free to.
On another note, methinks monologue agent is more fun. 😄
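On the env question above: as far as I know, litellm reads API keys and base URLs from environment variables, but the provider itself is resolved from the model-name prefix or an explicit `custom_llm_provider` argument. A minimal sketch, with Ollama standing in as the provider:

```python
import os
import litellm

# Credentials and base URLs can come from the environment...
os.environ["OLLAMA_API_BASE"] = "http://localhost:11434"

# ...but the provider is inferred from the model prefix:
response = litellm.completion(
    model="ollama/llama2",
    messages=[{"role": "user", "content": "say hi"}],
)

# or passed explicitly:
response = litellm.completion(
    model="llama2",
    custom_llm_provider="ollama",
    messages=[{"role": "user", "content": "say hi"}],
)
```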