
ERROR: Error condensing thoughts: No healthy deployment available, passed model=ollama/llama2

Open Ratul007 opened this issue 1 year ago • 2 comments

No model works. After configuring everything I can't run or use it. Please help, I keep hitting this error:

```
ERROR: Error condensing thoughts: No healthy deployment available, passed model=ollama/llama2
Traceback (most recent call last):
  File "/home/eagle/.cache/pypoetry/virtualenvs/opendevin-nolN6OfP-py3.11/lib/python3.11/site-packages/litellm/router.py", line 1436, in function_with_retries
    response = original_function(*args, **kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/eagle/.cache/pypoetry/virtualenvs/opendevin-nolN6OfP-py3.11/lib/python3.11/site-packages/litellm/router.py", line 386, in _completion
    raise e
  File "/home/eagle/.cache/pypoetry/virtualenvs/opendevin-nolN6OfP-py3.11/lib/python3.11/site-packages/litellm/router.py", line 334, in _completion
    deployment = self.get_available_deployment(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/eagle/.cache/pypoetry/virtualenvs/opendevin-nolN6OfP-py3.11/lib/python3.11/site-packages/litellm/router.py", line 2313, in get_available_deployment
    raise ValueError(f"No healthy deployment available, passed model={model}")
ValueError: No healthy deployment available, passed model=ollama/llama2

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/eagle/.vscode/OpenDevin/agenthub/monologue_agent/utils/monologue.py", line 31, in condense
    resp = llm.completion(messages=messages)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/eagle/.cache/pypoetry/virtualenvs/opendevin-nolN6OfP-py3.11/lib/python3.11/site-packages/litellm/router.py", line 328, in completion
    raise e
  File "/home/eagle/.cache/pypoetry/virtualenvs/opendevin-nolN6OfP-py3.11/lib/python3.11/site-packages/litellm/router.py", line 325, in completion
    response = self.function_with_fallbacks(**kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/eagle/.cache/pypoetry/virtualenvs/opendevin-nolN6OfP-py3.11/lib/python3.11/site-packages/litellm/router.py", line 1419, in function_with_fallbacks
    raise original_exception
  File "/home/eagle/.cache/pypoetry/virtualenvs/opendevin-nolN6OfP-py3.11/lib/python3.11/site-packages/litellm/router.py", line 1344, in function_with_fallbacks
    response = self.function_with_retries(*args, **kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/eagle/.cache/pypoetry/virtualenvs/opendevin-nolN6OfP-py3.11/lib/python3.11/site-packages/litellm/router.py", line 1496, in function_with_retries
    raise e
  File "/home/eagle/.cache/pypoetry/virtualenvs/opendevin-nolN6OfP-py3.11/lib/python3.11/site-packages/litellm/router.py", line 1462, in function_with_retries
    response = original_function(*args, **kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/eagle/.cache/pypoetry/virtualenvs/opendevin-nolN6OfP-py3.11/lib/python3.11/site-packages/litellm/router.py", line 386, in _completion
    raise e
  File "/home/eagle/.cache/pypoetry/virtualenvs/opendevin-nolN6OfP-py3.11/lib/python3.11/site-packages/litellm/router.py", line 334, in _completion
    deployment = self.get_available_deployment(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/eagle/.cache/pypoetry/virtualenvs/opendevin-nolN6OfP-py3.11/lib/python3.11/site-packages/litellm/router.py", line 2313, in get_available_deployment
    raise ValueError(f"No healthy deployment available, passed model={model}")
ValueError: No healthy deployment available, passed model=ollama/llama2

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/eagle/.vscode/OpenDevin/opendevin/controller/agent_controller.py", line 113, in step
    action = self.agent.step(self.state)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/eagle/.vscode/OpenDevin/agenthub/monologue_agent/agent.py", line 153, in step
    self._add_event(prev_action.to_dict())
  File "/home/eagle/.vscode/OpenDevin/agenthub/monologue_agent/agent.py", line 96, in _add_event
    self.monologue.condense(self.llm)
  File "/home/eagle/.vscode/OpenDevin/agenthub/monologue_agent/utils/monologue.py", line 36, in condense
    raise RuntimeError(f"Error condensing thoughts: {e}")
RuntimeError: Error condensing thoughts: No healthy deployment available, passed model=ollama/llama2
```

OBSERVATION: Error condensing thoughts: No healthy deployment available, passed model=ollama/llama2. Exited before finishing.
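For context, the exception chain bottoms out in litellm's router: the requested model name is looked up against the router's configured deployment list, and when nothing matches (or nothing is healthy) a `ValueError` is raised. The sketch below is illustrative only, not litellm's actual code; the `model_list` shape and `healthy` flag are simplified assumptions to show why a model name mismatch produces exactly this error.

```python
# Illustrative sketch (NOT litellm's real implementation) of the lookup that
# produces "No healthy deployment available, passed model=...".
def get_available_deployment(model_list, model):
    """Return the first healthy deployment whose model name matches."""
    candidates = [
        d for d in model_list
        if d["model_name"] == model and d.get("healthy", True)
    ]
    if not candidates:
        # This mirrors the message seen in the traceback above.
        raise ValueError(f"No healthy deployment available, passed model={model}")
    return candidates[0]

# A registry configured for one model cannot serve a different name:
deployments = [{"model_name": "ollama/gemma", "healthy": True}]
```

So if the router was configured (or defaulted) to a different model than the one the request names, this error fires before any network call to Ollama is even attempted.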

Ratul007 · Apr 05 '24 18:04

I have also experienced this issue. In the front end it also looks like the configuration is not being used, because Devin says "Oops. Something went wrong: OpenAIException - 404 page not found". My config is:

```
LLM_MODEL="ollama/gemma:latest"
LLM_API_KEY="ollama"
LLM_EMBEDDING_MODEL="local"
LLM_BASE_URL="http://localhost:11434"
WORKSPACE_DIR="./workspace"
```

But I'm not configured to use OpenAI, and it doesn't matter which LLM_MODEL I change it to.
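One quick sanity check before changing LLM_MODEL again is to confirm that an Ollama server is actually listening at LLM_BASE_URL; a refused connection or a 404 there would explain both symptoms. This is a hypothetical helper, not part of OpenDevin, using only the Python standard library. It queries `/api/tags`, Ollama's model-listing endpoint:

```python
# Hypothetical connectivity check for the Ollama server configured in
# LLM_BASE_URL. Returns True only if GET <base_url>/api/tags answers 200.
import urllib.request
import urllib.error

def ollama_reachable(base_url: str, timeout: float = 2.0) -> bool:
    try:
        url = base_url.rstrip("/") + "/api/tags"
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused / timeout / DNS failure: server not reachable.
        return False

# e.g. ollama_reachable("http://localhost:11434") should be True
# when `ollama serve` is running with the default port.
```

If this returns False, the problem is the Ollama side (not running, wrong port, or bound to a different interface), not the model name.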

askpatrickw · Apr 05 '24 22:04

See https://github.com/OpenDevin/OpenDevin/issues/793

lowlyocean · Apr 05 '24 23:04

This should be fixed now. If not, let us know!

rbren · Apr 07 '24 20:04