
Failed to connect to Ollama

wldgntlmn opened this issue 6 months ago · 0 comments

After changing Ollama's default port from 11434 to another value, for example by setting OLLAMA_HOST=0.0.0.0:11411 and replacing every occurrence of 11434 with 11411 in the relevant files, submitting a task still fails with: ConnectionError: Failed to connect to Ollama. Please check that Ollama is downloaded, running and accessible. https://ollama.com/download
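
For reference, direct connectivity to the new port can be checked with the Ollama Python client before reproducing the error. This is only a minimal sketch, assuming the server is listening on 127.0.0.1:11411:

```python
# Minimal connectivity check against the non-default Ollama port.
# Assumes Ollama was started with OLLAMA_HOST=0.0.0.0:11411.
from ollama import Client

client = Client(host="http://127.0.0.1:11411")  # point the client at the custom port
print(client.list())  # lists locally installed models if the server is reachable
```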

INFO     [agent] 🧠 Starting an agent with main_model=qwen2.5:7b +vision +memory, planner_model=qwen2.5:7b, extraction_model=None
ERROR    [src.webui.components.browser_use_agent_tab] Error setting up agent task: Failed to connect to Ollama. Please check that Ollama is downloaded, running and accessible. https://ollama.com/download
Traceback (most recent call last):
  File "F:\browser-use\web-ui\src\webui\components\browser_use_agent_tab.py", line 529, in run_agent_task
    webui_manager.bu_agent = BrowserUseAgent(
                             ^^^^^^^^^^^^^^^^
  File "F:\browser-use\web-ui\.venv\Lib\site-packages\browser_use\utils.py", line 305, in wrapper
    result = func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^
  File "F:\browser-use\web-ui\.venv\Lib\site-packages\browser_use\agent\service.py", line 269, in __init__
    self.memory = Memory(
                  ^^^^^^^
  File "F:\browser-use\web-ui\.venv\Lib\site-packages\browser_use\agent\memory\service.py", line 82, in __init__
    self.mem0 = Mem0Memory.from_config(config_dict=self.config.full_config_dict)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\browser-use\web-ui\.venv\Lib\site-packages\mem0\memory\main.py", line 87, in from_config
    return cls(config)
           ^^^^^^^^^^^
  File "F:\browser-use\web-ui\.venv\Lib\site-packages\mem0\memory\main.py", line 46, in __init__
    self.embedding_model = EmbedderFactory.create(
                           ^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\browser-use\web-ui\.venv\Lib\site-packages\mem0\utils\factory.py", line 66, in create
    return embedder_instance(base_config)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\browser-use\web-ui\.venv\Lib\site-packages\mem0\embeddings\ollama.py", line 32, in __init__
    self._ensure_model_exists()
  File "F:\browser-use\web-ui\.venv\Lib\site-packages\mem0\embeddings\ollama.py", line 38, in _ensure_model_exists
    local_models = self.client.list()["models"]
                   ^^^^^^^^^^^^^^^^^^
  File "F:\browser-use\web-ui\.venv\Lib\site-packages\ollama\_client.py", line 577, in list
    return self._request(
           ^^^^^^^^^^^^^^
  File "F:\browser-use\web-ui\.venv\Lib\site-packages\ollama\_client.py", line 180, in _request
    return cls(**self._request_raw(*args, **kwargs).json())
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\browser-use\web-ui\.venv\Lib\site-packages\ollama\_client.py", line 126, in _request_raw
    raise ConnectionError(CONNECTION_ERROR_MESSAGE) from None
ConnectionError: Failed to connect to Ollama. Please check that Ollama is downloaded, running and accessible. https://ollama.com/download
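
The traceback shows the failure coming from mem0's Ollama embedder (mem0/embeddings/ollama.py), which creates its own ollama client; if no base URL reaches that config, the client falls back to the library default (the OLLAMA_HOST environment variable, otherwise localhost:11434), regardless of what was edited elsewhere. Below is a hedged sketch of two ways to point that client at the new port; the ollama_base_url key and the embedding model name are assumptions about the installed mem0 version, not confirmed against this repo:

```python
import os

# Option 1 (assumption: no explicit host is passed to the embedder's client):
# the ollama Python library reads OLLAMA_HOST when no host argument is given,
# so setting it in the environment of the webui process may be enough.
os.environ["OLLAMA_HOST"] = "http://127.0.0.1:11411"

# Option 2 (assumption: mem0's embedder config accepts ollama_base_url):
# pass the base URL explicitly in the config dict handed to Mem0Memory.from_config.
config_dict = {
    "embedder": {
        "provider": "ollama",
        "config": {
            "model": "nomic-embed-text",                  # hypothetical embedding model
            "ollama_base_url": "http://127.0.0.1:11411",  # custom port instead of 11434
        },
    },
}
# Mem0Memory.from_config(config_dict=config_dict) would then build its client
# against the custom port rather than the default 11434.
```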

wldgntlmn · Jul 01 '25 05:07