
[BUG] CrewAI Agent Fails to Pass endpoint to LiteLLM for Ollama Provider

Open skodavalla opened this issue 10 months ago • 3 comments

Description

Bug: When using CrewAI’s Agent with an Ollama model (e.g., llama3.2:latest) and a custom endpoint (e.g., http://192.168.10.8:11434/api/generate), LiteLLM defaults to http://localhost:11434, ignoring the specified endpoint and litellm.ollama_api_base. This causes Connection refused errors if Ollama isn’t local.

Reproduction Steps:

  1. Configure an Agent:

     from crewai import Agent
     from crewai_tools import FileReadTool

     summarizer = Agent(
         role="Call Summarizer",
         goal="Summarize transcripts",
         llm="ollama/llama3.2:latest",
         endpoint="http://192.168.10.8:11434/api/generate",  # custom endpoint; ignored in practice (see logs below)
         tools=[FileReadTool()]
     )

Steps to Reproduce

  2. Use in a Crew (a fuller runnable sketch follows these steps):

         crew = Crew(agents=[summarizer], tasks=[summary_task])
         result = crew.kickoff(inputs={"file_path": "transcript.txt"})

  3. Enable LiteLLM debug logging:

         logging.getLogger("LiteLLM").setLevel(logging.DEBUG)

  4. Run with Ollama listening at 192.168.10.8:11434, not localhost:11434.
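For completeness, here is a fuller, self-contained sketch of the reproduction. The summary_task definition and backstory are hypothetical, since the original report does not show them:

    from crewai import Agent, Crew, Task
    from crewai_tools import FileReadTool

    summarizer = Agent(
        role="Call Summarizer",
        goal="Summarize transcripts",
        backstory="You summarize call transcripts.",  # added for completeness; not in the original snippet
        llm="ollama/llama3.2:latest",
        endpoint="http://192.168.10.8:11434/api/generate",  # the parameter this report says is ignored
        tools=[FileReadTool()],
    )

    # Hypothetical task; the issue omits its definition.
    summary_task = Task(
        description="Read the transcript at {file_path} and summarize it.",
        expected_output="A concise summary of the call transcript.",
        agent=summarizer,
    )

    crew = Crew(agents=[summarizer], tasks=[summary_task])
    result = crew.kickoff(inputs={"file_path": "transcript.txt"})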

Expected behavior

Expected Behavior: LiteLLM calls http://192.168.10.8:11434/api/generate.

Actual Behavior: LiteLLM calls http://localhost:11434/api/generate and fails with [Errno 111] Connection refused.

Screenshots/Code snippets

Logs:

    2025-02-24 18:22:29,684 - LiteLLM - DEBUG - POST Request Sent from LiteLLM: curl -X POST http://localhost:11434/api/generate ...
    2025-02-24 18:22:29,738 - root - ERROR - LiteLLM call failed: litellm.APIConnectionError: OllamaException - [Errno 111] Connection refused

Workaround: Use a direct LiteLLM call:

    response = litellm.completion(
        model="ollama/llama3.2:latest",
        api_base="http://192.168.10.8:11434",
        messages=[...]
    )

Running in a Docker container on Ubuntu 24.x.

Operating System

Ubuntu 24.04

Python Version

3.11

crewAI Version

0.102.0

crewAI Tools Version

0.36.0

Virtual Environment

Venv

Evidence

pip show crewai

Name: crewai
Version: 0.102.0
Summary: Cutting-edge framework for orchestrating role-playing, autonomous AI agents. By fostering collaborative intelligence, CrewAI empowers agents to work together seamlessly, tackling complex tasks.
Home-page:
Author:
Author-email: Joao Moura [email protected]
License:
Location: /usr/local/lib/python3.11/site-packages
Requires: appdirs, auth0-python, blinker, chromadb, click, instructor, json-repair, json5, jsonref, litellm, openai, openpyxl, opentelemetry-api, opentelemetry-exporter-otlp-proto-http, opentelemetry-sdk, pdfplumber, pydantic, python-dotenv, pyvis, regex, tomli, tomli-w, uv
Required-by: crewai-tools

pip show litellm

Name: litellm
Version: 1.60.2
Summary: Library to easily interface with LLM API providers
Home-page:
Author: BerriAI
Author-email:
License: MIT
Location: /usr/local/lib/python3.11/site-packages
Requires: aiohttp, click, httpx, importlib-metadata, jinja2, jsonschema, openai, pydantic, python-dotenv, tiktoken, tokenizers
Required-by: crewai

pip show crewai-tools

Name: crewai-tools
Version: 0.36.0
Summary: Set of tools for the crewAI framework
Home-page:
Author:
Author-email: João Moura [email protected]
License:
Location: /usr/local/lib/python3.11/site-packages
Requires: chromadb, click, crewai, docker, embedchain, lancedb, openai, pydantic, pyright, pytube, requests
Required-by:

Setting litellm.ollama_api_base globally didn't help with CrewAI; only direct LiteLLM calls worked. Suggest investigating the Agent's LLM initialization or LiteLLM's endpoint handling.
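Presumably the global override attempt looked something like this (the exact code is not shown in this report):

    import litellm

    # Global override described above; per this report, CrewAI agent calls
    # still went to http://localhost:11434 regardless of this setting.
    litellm.ollama_api_base = "http://192.168.10.8:11434"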

Full log:

    2025-02-24 18:19:19,303 - agents.summarizer - INFO - Initialized summarizer with endpoint: http://192.168.10.8:11434/api/generate
    2025-02-24 18:19:42,327 - utils.transcription - INFO - Starting transcription of audio file /app/audio_input/abc30.mp3 to call_transcript_20250224_181942.txt
    2025-02-24 18:22:29,661 - utils.transcription - INFO - Transcription completed for /app/audio_input/abc30.mp3
    2025-02-24 18:22:29,684 - LiteLLM - DEBUG - POST Request Sent from LiteLLM: curl -X POST http://localhost:11434/api/generate -d '{...}'
    2025-02-24 18:22:29,738 - root - ERROR - LiteLLM call failed: litellm.APIConnectionError: OllamaException - [Errno 111] Connection refused

Possible Solution

Use a direct LiteLLM call:

response = litellm.completion(
    model="ollama/llama3.2:latest",
    api_base="http://192.168.10.8:11434",
    messages=[...]
)
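A fleshed-out sketch of that workaround, with an assumed message payload (the original elides the messages):

import litellm

response = litellm.completion(
    model="ollama/llama3.2:latest",
    api_base="http://192.168.10.8:11434",
    # Assumed payload for illustration; the issue shows only messages=[...]
    messages=[{"role": "user", "content": "Summarize this transcript: ..."}],
)
print(response.choices[0].message.content)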

Additional context

Docker container on Ubuntu 24.10 Server.

skodavalla avatar Feb 25 '25 10:02 skodavalla

Try setting OPENAI_API_BASE as an environment variable

and/or

LLM(base_url='http://192.168.10.8:11434/api/')
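As a sketch, the environment-variable route could be set from Python; it must happen before any CrewAI objects are constructed so LiteLLM can pick it up:

import os

# Set before Agent/LLM construction so the override takes effect.
os.environ["OPENAI_API_BASE"] = "http://192.168.10.8:11434/api/"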

lorenzejay avatar Mar 03 '25 21:03 lorenzejay

Same error using base_url='http://192.168.10.8:11434/api':

    2025-03-26 19:18:58,428 - LiteLLM - INFO - LiteLLM completion() model= llama3.2:latest; provider = ollama
    2025-03-26 19:18:58,493 - root - ERROR - LiteLLM call failed: litellm.APIConnectionError: OllamaException - [Errno 111] Connection refused
    2025-03-26 19:18:58,494 - root - ERROR - Processing failed for /app/audio_input/abc145duz2.mp3: litellm.APIConnectionError: OllamaException - [Errno 111] Connection refused
    Traceback (most recent call last):
      File "/usr/local/lib/python3.11/site-packages/httpx/_transports/default.py", line 72, in map_httpcore_exceptions
        yield
      File "/usr/local/lib/python3.11/site-packages/httpx/_transports/default.py", line 236, in handle_request
        resp = self._pool.handle_request(req)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/local/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py", line 256, in handle_request
        raise exc from None
      File "/usr/local/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py", line 236, in handle_request
        response = connection.handle_request(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/local/lib/python3.11/site-packages/httpcore/_sync/connection.py", line 101, in handle_request
        raise exc
      File "/usr/local/lib/python3.11/site-packages/httpcore/_sync/connection.py", line 78, in handle_request
        stream = self._connect(request)
                 ^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/local/lib/python3.11/site-packages/httpcore/_sync/connection.py", line 124, in _connect
        stream = self._network_backend.connect_tcp(**kwargs)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/local/lib/python3.11/site-packages/httpcore/_backends/sync.py", line 207, in connect_tcp
        with map_exceptions(exc_map):
      File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
        self.gen.throw(typ, value, traceback)
      File "/usr/local/lib/python3.11/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
        raise to_exc(exc) from exc
    httpcore.ConnectError: [Errno 111] Connection refused

skodavalla avatar Mar 26 '25 19:03 skodavalla

@skodavalla weird.. it should work.

from crewai import Agent, LLM
from crewai_tools import FileReadTool

llm = LLM(
    model="ollama/llama3.2:latest",
    base_url="http://192.168.10.8:11434",
)

summarizer = Agent(
    role="Call Summarizer",
    goal="Summarize transcripts",
    llm=llm,
    tools=[FileReadTool()]
)

If you're still having trouble, make sure you have not set BASE_URL as an environment variable. If you can, inspect your environment variables via os.environ; sometimes your shell is out of date or has loaded unexpected values.
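A quick sketch of that env-var check:

import os

# Print any variables that could be overriding the model endpoint.
for key in sorted(os.environ):
    if any(token in key for token in ("BASE", "OLLAMA", "OPENAI", "API")):
        print(f"{key}={os.environ[key]}")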

lucasgomide avatar Apr 11 '25 20:04 lucasgomide

This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.

github-actions[bot] avatar May 12 '25 12:05 github-actions[bot]

This issue was closed because it has been stalled for 5 days with no activity.

github-actions[bot] avatar May 18 '25 12:05 github-actions[bot]