
Ollama with Qwen3 14B model

Open · jcl2023 opened this issue 7 months ago · 3 comments

When I run the following command with Tavily search:

uv run main.py "What factors are influencing AI adoption in healthcare?"

I got the following errors. What is the root cause?

2025-05-12 17:05:40,017 - src.graph.nodes - INFO - Planner generating full plan
2025-05-12 17:06:30,774 - httpx - INFO - HTTP Request: POST http://localhost:11434/v1/chat/completions "HTTP/1.1 200 OK"
Traceback (most recent call last):
  File "/home/terra/Downloads/deer-flow-1/.venv/lib/python3.12/site-packages/langchain_core/output_parsers/pydantic.py", line 28, in _parse_obj
    return self.pydantic_object.model_validate(obj)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/terra/Downloads/deer-flow-1/.venv/lib/python3.12/site-packages/pydantic/main.py", line 627, in model_validate
    return cls.__pydantic_validator__.validate_python(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
pydantic_core._pydantic_core.ValidationError: 4 validation errors for Plan
locale
  Field required [type=missing, input_value={'output': {'title': 'Bar...nd system efficiency.'}}, input_type=dict]
    For further information visit https://errors.pydantic.dev/2.10/v/missing
has_enough_context
  Field required [type=missing, input_value={'output': {'title': 'Bar...nd system efficiency.'}}, input_type=dict]
    For further information visit https://errors.pydantic.dev/2.10/v/missing
thought
  Field required [type=missing, input_value={'output': {'title': 'Bar...nd system efficiency.'}}, input_type=dict]
    For further information visit https://errors.pydantic.dev/2.10/v/missing
title
  Field required [type=missing, input_value={'output': {'title': 'Bar...nd system efficiency.'}}, input_type=dict]
    For further information visit https://errors.pydantic.dev/2.10/v/missing
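For anyone hitting the same error: the HTTP call itself succeeded; the failure happens when the planner's output parser validates the model's reply against the Plan schema. The input_value in the traceback shows the reply nested under an "output" key, so the four required top-level fields look missing to Pydantic. A minimal sketch reproducing the failure, assuming a simplified Plan with only the four fields named in the traceback (the real model in deer-flow may have more):

from pydantic import BaseModel

# Simplified stand-in for deer-flow's Plan model; the field names are
# the four reported missing in the traceback above.
class Plan(BaseModel):
    locale: str
    has_enough_context: bool
    thought: str
    title: str

# The LLM wrapped its JSON under an "output" key, so validation fails
# with "4 validation errors for Plan ... Field required [type=missing]":
Plan.model_validate({"output": {"title": "...", "thought": "..."}})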

jcl2023 (May 13 '25)

Hi, could you share the BASIC_MODEL section of your config.yml?

I am also trying to use a local model but have failed so far. Your Ollama seems to be working fine, judging from the log line "2025-05-12 17:06:30,774 - httpx - INFO - HTTP Request: POST http://localhost:11434/v1/chat/completions "HTTP/1.1 200 OK""

My BASIC_MODEL is:

BASIC_MODEL:
  model: "qwen2.5:7b-instruct"
  api_key: fake
  base_url: "http://localhost:11434/v1"

jimleee (May 14 '25)


Had the same issue. Ollama doesn't use v1/chat/completions: https://github.com/ollama/ollama/blob/main/docs/api.md#generate-a-chat-completion

My config.yaml:

BASIC_MODEL:
  model: "llama3.2:3b"
  base_url: "http://localhost:11434/" # Local service address of Ollama, which can be started/viewed via ollama serve
  api_key: xxxx
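A quick way to confirm what the local server actually answers on is to hit Ollama's native chat endpoint directly (documented at the link above). A minimal sketch using httpx (already in the stack, per the logs), assuming the default port and the llama3.2:3b model from the config:

import httpx

# POST to Ollama's native /api/chat endpoint (not the OpenAI-style path).
resp = httpx.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "llama3.2:3b",
        "messages": [{"role": "user", "content": "hello"}],
        "stream": False,  # ask for one JSON response instead of a stream
    },
    timeout=60.0,
)
print(resp.status_code)
print(resp.json()["message"]["content"])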

0xlws2 (May 22 '25)


I suggest you change the base_url to http://localhost:11434/v1.
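Concretely, that would turn the config above into something like this (a sketch; the model name and api_key placeholder are carried over from the earlier comment, and /v1 is Ollama's OpenAI-compatible endpoint):

BASIC_MODEL:
  model: "llama3.2:3b"
  base_url: "http://localhost:11434/v1" # OpenAI-compatible endpoint; note the /v1 suffix
  api_key: xxxx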

CN-Linzhisen (Sep 23 '25)