
Possible issue with Ollama version and response format

Open windprince opened this issue 11 months ago • 3 comments

Describe your issue

model["name"] must be changed to model.model because the response format changed between Ollama versions (0.5.0 -> 0.5.4)

How To Reproduce

Steps to reproduce the behavior (example):

  1. python devika.py
  2. bun run start --host
  3. go to http://localhost:3001/

Expected behavior

Browser should respond and allow user to select search engine, select model, etc.

Screenshots and logs

The log file/stderr in the 'python devika.py' terminal indicates an error:

Traceback (most recent call last):
  File "/devika/devika.py", line 63, in data
    models = LLM().list_models()
  File "/devika/src/llm/llm.py", line 71, in __init__
    self.models["OLLAMA"] = [(model["name"], model["name"]) for model in ollama.models]
  File "/devika/.venv/lib/python3.12/site-packages/ollama/_types.py", line 33, in __getitem__
    raise KeyError(key)
KeyError: 'name'
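For context, the KeyError happens because newer versions of the ollama Python client return typed response objects rather than plain dicts, so dict-style access with the old "name" key fails while attribute access on .model works. A minimal stand-in class (not the real client type) illustrating that behavior:

```python
class TypedModel:
    """Sketch of a typed response entry like newer ollama clients return.
    Attribute access works; subscripting an unknown key raises KeyError,
    which is what the traceback above shows from ollama/_types.py."""

    def __init__(self, model: str):
        self.model = model

    def __getitem__(self, key):
        # Fall back to attribute lookup; unknown keys raise KeyError
        try:
            return getattr(self, key)
        except AttributeError:
            raise KeyError(key) from None


m = TypedModel("llama3:8b")
print(m.model)      # attribute access works: llama3:8b
try:
    m["name"]       # the old dict-style key no longer exists
except KeyError as e:
    print(f"KeyError: {e}")
```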

Configuration

- OS: Linux 24.04
- Python version: 3.12.3
- Node version: v18.19.1
- bun version: 1.1.43
- ollama version: 0.5.4
- search engine: N/A
- model: N/A

Additional context

Code modified slightly from:

if ollama.client:
    self.models["OLLAMA"] = [(model["name"], model["name"]) for model in ollama.models]

To:

if ollama.client:
    try:
        # Create tuples using the model name attribute
        self.models["OLLAMA"] = [(model.model, model.model) for model in ollama.models]
    except Exception as e:
        print(f"Error loading Ollama models: {e}")
        # Initialize with empty list if there's an error
        self.models["OLLAMA"] = []
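If you want the fix to keep working across both old and new client versions, a defensive helper can accept either shape. This is a sketch, not part of devika; the helper name ollama_model_names is made up here:

```python
from types import SimpleNamespace


def ollama_model_names(models):
    """Extract model names whether entries are plain dicts (older
    ollama clients) or typed objects with a .model attribute (newer)."""
    names = []
    for m in models:
        if isinstance(m, dict):  # older client: dict entries
            name = m.get("name") or m.get("model")
        else:                    # newer client: typed objects
            name = getattr(m, "model", None) or getattr(m, "name", None)
        if name:
            names.append(name)
    return names


# Works with both shapes (SimpleNamespace stands in for a typed entry):
print(ollama_model_names([{"name": "llama3:8b"}, SimpleNamespace(model="qwen2:7b")]))
# -> ['llama3:8b', 'qwen2:7b']
```

In llm.py the list comprehension would then become [(n, n) for n in ollama_model_names(ollama.models)], removing the version dependency entirely.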

windprince avatar Jan 15 '25 15:01 windprince

It works, thanks!!

guanchao avatar Jan 16 '25 01:01 guanchao

where should i change this?

VMDProjects avatar Mar 23 '25 22:03 VMDProjects

devika/src/llm/llm.py

saladinlorenz avatar May 14 '25 20:05 saladinlorenz