ragflow
[Bug]: When adding a rerank model with ollama, a KeyError('Ollama') is prompted
Is there an existing issue for the same bug?
- [x] I have checked the existing issues.
RAGFlow workspace code commit ID
none
RAGFlow image version
v0.16.0 slim
Other environment information
Actual behavior
When adding a rerank model with Ollama, RAGFlow raises KeyError('Ollama'). However, calling the Ollama API directly works fine:
curl -s localhost:11434/api/embed -d '{"model":"linux6200/bge-reranker-v2-m3:latest","input":"The weather is nice today"}'
I have also successfully added other embedding models with Ollama.
Expected behavior
No response
Steps to reproduce
Use Ollama to add a rerank model
Additional information
2025-02-23 11:07:48,705 INFO 15 172.19.0.7 - - [23/Feb/2025 11:07:48] "GET /v1/llm/factories HTTP/1.1" 200 -
2025-02-23 11:07:49,493 ERROR 15 'Ollama'
Traceback (most recent call last):
File "/ragflow/.venv/lib/python3.10/site-packages/flask/app.py", line 880, in full_dispatch_request
rv = self.dispatch_request()
File "/ragflow/.venv/lib/python3.10/site-packages/flask/app.py", line 865, in dispatch_request
return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args) # type: ignore[no-any-return]
File "/ragflow/.venv/lib/python3.10/site-packages/flask_login/utils.py", line 290, in decorated_view
return current_app.ensure_sync(func)(*args, **kwargs)
File "/ragflow/api/utils/api_utils.py", line 170, in decorated_function
return func(*_args, **_kwargs)
File "/ragflow/api/apps/llm_app.py", line 238, in add_llm
mdl = RerankModel[factory](
KeyError: 'Ollama'
2025-02-23 11:07:49,494 INFO 15 172.19.0.7 - - [23/Feb/2025 11:07:49] "POST /v1/llm/add_llm HTTP/1.1" 200 -
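The traceback shows the failure happens at `mdl = RerankModel[factory]` in `llm_app.py`. A likely cause (an assumption, not confirmed against RAGFlow's source) is that `RerankModel` is a registry dict mapping factory names to model classes, and `"Ollama"` was not registered as a rerank provider in v0.16.0, so a plain dict lookup raises `KeyError`. A minimal sketch of the pattern and a friendlier guard:

```python
# Hypothetical sketch of the failing pattern; names and registry entries
# are placeholders, not RAGFlow's actual code.
RerankModel = {
    "Jina": object,        # placeholder entries; the real registry differs
    "Xinference": object,
    # "Ollama" is missing -> RerankModel["Ollama"] raises KeyError('Ollama')
}

def add_rerank_llm(factory: str):
    # Guarding the lookup turns the bare KeyError into a clear error message.
    if factory not in RerankModel:
        raise ValueError(f"Rerank model factory {factory!r} is not supported")
    return RerankModel[factory]
```

This would also explain why adding embedding models with Ollama succeeds: the embedding registry presumably has an `"Ollama"` entry while the rerank registry does not.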