[Question]: System Model Settings cannot find Chat model
Self Checks
- [x] I have searched for existing issues, including closed ones.
- [x] I confirm that I am using English to submit this report (Language Policy).
- [x] Non-English title submissions will be closed directly (Language Policy).
- [x] Please do not modify this template :) and fill in all the required fields.
Describe your problem
- Chat model: deepseek-r1:8b served by Ollama on Windows 10 22H2
- RAGFlow: Docker install on Linux in VirtualBox (`Linux a-VirtualBox 6.11.0-19-generic #19-24.04.1-Ubuntu SMP PREEMPT_DYNAMIC Mon Feb 17 11:51:52 UTC 2 x86_64 x86_64 x86_64 GNU/Linux`)
- Docker Compose version: 2.20.2
- RAGFlow image: v0.17.2
I have reinstalled it many times already.
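Before blaming the model settings, it can help to confirm that Ollama is actually reachable from the machine running RAGFlow and that the model name matches what Ollama serves. A minimal Python sketch, assuming a stock Ollama server on the default port 11434 reachable as `host.docker.internal` from inside Docker (adjust the base URL to your setup):

```python
import requests

# Assumed address; replace with wherever your Ollama server listens.
# From inside a Docker container, "localhost" is the container itself,
# so the Windows host is typically reachable as host.docker.internal.
OLLAMA_BASE_URL = "http://host.docker.internal:11434"

# Ollama's /api/tags endpoint lists the models it serves; the model
# name entered in RAGFlow (e.g. "deepseek-r1:8b") must match exactly.
resp = requests.get(f"{OLLAMA_BASE_URL}/api/tags", timeout=10)
resp.raise_for_status()
for model in resp.json().get("models", []):
    print(model["name"])
```

If this prints your model but RAGFlow still does not list it, the problem is on the RAGFlow side rather than the Ollama deployment.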
You need to put a random string in the API Token field when you add the Ollama model; otherwise it will not be shown in the system model settings list.
Thanks a lot, it works! But why?
But Ollama is deployed locally.
@a48475715 I think it is a bug; maybe they have already fixed it in the latest version. @argajeng Just put anything in there. Your locally deployed Ollama does not need it, but RAGFlow will not list your model if you leave this field empty.
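To see that the token value is meaningless to Ollama itself, here is a small Python sketch (the base URL and model name are assumptions carried over from this thread): a stock Ollama server performs no authentication, so the request succeeds with or without the header, and the "random string" only satisfies RAGFlow's form check.

```python
import requests

OLLAMA_BASE_URL = "http://host.docker.internal:11434"  # assumed address

# A stock Ollama server does no authentication and ignores the
# Authorization header entirely, so any placeholder value works here.
resp = requests.post(
    f"{OLLAMA_BASE_URL}/api/generate",
    json={"model": "deepseek-r1:8b", "prompt": "Say hi", "stream": False},
    headers={"Authorization": "Bearer any-random-string"},
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```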
Yes, thanks. I just tried calling the Tongyi-Qianwen chat model through the API and was able to select it normally. I still don't know why the locally deployed Ollama model doesn't show up in the model list!
I resolved the issue by installing the v0.15.1 full image.