
[Question]: System Model Settings cannot find Chat model

Open · argajeng opened this issue 10 months ago · 6 comments

Self Checks

  • [x] I have searched for existing issues, including closed ones.
  • [x] I confirm that I am using English to submit this report (Language Policy).
  • [x] Non-English title submissions will be closed directly (Language Policy).
  • [x] Please do not modify this template :) and fill in all the required fields.

Describe your problem

  • Chat model: deepseek-r1:8b served by Ollama on Windows 10 22H2
  • Docker host: Linux a-VirtualBox 6.11.0-19-generic #19-24.04.1-Ubuntu SMP PREEMPT_DYNAMIC Mon Feb 17 11:51:52 UTC 2 x86_64 x86_64 x86_64 GNU/Linux
  • Docker Compose version: 2.20.2
  • RAGFlow image: v0.17.2

[screenshot]

argajeng · Mar 14 '25

I have reinstalled it many times already.

argajeng · Mar 14 '25

You need to put a random string in the API Token field when you add the Ollama model; otherwise it will not be shown in the System Model Settings list.

chen-wu · Mar 14 '25
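If the model still fails to appear, it is also worth confirming that the base URL you give RAGFlow actually reaches Ollama and that the model name matches exactly what Ollama reports. A minimal sketch, assuming Ollama listens on its default port 11434 and is reachable as host.docker.internal from inside the RAGFlow container (both are assumptions; adjust to your setup):

```python
# Minimal reachability check -- not part of RAGFlow. OLLAMA_BASE_URL should be
# whatever you enter in RAGFlow's "Base URL" field; host.docker.internal is an
# assumption for Docker Desktop-style setups.
import requests

OLLAMA_BASE_URL = "http://host.docker.internal:11434"

resp = requests.get(f"{OLLAMA_BASE_URL}/api/tags", timeout=5)
resp.raise_for_status()

# /api/tags is Ollama's endpoint for listing locally pulled models.
models = [m["name"] for m in resp.json().get("models", [])]
print(models)  # expect something like ['deepseek-r1:8b']
```

If this request fails when run from inside the RAGFlow container, the problem is networking rather than the API token.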

> You need to put a random string in the API Token field when you add the Ollama model; otherwise it will not be shown in the System Model Settings list.

Thanks a lot, it works! But why?

a48475715 · Mar 14 '25

> You need to put a random string in the API Token field when you add the Ollama model; otherwise it will not be shown in the System Model Settings list.

But Ollama is deployed locally.

[screenshot]

argajeng · Mar 17 '25

@a48475715 I think it is a bug. Maybe they have already fixed it in the latest version. @argajeng Just put something there; your locally deployed Ollama does not need it, but RAGFlow will not list your model if you leave the field empty.

chen-wu · Mar 17 '25
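For what it is worth, the behaviour is consistent with a listing filter that treats an empty API key as "not configured". The sketch below is a hypothetical illustration of that pattern, not RAGFlow's actual source:

```python
# Hypothetical illustration only -- not RAGFlow's actual code. A filter that
# treats a falsy api_key as "unconfigured" hides a local Ollama entry even
# though Ollama itself requires no key.
def list_chat_models(configured_models):
    """Return the models eligible for the System Model Settings dropdown."""
    return [m for m in configured_models if m.get("api_key")]

models = [
    {"name": "deepseek-r1:8b", "provider": "Ollama", "api_key": ""},       # hidden
    {"name": "deepseek-r1:8b", "provider": "Ollama", "api_key": "dummy"},  # listed
]
print(list_chat_models(models))  # only the entry with the dummy key survives
```

Any non-empty placeholder satisfies such a check, which would explain why the random-token workaround works.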

> @a48475715 I think it is a bug. Maybe they have already fixed it in the latest version. @argajeng Just put something there; your locally deployed Ollama does not need it, but RAGFlow will not list your model if you leave the field empty.

Yes, thanks. I just tried calling the Tongyi-Qianwen chat model through the API and was able to select it normally. I don't know why the locally deployed Ollama model doesn't appear in the model list!

argajeng · Mar 17 '25

I resolved the issue by installing the v0.15.1 full image.

argajeng · Mar 18 '25