
[question] How to configure the LLM to use a local Ollama service?

Open RTException opened this issue 7 months ago • 1 comment

Describe your question

After configuring the Ollama address on the page, I clicked the "Large Model Connection Test" button, and the page reported that the model connection failed, even though qwen:7b is running in Ollama on my local machine.

Provide any additional context or information

No response

What have you tried to resolve your question

I have consulted the "Configure LLM" section in the Docs. In its screenshot, only the Interface Protocol field is filled in with OLLAMA; the Model Name and Base URL fields are both empty.
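Before retrying the connection test in SuperSonic, it may help to confirm that Ollama itself is reachable and that the model tag is actually installed. The sketch below probes Ollama's standard `/api/tags` endpoint; the default base URL `http://localhost:11434` and the model name `qwen:7b` are assumptions taken from this report, not values SuperSonic requires.

```python
import json
import urllib.request

def model_present(tags_json: str, model: str) -> bool:
    # Parse the JSON body returned by Ollama's /api/tags endpoint
    # and check whether the given model tag is installed.
    data = json.loads(tags_json)
    return any(m.get("name") == model for m in data.get("models", []))

def check_ollama(base_url: str = "http://localhost:11434",
                 model: str = "qwen:7b") -> bool:
    # Fetch the installed-model list from a running Ollama server.
    # Raises URLError if Ollama is not listening at base_url.
    with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
        return model_present(resp.read().decode("utf-8"), model)
```

If `check_ollama()` raises a connection error, the Base URL is wrong or Ollama is not running; if it returns `False`, the model tag in the configuration does not match what `ollama list` shows. Either way, both Model Name and Base URL likely need to be filled in on the configuration page, not just the Interface Protocol.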

Your environment

No response

Screenshots or Logs

No response

Additional information

No response

RTException · Jul 17 '24 08:07