supersonic
[question] How to configure the LLM to use a local Ollama service?
Describe your question
After entering the Ollama address on the page, I clicked the "Large Model Connection Test" button, and the page reported that the model connection failed, even though qwen:7b is running in Ollama on my local machine.
Provide any additional context or information
No response
What have you tried to resolve your question
I have consulted the "Configure LLM" section of the Docs. In my screenshot, only the Interface Protocol field is set to OLLAMA; the Model Name and Base URL fields are both empty.
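Since the Model Name and Base URL fields are empty, the connection test has nothing to reach. A likely fix, assuming Ollama is running on its default port (11434) on the same machine, is to fill in all three fields, for example:

```
Interface Protocol: OLLAMA
Model Name:         qwen:7b            # must match the model name shown by `ollama list`
Base URL:           http://localhost:11434   # default Ollama endpoint; adjust host/port if changed
```

The exact field labels follow the screenshot described above; the Base URL and port are assumptions based on Ollama's defaults, not confirmed supersonic settings.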
Your environment
No response
Screenshots or Logs
No response
Additional information
No response