Is it possible to use LM Studio or Ollama?
Is there a way to use LM Studio models? The web page says it is compatible with local models, but how do I use them?
If it is possible to use Ollama or LM Studio, can someone please describe how? In the config area there is no input field for local LLM settings.
In theory it is possible, but the results are currently very poor. Over time, we will try to optimize performance for small models.
Please send instructions on how to do it, thanks!
Both Ollama and LM Studio expose OpenAI-compatible APIs. Just set the OpenAI base URL to the URL of the corresponding local service. A more detailed tutorial may have to wait until we have finished optimizing performance for small models, but it is not difficult; you can try it yourself.
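To illustrate the reply above, here is a minimal sketch of calling a local server through its OpenAI-compatible `/chat/completions` endpoint, using only the Python standard library. The base URLs assume the usual defaults (Ollama on port 11434, LM Studio on port 1234) and the model name `llama3` is a placeholder; adjust all of these to match your local setup.

```python
import json
import urllib.request

# Assumed default endpoints (verify against your local setup):
#   Ollama:    http://localhost:11434/v1
#   LM Studio: http://localhost:1234/v1
BASE_URL = "http://localhost:11434/v1"

def build_chat_request(base_url, model, prompt):
    """Build an OpenAI-compatible POST request to /chat/completions."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Local servers typically ignore the key, but OpenAI-style
            # clients often require one to be present.
            "Authorization": "Bearer not-needed",
        },
        method="POST",
    )

def chat(base_url, model, prompt):
    """Send the request and return the assistant's reply text."""
    req = build_chat_request(base_url, model, prompt)
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read().decode("utf-8"))
    return body["choices"][0]["message"]["content"]

# Example (requires a running local server and a pulled/loaded model):
# print(chat(BASE_URL, "llama3", "Hello!"))
```

If you use the official `openai` Python client instead, the same idea applies: pass the local URL as `base_url` and any non-empty string as `api_key` when constructing the client.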