Support MSTY Studio in the same manner as LM Studio and Ollama
Is your feature request related to a problem? Please describe.
I use MSTY Studio as my primary GUI front end for local LLM models. Copilot does not work with MSTY's embedded Ollama server on port 10000, and the settings do not allow specifying a local server port for anything other than Ollama and LM Studio.
Describe the solution you'd like
Either support MSTY Studio directly, in the same manner as Ollama and LM Studio, or (preferably) provide a general setting so the local server port can be specified separately in the configuration (and possibly allow multiple service ports to be specified and selected ad hoc).
Describe alternatives you've considered
There aren't any, because two services cannot share the same port.
Additional context
N/A
You can specify the base URL, including your port, with either the OpenAI Format provider or the LM Studio provider, as long as your MSTY Studio serves an OpenAI-compatible API.
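For reference, pointing the OpenAI-compatible provider at MSTY's embedded server would look roughly like the following. The port comes from the report above; the exact field names and the `/v1` path are assumptions about the plugin's settings, not verified values:

```
Provider:  OpenAI Format (or LM Studio)
Base URL:  http://localhost:10000/v1
API Key:   any non-empty placeholder string
Model:     a model name as listed by the local server
```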
I'll reiterate - this does NOT work. See the attached screenshots.
Can you check your server log? There's nothing to work with here otherwise.
I tried again, but this time I set the provider to Ollama instead of OpenAI-compatible - and that does work.
So I think the problem is resolved.
However, you might want to check why the OpenAI-compatible provider does not work with MSTY's port while the Ollama provider does, even on that same port. OpenAI-Compatible pointed at MSTY's port should work the same as Ollama pointed at MSTY's port.
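One way to narrow this down is to probe both API surfaces on the same port. This is a hypothetical diagnostic, assuming MSTY's embedded server is on localhost:10000 as described above; if the second request fails while the first succeeds, the port is only exposing the Ollama-native API, which would explain the behavior reported here:

```shell
PORT=10000  # MSTY's embedded Ollama server port, per the report above

# Ollama-native API: a JSON list of installed models means the Ollama API is live.
curl -sf "http://localhost:$PORT/api/tags" || echo "Ollama API not reachable on port $PORT"

# OpenAI-compatible API: a failure here (e.g. 404) would explain why the
# OpenAI Format provider fails while the Ollama provider works.
curl -sf "http://localhost:$PORT/v1/models" || echo "OpenAI-compatible API not reachable on port $PORT"
```

`curl -f` makes an HTTP error status count as a failure, so each line prints either the server's JSON response or the fallback message.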