obsidian-copilot

Support MSTY Studio in the same manner as LMStudio and Ollama

Open richardstevenhack opened this issue 5 months ago • 4 comments

Is your feature request related to a problem? Please describe.

I use MSTY Studio as my primary GUI front end to local LLM models. Copilot does not work with MSTY's embedded Ollama server at port 10000, and does not allow specifying any local LLM server port other than the presets for Ollama and LM Studio.

Describe the solution you'd like

Either directly support MSTY Studio in the same manner as Ollama and LM Studio, or preferably provide a general configuration setting for specifying the local server port (and possibly allow multiple service ports to be specified and selected ad hoc).

Describe alternatives you've considered

There aren't any, because two services cannot share the same port.

Additional context

N/A

richardstevenhack avatar Oct 27 '25 19:10 richardstevenhack

You can specify the base URL, including your port, with either the OpenAI-format provider or the LM Studio provider, as long as your MSTY Studio serves an OpenAI-compatible API.
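The suggestion above can be sketched as follows. This is a hypothetical illustration, not Copilot's actual code: the port (10000) comes from this thread, and the `/v1` prefix is the usual OpenAI API convention, so verify both against your MSTY setup.

```python
# Hypothetical sketch of the base URL an OpenAI-compatible provider
# would need to reach MSTY's embedded server. The host/port defaults
# are assumptions taken from this thread.

def openai_compatible_base_url(host: str = "localhost", port: int = 10000) -> str:
    """Build the base URL for an OpenAI-compatible local server."""
    return f"http://{host}:{port}/v1"

base = openai_compatible_base_url()
print(base)                        # value to paste into the provider's base-URL field
print(f"{base}/chat/completions")  # endpoint an OpenAI-compatible client POSTs to
print(f"{base}/models")            # quick endpoint to probe (e.g. with curl) first
```

Probing the `/models` endpoint before configuring the plugin is a cheap way to confirm the server really speaks the OpenAI-compatible API on that port.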

logancyang avatar Oct 28 '25 04:10 logancyang

I'll reiterate: this does NOT work. See the attached screenshots.

(two screenshots attached)

richardstevenhack avatar Oct 28 '25 04:10 richardstevenhack

Can you check your server log? There's nothing to work with here.

logancyang avatar Nov 01 '25 05:11 logancyang

I tried again, but this time I set the provider to Ollama instead of OpenAI-compatible, and that does work.

So I think the problem is resolved.

However, you might want to check why the OpenAI-compatible provider fails with MSTY's port while the Ollama provider succeeds on the same port. Both should behave the same against MSTY's server.
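One plausible explanation for the difference, sketched below: the Ollama provider and an OpenAI-compatible provider hit different URL paths on the same port, so a base URL that resolves for one can 404 for the other. The paths shown are the documented Ollama-native and OpenAI conventions; whether MSTY's embedded server forwards both is an assumption to check against its server log.

```python
# Sketch: the two provider types request different routes on the same port.
# Ollama's native API lives under /api/..., while OpenAI-compatible
# servers expose /v1/... routes. Port 10000 is MSTY's, per this thread.

ENDPOINTS = {
    "ollama": {
        "chat": "/api/chat",          # Ollama-native chat endpoint
        "list_models": "/api/tags",   # Ollama-native model listing
    },
    "openai-compatible": {
        "chat": "/v1/chat/completions",
        "list_models": "/v1/models",
    },
}

def chat_url(provider: str, base: str = "http://localhost:10000") -> str:
    """Full chat URL a given provider type would POST to."""
    return base + ENDPOINTS[provider]["chat"]

print(chat_url("ollama"))
print(chat_url("openai-compatible"))
```

If MSTY's server log shows 404s on the `/v1/...` routes while `/api/...` requests succeed, that would explain why only the Ollama provider works.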

richardstevenhack avatar Nov 01 '25 07:11 richardstevenhack