continue
add Msty provider
This is the app: https://msty.app
Deploy Preview for continuedev canceled.
| Name | Link |
|---|---|
| Latest commit | aaf6d6328d14c3df9126b9e4b964e95d7894558f |
| Latest deploy log | https://app.netlify.com/sites/continuedev/deploys/65f9854cbc4e8d0008945e91 |
@ashokgelal thanks for the PR! I want to make sure I understand how Msty works before merging: is it using Ollama under the hood, or does it just match its API format? Any reason why not OpenAI? And I assume from the code here that 10000 is the default port?
It is using Ollama under the hood but maintains its own version (we check for compatibility issues with the models we support, for example), and it is also very opaque to users, if you will, as they don't have to worry about installing Ollama; we'll do that for them. Not OpenAI because we do support local models (not sure if I interpreted your question about that correctly).
10000 is the default port, but if that port is taken by another app, Msty picks up the next available port.
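Because of this fallback, a client can't assume Msty is always on 10000. A minimal sketch of how a client could locate the active port by probing from the default upward (the helper name, scan range, and timeout are illustrative assumptions, not part of Msty's API):

```python
import socket

def find_msty_port(start=10000, limit=20, host="127.0.0.1", timeout=0.2):
    """Probe consecutive TCP ports beginning at Msty's default (10000).

    Returns the first port that accepts a connection, or None if no
    port in the scanned range is listening.
    """
    for port in range(start, start + limit):
        try:
            # create_connection raises OSError if nothing is listening
            with socket.create_connection((host, port), timeout=timeout):
                return port
        except OSError:
            continue
    return None
```

This only confirms something is listening; a robust client would additionally hit an Ollama-style endpoint to verify it is actually Msty.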
@ashokgelal all makes sense! With the OpenAI question I was just wondering why not use the OpenAI API format, since it seems most people are moving toward that, including Ollama.
Would it be easy for you to switch the base branch of this PR from main to preview, and then I'll probably just make a few tweaks?
I just switched the base branch to preview. Feel free to make any necessary tweaks and please let me know if you need anything from my side.
@ashokgelal It took me a minute to get around to this, but I'm very excited to have Msty as an official provider! I decided to keep it in just config.json for now because the UI can get overwhelming, but if you have the chance to try this out when we make the next release, I'd love to hear how it works for you. If there are any extra updates we should make, we'll be much faster with a PR.
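For anyone finding this thread later, a sketch of what the config.json entry might look like (the `model` value is an illustrative assumption; 10000 is Msty's default port per the discussion above, and may differ if that port was taken):

```json
{
  "models": [
    {
      "title": "Msty",
      "provider": "msty",
      "model": "llama2",
      "apiBase": "http://localhost:10000"
    }
  ]
}
```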
@sestinj Thanks for merging and sounds good. Definitely looking forward to trying it and possibly writing a blog post and a video.