LM Studio integration
👋 I see that local LLMs are on the roadmap. Are there any plans for LM Studio as well? Is there anything needed to make adding it easier?
@mrdjohnson We have a branch with a working Ollama integration, but I'm not a big user of LM Studio. However, from what I know it's just an endpoint, right?
If you have any more context on how you'd like to use it, that would be great. Also, I think this may just require a slight change to how configs are managed.
@killind-dev LM Studio is a local platform for running LLMs, and it has great support for a lot of Hugging Face models as well. The server it ships with is OpenAI API compatible. Like Ollama, LM Studio would let people run Devon on local generation.
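For reference, here's a minimal sketch of what talking to that server looks like. It assumes LM Studio's local server is running on its default port (1234) and uses the standard `openai` Python client; the model name and prompt are placeholders.

```python
from openai import OpenAI

# LM Studio's local server speaks the OpenAI API, so the standard client
# works by pointing base_url at it. The api_key is ignored by the local
# server but the client requires one. Port 1234 is LM Studio's default.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

response = client.chat.completions.create(
    model="local-model",  # placeholder; LM Studio serves whichever model is loaded
    messages=[{"role": "user", "content": "Say hello from a local model."}],
)
print(response.choices[0].message.content)
```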
Cool, sounds like this is what we want? Just use the completion function in litellm?
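Roughly, yes. A hedged sketch of what that could look like: the `openai/` prefix routes litellm through its OpenAI-compatible provider, and `api_base` points at the local server. The URL below assumes LM Studio's default port; the model name is a placeholder.

```python
from litellm import completion

# litellm's completion() can target any OpenAI-compatible server via the
# "openai/" provider prefix plus an explicit api_base. Ollama's
# OpenAI-compatible endpoint would be wired up the same way.
response = completion(
    model="openai/local-model",           # placeholder model name
    api_base="http://localhost:1234/v1",  # assumed LM Studio default address
    api_key="not-needed-locally",         # required by the client, ignored locally
    messages=[{"role": "user", "content": "Hello from Devon via litellm"}],
)
print(response.choices[0].message.content)
```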
@mrdjohnson Just merged custom model config support, so in theory this is also covered as long as the endpoint is OpenAI-compatible! Will be updating the documentation soon with this information. Closing for now.
@akiradev0x Excited to hear about the custom model config support! That would definitely work for LM Studio as well as Ollama. I'll work on getting LM Studio into litellm and circle back here to try to get first-party LM Studio support! It would be an honor for our users to see affiliation with awesome projects like Devon, so I'll try to make it easier on your end!
Thanks for everything by the way!