
LM Studio integration

Open mrdjohnson opened this issue 1 year ago • 3 comments

👋 I see that local LLMs are on the roadmap, are there any plans for LM Studio as well? Is there anything needed to make adding it easier?

mrdjohnson avatar May 20 '24 16:05 mrdjohnson

@mrdjohnson We have a branch with a working Ollama integration, but I'm not a big user of LM Studio. From what I know, though, it's just an HTTP endpoint, right?

If you have any more context on how you'd like to use it that would be great. Also, I think this may just require a slight change to managing configs.

akiradev0x avatar May 20 '24 17:05 akiradev0x

@killind-dev LM Studio is a local platform for running LLMs, and it has great support for a lot of Hugging Face models as well. The server it ships with is OpenAI API compatible. Like Ollama, LM Studio would be an option for running Devon on local generation.

mrdjohnson avatar May 21 '24 06:05 mrdjohnson

Cool, sounds like this is what we want? Just use the completion function in litellm?

akiradev0x avatar May 21 '24 07:05 akiradev0x
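The routing discussed above can be sketched in Python. This is a hedged sketch, not Devon's actual configuration: the helper name `lm_studio_kwargs`, the placeholder model name `openai/local-model`, and the dummy API key are assumptions; `http://localhost:1234/v1` is LM Studio's default local server address, and litellm's `"openai/"` model prefix routes requests through its OpenAI-compatible client.

```python
def lm_studio_kwargs(prompt: str,
                     model: str = "openai/local-model",
                     api_base: str = "http://localhost:1234/v1") -> dict:
    """Build keyword arguments for litellm.completion() targeting an
    OpenAI-compatible local server such as LM Studio's.

    (Illustrative helper, not part of Devon or litellm.)
    """
    return {
        "model": model,          # "openai/" prefix = use the OpenAI-compatible client
        "api_base": api_base,    # LM Studio's default local server address
        "api_key": "lm-studio",  # LM Studio ignores the key, but a value is required
        "messages": [{"role": "user", "content": prompt}],
    }

# Usage (assumes litellm is installed and an LM Studio server is running):
# from litellm import completion
# response = completion(**lm_studio_kwargs("Hello"))
# print(response.choices[0].message.content)
```

Keeping the actual `completion()` call out of the helper makes the endpoint/model wiring testable without a live server, which is roughly why a generic "custom model config" covers LM Studio, Ollama, and any other OpenAI-compatible backend at once.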

@mrdjohnson Just merged custom model config support, so in theory this is covered as long as the endpoint is OpenAI-compatible! Will be updating the documentation soon with this information. Closing for now.

akiradev0x avatar May 27 '24 22:05 akiradev0x

@akiradev0x Excited to hear about the custom model config support! That would definitely work for LM Studio as well as Ollama. I'll work on getting LM Studio into litellm and circle back here to try to get first-party LM Studio support. It would be an honor for our users to see affiliation with awesome projects like Devon, so I'll try to make it easier on your end!

Thanks for everything by the way!

mrdjohnson avatar May 28 '24 08:05 mrdjohnson