
[FEATURE] Allow Config of Local LLM to use with AI Chat feature

Open moringaman opened this issue 1 year ago • 4 comments

I love that we can now use an OpenAI API key with the new AI Chat feature, but it would be great if you could also add support for a locally hosted LLM. Like many developers, I use tools such as Ollama to run various models locally. This would be a great cost-saving alternative, and it would let us use models that have been trained on our own codebases.

I also wouldn't have thought that this would be that big a lift for the RunJS team.

Thanks :-)

moringaman avatar Oct 18 '24 20:10 moringaman

Thanks @moringaman. I'd like to provide local LLM support at some point.

I also wouldn't have thought that this would be that big a lift for the RunJS team.

This might be true if there were a RunJS team 😅

lukehaas avatar Oct 19 '24 10:10 lukehaas

Could the AI settings in RunJS support a custom model name and a custom API base URL for OpenAI-compatible endpoints?

bisslot avatar Apr 04 '25 04:04 bisslot
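For context on why a configurable base URL and model name would be enough: local runtimes like Ollama expose an OpenAI-compatible chat endpoint, so the same request shape works against api.openai.com or a local server. A minimal sketch, assuming Ollama's default port (11434) and a locally pulled model named `llama3` (the helper function and its parameters are illustrative, not part of RunJS):

```javascript
// Build an OpenAI-compatible chat request for an arbitrary base URL,
// e.g. a local Ollama server at http://localhost:11434/v1 (an assumption;
// adjust to your local setup).
function buildChatRequest(baseUrl, model, messages, apiKey = "ollama") {
  return {
    url: `${baseUrl.replace(/\/$/, "")}/chat/completions`,
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        // Local servers typically ignore the key, but strict
        // OpenAI-compatible clients still send the header.
        Authorization: `Bearer ${apiKey}`,
      },
      body: JSON.stringify({ model, messages }),
    },
  };
}

// Point at a local Ollama instance instead of api.openai.com:
const req = buildChatRequest(
  "http://localhost:11434/v1",
  "llama3",
  [{ role: "user", content: "Explain closures in one sentence." }]
);

// Sending it is a plain fetch; the same call works against any
// OpenAI-compatible server:
// fetch(req.url, req.options).then(r => r.json()).then(console.log);
```

The point is that only the base URL, model name, and (optionally) the API key differ between providers, which is why exposing those three settings covers most local setups.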

Allowing RunJS to act as an MCP host would enable this. I know OpenAI supports the protocol as a server.

EdwardIrby avatar Apr 16 '25 20:04 EdwardIrby

👋 I wholeheartedly endorse adding support for locally hosted LLMs to RunJS's AI Chat feature. Like many other developers, I run several models locally using tools such as LM Studio and Ollama. Beyond the substantial cost savings, we could use models trained on our own codebases, which is a major privacy and customization advantage.

In my opinion, this feature would be a terrific addition to RunJS and would significantly increase its appeal to users who value offline functionality and data security.

Thanks for considering this improvement. I'm excited to see it land!

gildas-refacto avatar May 28 '25 10:05 gildas-refacto

@moringaman @bisslot @gildas-refacto local LLM support is available now.

lukehaas avatar Oct 12 '25 18:10 lukehaas