RunJS
[FEATURE] Allow Config of Local LLM to use with AI Chat feature
I love that we can now use an OpenAI API key with the new AI Chat feature, but I also think it would be great if you could add support for a locally hosted LLM. Like many developers, I use tools such as Ollama to run various models locally. This would be a great cost-saving alternative and would let us use models that have been trained on our own codebases.
I also wouldn't have thought this would be that big a lift for the RunJS team.
Thanks :-)
Thanks @moringaman. I'd like to provide local LLM support at some point.
I also wouldn't have thought this would be that big a lift for the RunJS team.
This might be true if there were a RunJS team 😅
Is it possible to add support for a custom OpenAI-compatible model name and a custom API endpoint address in RunJS's AI settings?
Allowing RunJS to be an MCP host would enable this. I know OpenAI supports the protocol as a server.
👋 I wholeheartedly endorse adding support for locally hosted LLMs to RunJS's AI Chat feature. Like many other developers, I run several models locally using tools such as LM Studio and Ollama. Beyond the considerable cost savings, we could use models trained on our own codebases, which is a tremendous privacy and customization advantage.
In my opinion, this feature would be a terrific addition to RunJS and would significantly increase its appeal to users who value offline functionality and data security.
Thanks for considering this improvement. I'm excited to see it put into practice!
@moringaman @bisslot @gildas-refacto local LLM support is available now.