
Offline local LLM inference

Open niutech opened this issue 3 weeks ago • 0 comments

Is your feature request related to a problem? Please describe.
I'd like to use a local LLM offline with Chainlit, e.g. via Transformers.js, MediaPipe, or WebLLM.

Describe the solution you'd like
I'd like Chainlit to support invoking local LLM inference in the web browser, similar to how Gradio Lite works with Transformers.js.
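
For context, here is a minimal sketch of what browser-side inference with WebLLM looks like today, independent of Chainlit. It uses WebLLM's OpenAI-style chat API; the model ID and progress callback are illustrative (check WebLLM's prebuilt model list for current IDs), and nothing here is an existing Chainlit API:

```ts
import { CreateMLCEngine } from "@mlc-ai/web-llm";

async function main() {
  // Download and initialize a quantized model in the browser.
  // The weights are cached locally, so later runs can work offline.
  const engine = await CreateMLCEngine("Llama-3.2-1B-Instruct-q4f32_1-MLC", {
    initProgressCallback: (report) => console.log(report.text),
  });

  // OpenAI-style chat completion, executed entirely client-side via WebGPU.
  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: "Hello from the browser!" }],
  });
  console.log(reply.choices[0].message.content);
}

main();
```

A Chainlit integration would presumably need to route messages from the Chainlit frontend to an in-browser engine like this instead of (or in addition to) the Python backend.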

Describe alternatives you've considered
An alternative is to use Gradio Lite or WebLLM Chat.

niutech · Oct 29 '25 21:10