web-llm

Support for LiquidAI/LFM2-1.2B

Open JoseGhDark-commits opened this issue 4 months ago • 2 comments

Hi! It would be amazing to have support for LiquidAI: their models deliver very good responses relative to their hardware requirements.

JoseGhDark-commits avatar Aug 20 '25 09:08 JoseGhDark-commits

I wonder if we can just convert weights from LFM2

rimmer avatar Sep 03 '25 12:09 rimmer
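For what it's worth, WebLLM only loads models that have been converted and compiled through the MLC-LLM pipeline, so "just converting the weights" would also require a compatible compiled model library (wasm). Assuming such a conversion existed, registering it in the browser would look roughly like the sketch below; the `model_id`, URLs, and overrides are placeholders, not real artifacts:

```ts
import * as webllm from "@mlc-ai/web-llm";

// Hypothetical entry for an MLC-converted LFM2 build; none of these
// artifacts exist today -- the URLs and model_id are placeholders.
const appConfig: webllm.AppConfig = {
  model_list: [
    {
      model: "https://huggingface.co/<org>/LFM2-1.2B-q4f16_1-MLC", // converted weights
      model_id: "LFM2-1.2B-q4f16_1-MLC",
      model_lib: "https://<host>/LFM2-1.2B-q4f16_1-webgpu.wasm",   // compiled model library
      overrides: { context_window_size: 4096 },
    },
  ],
};

// Load the custom record instead of one of the prebuilt models.
// (Top-level await assumes an ES module context.)
const engine = await webllm.CreateMLCEngine("LFM2-1.2B-q4f16_1-MLC", {
  appConfig,
  initProgressCallback: (p) => console.log(p.text),
});
```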

LFM2-8B-A1B is my favorite. It is a Mixture-of-Experts (MoE) model with 8.3B total parameters and 1.5B active parameters per token. The answer quality is similar to qwen3:8b, but it is much faster. It runs fast even on CPU; I wonder how it would perform with WebLLM on a smartphone.

oestape avatar Nov 04 '25 22:11 oestape
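If a build like the one sketched above ever existed, trying it on a phone would just mean opening a WebGPU-capable mobile browser and calling WebLLM's OpenAI-style chat API against that engine; the model and engine here are the hypothetical ones from the previous sketch:

```ts
// Minimal smoke test once the engine above has loaded; streaming keeps the
// page responsive on a phone while tokens arrive.
const chunks = await engine.chat.completions.create({
  messages: [{ role: "user", content: "Give me three facts about WebGPU." }],
  temperature: 0.7,
  max_tokens: 256,
  stream: true,
});

let reply = "";
for await (const chunk of chunks) {
  reply += chunk.choices[0]?.delta?.content ?? "";
}
console.log(reply);
```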