[Feature Request] Support for bitnet.cpp and BitNet models
bitnet.cpp is the official inference framework for 1-bit LLMs (e.g., BitNet b1.58). It offers a suite of optimized kernels that support fast and lossless inference of 1.58-bit models on CPU (with NPU and GPU support coming next).
It would be awesome for LMS to support bitnet.cpp and therefore BitNet models, for on-device evaluation.
Are there any updates on this feature? I saw that BitNet is supported in llama.cpp (ggml-org/llama.cpp#7931). I tried to load microsoft/bitnet-b1.58-2B-4T-gguf, but I didn't succeed.
With https://arxiv.org/abs/2504.12285 released, I'm curious to try it out on my home lab. LM Studio would be great for this.
I am getting:
🥲 Failed to load the model
Failed to load model
error loading model: llama_model_loader: failed to load model from .lmstudio/models/microsoft/bitnet-b1.58-2B-4T-gguf/ggml-model-i2_s.gguf
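For anyone hitting this error: a quick sanity check you can run yourself (this is just a debugging sketch, not an official tool) is to read the GGUF header of the downloaded file. If the magic and version look sane, the file downloaded correctly, and the load failure is most likely the quantization type (the Microsoft GGUF uses bitnet.cpp's `i2_s` format, which the stock llama.cpp runtime doesn't implement) rather than a corrupt download. The function name here is my own; the header layout (magic, version, tensor count, metadata KV count, all little-endian) is from the GGUF spec.

```python
import struct

GGUF_MAGIC = b"GGUF"

def read_gguf_header(path):
    """Read the fixed-size GGUF header: 4-byte magic, uint32 version,
    uint64 tensor count, uint64 metadata key/value count (little-endian)."""
    with open(path, "rb") as f:
        magic = f.read(4)
        if magic != GGUF_MAGIC:
            raise ValueError(f"not a GGUF file (magic={magic!r})")
        version, = struct.unpack("<I", f.read(4))
        tensor_count, = struct.unpack("<Q", f.read(8))
        kv_count, = struct.unpack("<Q", f.read(8))
    return {"version": version, "tensors": tensor_count, "metadata_kvs": kv_count}

# Example (adjust the path to your own models directory):
# print(read_gguf_header(".lmstudio/models/microsoft/bitnet-b1.58-2B-4T-gguf/ggml-model-i2_s.gguf"))
```

If this raises on the magic check, re-download the file; if it prints plausible counts, the GGUF itself is fine and the problem is runtime support.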
+1
any update?
I'm having the same issue, any update yet? Trying to run microsoft/bitnet-b1.58-2B-4T on LM Studio v0.3.14 on Win10 Pro.
I also have this problem in LM Studio.
Same issue for me. I wish there were actual error messages with codes so that some sort of troubleshooting could be done.
same issue
🥲 Failed to load the model
Failed to load model
error loading model: llama_model_loader: failed to load model from .lmstudio\models\microsoft\bitnet-b1.58-2B-4T-gguf\ggml-model-i2_s.gguf
Guys, can you please add BitNet support?
+1, having the same issue
+1
+1
+1, totally the same issue
+1. Someone explain to me how this would get added. As a runtime? Or are they just waiting for the transformers library to merge some code to support BitNet models?
bitnet.cpp is very much needed... just like CPU/CUDA llama.cpp.
b1.58 models aren't loading/working at all. Please add support for them to LM Studio. Thanks in advance.
yo devs, do NOT make me vibe-code this
+1
Friendly reminder: use the :thumbsup: reaction on the original post instead of posting bump comments, which can clutter up the discussion
Any progress on this issue? Thanks.
still same issue
When will this be added?
This issue will still be open a century from now; the developers don't even look at it. @ryan-the-crayon @rugvedS07 @yagil @will-lms @sergeichestakov @neilmehta24 @mattjcly @jlguenego @mayfer @YorkieDev @CinnamonRolls1, please react and answer whether this is planned or will not be implemented.
I agree, we need confirmation if this is planned, being worked on, or not planned at all.