
Error loading certain models with "no triggers set for lazy grammar"

johndev168 opened this issue 3 months ago

LocalAI version: 3.6.0 (latest commit #6438)

Environment, CPU architecture, OS, and version: Mac (Apple T8132), macOS 15.5 (24F74), no VM

Describe the bug

When trying to load certain models, LocalAI throws an error and refuses to load the model. This happened with ai21labs_AI21-Jamba-Reasoning-3B-Q4_K_M.

ERR Stream ended with error: rpc error: code = InvalidArgument desc = Error: no triggers set for lazy grammar!

To Reproduce

Download the model ai21labs_ai21-jamba-reasoning-3b first. I changed the backend to metal-llama-cpp, but that does not change anything; plain llama-cpp does not work either. When switching to the chat tab and asking the model anything, it does not respond. The error quoted above is thrown in the console and the model is not used. The same failure can also be reproduced without the web UI by posting directly to the chat endpoint, as shown in the sketch below.
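A minimal reproduction sketch, assuming LocalAI is listening on its default port 8080 (the base URL is an assumption; the model ID and the /v1/chat/completions path are taken from the logs below):

# repro.py - send one chat request to the OpenAI-compatible endpoint.
# Assumption: LocalAI is reachable at http://localhost:8080.
import requests

resp = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={
        "model": "ai21labs_ai21-jamba-reasoning-3b",
        "messages": [{"role": "user", "content": "Hello"}],
    },
    timeout=120,
)
print(resp.status_code)
# Expected (per this report): the lazy-grammar error instead of a completion.
print(resp.text)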

Expected behavior

The model should load and respond to the request.

Logs

9:09AM ERR guessDefaultsFromFile(TotalAvailableVRAM): gpuFillInfo not implemented on darwin
9:09AM INF Success ip=10.20.111.109 latency=139.045167ms method=POST status=200 url=/v1/chat/completions
9:09AM INF BackendLoader starting backend=metal-llama-cpp modelID=ai21labs_ai21-jamba-reasoning-3b o.model=ai21labs_AI21-Jamba-Reasoning-3B-Q4_K_M.gguf
Error rpc error: code = InvalidArgument desc = Error: no triggers set for lazy grammar!
9:09AM ERR Stream ended with error: rpc error: code = InvalidArgument desc = Error: no triggers set for lazy grammar!

Additional context

This appears to be a model-specific issue: loading a different model such as gemma does not cause this error.

johndev168, Oct 13 '25 07:10