A warning: Using this model may not work as intended
The new version is much better: it started working easily, including with Vulkan offloading.
The local mode returns a warning:
[node-llama-cpp] Using this model ("~/.humanifyjs/models/Phi-3.1-mini-4k-instruct-Q4_K_M.gguf") to tokenize text with special tokens and then detokenize it resulted in a different text. There might be an issue with the model or the tokenizer implementation. Using this model may not work as intended
Should I ignore this warning, or is it unexpected and something I should investigate?
Yes, you can ignore it; this warning is a known issue for now. The model should work fine, at least in my testing. I'll check later whether I can find a fix for this.
I'll keep this issue open for now so I don't forget to fix it.
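For context, the warning comes from a round-trip sanity check: the text is tokenized (with special tokens), detokenized again, and a warning is emitted when the result differs from the input. Here is a minimal sketch of that idea using stand-in stub tokenizers — these are not the real node-llama-cpp API, just an illustration of the check:

```javascript
// Sketch of a tokenize -> detokenize round-trip check (stubs, not node-llama-cpp).
// A real tokenizer maps text to token IDs; here we fake it with word splitting.
function roundTripMatches(text, tokenize, detokenize) {
  const tokens = tokenize(text);
  // The warning fires when this comparison fails for the loaded model.
  return detokenize(tokens) === text;
}

const tokenize = (text) => text.split(" ");

// A lossless detokenizer reproduces the input exactly.
const losslessDetokenize = (tokens) => tokens.join(" ");

// A lossy detokenizer (e.g. normalizing case or whitespace) breaks the
// round trip, which is the situation the warning describes.
const lossyDetokenize = (tokens) => tokens.join(" ").toLowerCase();

console.log(roundTripMatches("Hello World", tokenize, losslessDetokenize)); // true
console.log(roundTripMatches("Hello World", tokenize, lossyDetokenize));    // false
```

A failing round trip does not necessarily mean broken generation — it can stem from how special tokens or whitespace are normalized by a particular GGUF tokenizer — which is consistent with the model still working in practice.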