Kind of weird responses, consisting only of repeated characters (GGGGGGGGGGGG....)
== Running in interactive mode. ==
- Press Ctrl+C to interject at any time.
- Press Return to return control to the AI.
- To return control without starting a new line, end your input with '/'.
- If you want to submit another line, end your input with '\'.
System: You are a helpful assistant
hi GGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGG
what is 229 *34+-45 GGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGG
956 is what kind of number GGGGGGGGG
If it is not a BitNet model, it likely means the conversion went slightly wrong; it can simply be that one of the layers got an additional neuron connected, which causes this. If you can, please give more info such as your OS, arch, model name and quantization type.
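If it helps with checking the quantization type, here is a minimal sketch using the gguf Python package bundled with the repo's llama.cpp (the model path here just follows the default from the BitNet docs, and the attribute names follow that package's GGUFReader, so they may differ between versions; the bundled gguf-py is needed for it to know the i2_s type):

from gguf import GGUFReader  # gguf-py package bundled with llama.cpp

# Print each tensor's name, quantization type and shape so a single
# mis-converted layer (unexpected type or shape) stands out.
reader = GGUFReader("models/BitNet-b1.58-2B-4T/ggml-model-i2_s.gguf")
for tensor in reader.tensors:
    print(tensor.name, tensor.tensor_type, tensor.shape)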
I also got this kind of response.
System: You are a helpful assistant
> Hi
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
> Who are you
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
Inference on a Mac M1 Pro.
Model: bitnet-b1.58-2B-4T-gguf
Inference was run with this command from the docs:
python run_inference.py -m models/BitNet-b1.58-2B-4T/ggml-model-i2_s.gguf -p "You are a helpful assistant" -cnv
I think it is either a problem with the tokenizer, or a problem with not using the NL (nonlinear) quantization from the BitNet team. I also recommend trying without the -cnv flag, because the docs say it is meant for instruct models, while BitNet is a QAT model pre-trained in a single training run.
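For example, something like this, keeping the flags already used above but dropping -cnv (the prompt is then treated as plain text to complete rather than as a chat system prompt, and the prompt text here is just an example):

python run_inference.py -m models/BitNet-b1.58-2B-4T/ggml-model-i2_s.gguf -p "What kind of number is 956?"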
This issue only happened when I typed no prompt and pressed Enter multiple times to give the AI control. When I typed a prompt, the issue disappeared.
Oh, then I see: it is an error of the pipeline processing empty responses. It is only debuggable by adding a lot of debug output into the pipeline and the model itself. :(
That's right. If you can make a PR that fixes the error of the pipeline processing empty responses, that would be helpful.
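Not the actual pipeline code, but a minimal sketch of the kind of guard such a PR could add; chat_loop and generate are hypothetical names standing in for the real interactive loop and inference call:

def chat_loop(generate):
    # generate(prompt) -> str stands in for the real inference call.
    while True:
        try:
            user_input = input("> ").strip()
        except EOFError:
            break
        if not user_input:
            # Pressing Enter on an empty line previously handed an empty
            # prompt to the pipeline, which is what produced the repeated
            # single-character output; skip it instead of generating.
            continue
        print(generate(user_input))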