Zephyr
Thanks for the Llama-format checkpoint. However, the output is messy when I try the example in the README, and I want to know what went wrong. The output is as...
> What are your torch and huggingface versions? I cannot replicate this problem. In my case, the log is:
>
> ```python
> >>> import torch
> >>> from transformers import...
> ```
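To answer the version question above, here is a minimal sketch for reporting installed package versions; the `pkg_version` helper is a hypothetical name, and it simply returns `None` when a package is not installed:

```python
from importlib import metadata


def pkg_version(name):
    """Return the installed version string of a package, or None if absent."""
    try:
        return metadata.version(name)
    except metadata.PackageNotFoundError:
        return None


# Print the versions the reply asks about (None means not installed).
for pkg in ("torch", "transformers"):
    print(pkg, pkg_version(pkg))
```

Pasting the printed versions into the thread makes the replication attempt above easier to compare against.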
Has this problem been solved? I had the same issue.
This works well for me. Thanks a lot.