YangZyyyy
I'm encountering the same problem. The same error is reported when I run batch inference on the `wmt22` test set with a model fine-tuned from llama2.
Calling `model.bfloat16()` solves this problem for me.
Here's my previous code; when it ran, the following error was reported:

> RuntimeError: probability tensor contains either inf, nan or element < 0

```
from transformers import LlamaForCausalLM
model = ...
```
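For reference, here is a minimal sketch of the fix described above: cast the model to bfloat16 before sampling so the logits don't overflow into inf/nan, which is what triggers the "probability tensor contains either inf, nan or element < 0" error during `generate`. The checkpoint path and generation parameters below are placeholders, not from the original report.

```python
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer

# Placeholder path to the llama2-based fine-tuned checkpoint.
model_path = "path/to/llama2-finetuned"

tokenizer = LlamaTokenizer.from_pretrained(model_path)
model = LlamaForCausalLM.from_pretrained(model_path)

# Cast to bfloat16 before inference; bfloat16 has a wider exponent range
# than float16, so the sampling probabilities stay finite.
model = model.bfloat16().cuda().eval()

inputs = tokenizer(["Translate to German: Hello, world!"], return_tensors="pt").to("cuda")
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, top_p=0.9)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
```

Alternatively, passing `torch_dtype=torch.bfloat16` to `from_pretrained` should have the same effect while also saving memory at load time.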
Keep up the good work, hoping for MCP support soon!