ColossalAI
[BUG]: inference.py error
🐛 Describe the bug
I ran the example as documented and found that inference fails with a format error.
命令:python inference.py --model_path ./actor_checkpoint_prompts.pt --pretrain bigscience/bloom-560m --model bloom
Output:
Traceback (most recent call last):
File "inference.py", line 59, in
Environment
nvcc -V Cuda compilation tools, release 11.7, V11.7.99 Build cuda_11.7.r11.7/compiler.31442593_0
example:https://github.com/hpcaitech/ColossalAI/tree/main/applications/ChatGPT/examples
Try adding strict=False to this line.
The referenced line cannot take a strict=False argument; I checked the source and it indeed does not support it.
After changing line 23 to actor.model.load_state_dict(state_dict, strict=False), it still errors:
Traceback (most recent call last):
File "inference.py", line 60, in
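A common cause of this kind of mismatch is that the checkpoint keys carry a wrapper prefix (e.g. "model.") that the bare model's parameter names lack, so even strict=False cannot line them up. Below is a minimal sketch of remapping the keys before loading; the prefix name and the strip_prefix helper are assumptions for illustration, and plain values stand in for tensors:

```python
def strip_prefix(state_dict, prefix="model."):
    """Drop a leading prefix from checkpoint keys so they match the
    target module's parameter names (hypothetical helper, for illustration)."""
    return {
        (k[len(prefix):] if k.startswith(prefix) else k): v
        for k, v in state_dict.items()
    }

# Example: a checkpoint saved from a wrapper module carries a "model." prefix,
# while extra heads (e.g. a value head) have no prefix and pass through unchanged.
ckpt = {"model.embed.weight": 1, "model.lm_head.weight": 2, "v_head.weight": 3}
remapped = strip_prefix(ckpt)
print(sorted(remapped))  # → ['embed.weight', 'lm_head.weight', 'v_head.weight']
```

After remapping, load_state_dict(remapped, strict=False) (where the model class supports it) would at least report which keys are still missing or unexpected instead of failing outright.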
I guess we can merge the issue with #3061 and request @ht-zhou's help on this.