Traceback (most recent call last):
  File "/data3/push_recall/LLM-Tuning/chatglm2_lora_tuning.py", line 153, in <module>
    main()
  File "/data3/push_recall/LLM-Tuning/chatglm2_lora_tuning.py", line 93, in main
    model = AutoModel.from_pretrained(
  File "/data3/env/miniconda3/envs/baichuan/lib/python3.9/site-packages/transformers/models/auto/auto_factory.py", line 462, in from_pretrained
    return model_class.from_pretrained(
  File "/data3/env/miniconda3/envs/baichuan/lib/python3.9/site-packages/transformers/modeling_utils.py", line 2828, in from_pretrained
    dispatch_model(model, device_map=device_map, offload_dir=offload_folder, offload_index=offload_index)
  File "/data3/env/miniconda3/envs/baichuan/lib/python3.9/site-packages/accelerate/big_modeling.py", line 370, in dispatch_model
    attach_align_device_hook_on_blocks(
  File "/data3/env/miniconda3/envs/baichuan/lib/python3.9/site-packages/accelerate/hooks.py", line 478, in attach_align_device_hook_on_blocks
    add_hook_to_module(module, hook)
  File "/data3/env/miniconda3/envs/baichuan/lib/python3.9/site-packages/accelerate/hooks.py", line 155, in add_hook_to_module
    module = hook.init_hook(module)
  File "/data3/env/miniconda3/envs/baichuan/lib/python3.9/site-packages/accelerate/hooks.py", line 251, in init_hook
    set_module_tensor_to_device(module, name, self.execution_device)
  File "/data3/env/miniconda3/envs/baichuan/lib/python3.9/site-packages/accelerate/utils/modeling.py", line 140, in set_module_tensor_to_device
    raise ValueError(f"{tensor_name} is on the meta device, we need a value to put in on {device}.")
ValueError: weight is on the meta device, we need a value to put in on 0.
May I ask what is causing this? Thanks!
This looks like an issue with the accelerate package. You could look into it yourself first, e.g. https://discuss.huggingface.co/t/meta-device-error-while-instantiating-model/33402
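For what it's worth, a common workaround in that kind of thread is to avoid letting accelerate dispatch the model with a `device_map` (which can leave weights as meta-device placeholders), and instead load the full checkpoint onto a single GPU. A minimal sketch, assuming a ChatGLM2 checkpoint and a single-GPU setup (the model path and half-precision choice are illustrative, not taken from this repo):

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Illustrative path -- replace with your own ChatGLM2 checkpoint location.
MODEL_PATH = "THUDM/chatglm2-6b"

tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH, trust_remote_code=True)

# Load without device_map="auto" so accelerate does not create meta-device
# placeholders that it later fails to fill with real values.
model = AutoModel.from_pretrained(
    MODEL_PATH,
    trust_remote_code=True,
    torch_dtype=torch.float16,  # half precision so the 6B model fits on one GPU
)
model = model.cuda()  # explicitly place all materialized weights on GPU 0
```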
Also, do the versions of the packages you are using match mine? What device are you running on?
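To make that comparison easier, here is a small sketch (assuming the stack involved is transformers, accelerate, peft and torch) that prints the relevant versions and the visible GPUs:

```python
# Print versions of the packages most likely involved in this error,
# plus the local GPU setup, for comparison with the maintainer's environment.
import importlib.metadata as md
import torch

for pkg in ("transformers", "accelerate", "peft"):
    try:
        print(pkg, md.version(pkg))
    except md.PackageNotFoundError:
        print(pkg, "not installed")

print("torch", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
print("GPU count:", torch.cuda.device_count())
```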