Qwen
[BUG] Error when loading the 72B model, please help, urgent!!!!
是否已有关于该错误的issue或讨论? | Is there an existing issue / discussion for this?
- [X] 我已经搜索过已有的issues和讨论 | I have searched the existing issues / discussions
该问题是否在FAQ中有解答? | Is there an existing answer for this in FAQ?
- [X] 我已经搜索过FAQ | I have searched FAQ
当前行为 | Current Behavior
Folks, I'm on two A800-80G GPUs. Loading the Qwen-72B model with `model = AutoModelForCausalLM.from_pretrained("/home/sunyard/source/model/llm/Qwen_72B_Chat", device_map="Auto", trust_remote_code=True, fp16=True).eval()` raises an error.
期望行为 | Expected Behavior
No response
复现方法 | Steps To Reproduce
No response
运行环境 | Environment
- OS: CentOS
- Python: 3.8.1
- Transformers: 4.34.1
- PyTorch: 2.0.1+cu117
- CUDA (`python -c 'import torch; print(torch.version.cuda)'`): 11.7
- Accelerate: 0.25.0
备注 | Anything else?
No response
Running the code you provided immediately raises a ValueError:
If passing a string for `device_map`, please choose 'auto', 'balanced', 'balanced_low_0' or 'sequential'.
`Auto` needs to be lowercase. That said, your logs show the model finished loading, so that's probably not the real problem here.
Checking the source: line 3246 of transformers 4.34.1 does not match the line in your error report. https://github.com/huggingface/transformers/blob/v4.34.1/src/transformers/modeling_utils.py#L3246 Your actual transformers version is 4.33.2 ... https://github.com/huggingface/transformers/blob/v4.33.2/src/transformers/modeling_utils.py#L3246
I suspect the environment is in a broken state; I'd suggest creating a fresh conda environment and seeing whether the problem persists.
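As a minimal sketch of the case-sensitivity point above: the string values transformers accepts for `device_map` are exactly the four listed in the ValueError, compared case-sensitively. The helper below mirrors that check for illustration; `check_device_map` is my own name, not a transformers API.

```python
# The four string values transformers accepts for `device_map`
# (per the ValueError quoted above); the comparison is case-sensitive.
ALLOWED_DEVICE_MAPS = {"auto", "balanced", "balanced_low_0", "sequential"}

def check_device_map(device_map):
    """Pre-flight check for a string device_map before loading a model.

    Illustrative helper mirroring the check transformers performs;
    not part of the transformers API itself.
    """
    if isinstance(device_map, str) and device_map not in ALLOWED_DEVICE_MAPS:
        raise ValueError(
            "If passing a string for `device_map`, please choose "
            "'auto', 'balanced', 'balanced_low_0' or 'sequential'."
        )
    return device_map

check_device_map("auto")    # passes
# check_device_map("Auto")  # raises ValueError -- capitalization matters
```

So the original call should use `device_map="auto"` (lowercase); dict-based device maps are not affected by this string check.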
Got it, thanks! I'll give that a try.