
[BUG] `.to` is not supported for `4-bit` or `8-bit` bitsandbytes models. Please use the model as it is, since the model has already been set to the correct devices and casted to the correct `dtype`.

Open · JV-X opened this issue 1 month ago · 2 comments

Is there an existing issue / discussion for this?

  • [X] I have searched the existing issues / discussions

Is there an existing answer for this in the FAQ?

  • [X] I have searched the FAQ

Current Behavior

I am trying to run the MiniCPM-V-2_6-int4 model locally, but after downloading the model and executing the code I get the following error:

(MiniCPMo) hygx@hygx:~/code/MiniCPM-o$  cd /home/hygx/code/MiniCPM-o ; /usr/bin/env /home/hygx/anaconda3/envs/MiniCPMo/bin/python /home/hygx/.vscode-server/extensions/ms-python.debugpy-2024.14.0-linux-x64/bundled/libs/debugpy/adapter/../../debugpy/launcher 48288 -- /home/hygx/code/MiniCPM-o/chat.py 
torch_version: 2.2.0+cu121
Unused kwargs: ['_load_in_4bit', '_load_in_8bit', 'quant_method']. These kwargs are not used in <class 'transformers.utils.quantization_config.BitsAndBytesConfig'>.
`low_cpu_mem_usage` was None, now set to True since model is quantized.
Loading checkpoint shards: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 2/2 [00:05<00:00,  2.56s/it]
Traceback (most recent call last):
  File "/home/hygx/anaconda3/envs/MiniCPMo/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/home/hygx/anaconda3/envs/MiniCPMo/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/home/hygx/.vscode-server/extensions/ms-python.debugpy-2024.14.0-linux-x64/bundled/libs/debugpy/adapter/../../debugpy/launcher/../../debugpy/__main__.py", line 71, in <module>
    cli.main()
  File "/home/hygx/.vscode-server/extensions/ms-python.debugpy-2024.14.0-linux-x64/bundled/libs/debugpy/adapter/../../debugpy/launcher/../../debugpy/../debugpy/server/cli.py", line 501, in main
    run()
  File "/home/hygx/.vscode-server/extensions/ms-python.debugpy-2024.14.0-linux-x64/bundled/libs/debugpy/adapter/../../debugpy/launcher/../../debugpy/../debugpy/server/cli.py", line 351, in run_file
    runpy.run_path(target, run_name="__main__")
  File "/home/hygx/.vscode-server/extensions/ms-python.debugpy-2024.14.0-linux-x64/bundled/libs/debugpy/_vendored/pydevd/_pydevd_bundle/pydevd_runpy.py", line 310, in run_path
    return _run_module_code(code, init_globals, run_name, pkg_name=pkg_name, script_name=fname)
  File "/home/hygx/.vscode-server/extensions/ms-python.debugpy-2024.14.0-linux-x64/bundled/libs/debugpy/_vendored/pydevd/_pydevd_bundle/pydevd_runpy.py", line 127, in _run_module_code
    _run_code(code, mod_globals, init_globals, mod_name, mod_spec, pkg_name, script_name)
  File "/home/hygx/.vscode-server/extensions/ms-python.debugpy-2024.14.0-linux-x64/bundled/libs/debugpy/_vendored/pydevd/_pydevd_bundle/pydevd_runpy.py", line 118, in _run_code
    exec(code, run_globals)
  File "/home/hygx/code/MiniCPM-o/chat.py", line 280, in <module>
    chat_model = MiniCPMVChat(model_path)
  File "/home/hygx/code/MiniCPM-o/chat.py", line 267, in __init__
    self.model = MiniCPMV2_6(model_path, multi_gpus)
  File "/home/hygx/code/MiniCPM-o/chat.py", line 218, in __init__
    self.model = AutoModel.from_pretrained(model_path, trust_remote_code=True,
  File "/home/hygx/anaconda3/envs/MiniCPMo/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 559, in from_pretrained
    return model_class.from_pretrained(
  File "/home/hygx/anaconda3/envs/MiniCPMo/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4034, in from_pretrained
    dispatch_model(model, **device_map_kwargs)
  File "/home/hygx/anaconda3/envs/MiniCPMo/lib/python3.10/site-packages/accelerate/big_modeling.py", line 498, in dispatch_model
    model.to(device)
  File "/home/hygx/anaconda3/envs/MiniCPMo/lib/python3.10/site-packages/transformers/modeling_utils.py", line 2883, in to
    raise ValueError(
ValueError: `.to` is not supported for `4-bit` or `8-bit` bitsandbytes models. Please use the model as it is, since the model has already been set to the correct devices and casted to the correct `dtype`.
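For context, transformers raises this error deliberately: a bitsandbytes-quantized model carries flags such as `is_loaded_in_4bit` / `is_loaded_in_8bit`, and `.to(device)` is blocked because the quantized weights have already been placed on the correct device. Here, `accelerate`'s `dispatch_model` ends up calling `.to(...)` on the already-quantized model. A minimal sketch of the guard pattern that avoids such a crash (the `DummyQuantizedModel` class and `safe_to` helper are hypothetical, for illustration only; they are not part of transformers or the MiniCPM-o code):

```python
class DummyQuantizedModel:
    """Stand-in for a bitsandbytes 4-bit model: `.to` raises, as in transformers."""
    is_loaded_in_4bit = True

    def to(self, device):
        if self.is_loaded_in_4bit:
            raise ValueError("`.to` is not supported for `4-bit` bitsandbytes models.")
        return self

def safe_to(model, device):
    # Only move the model when it is not bitsandbytes-quantized;
    # quantized weights are already dispatched to the right device.
    if getattr(model, "is_loaded_in_4bit", False) or getattr(model, "is_loaded_in_8bit", False):
        return model
    return model.to(device)

model = DummyQuantizedModel()
result = safe_to(model, "cuda")  # returns the model unchanged, no exception
```

In practice this means a caller loading an int4 checkpoint via `from_pretrained` should not follow it with a manual `.to(device)` or a device map that forces a move.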

Expected Behavior

Being able to chat with the model normally.

Steps To Reproduce

1. Download the model with `modelscope download --model OpenBMB/MiniCPM-V-2_6-int4`.
2. In `chat.py`, set `model_path = '/home/hygx/.cache/modelscope/hub/OpenBMB/MiniCPM-V-2_6-int4'`.
3. Run `chat.py`.

Environment

- OS: WSL2 on Windows 11
- Python: 3.10
- Transformers: 4.44.2
- PyTorch: 2.2.0
- CUDA (`python -c 'import torch; print(torch.version.cuda)'`): 12.1

Anything else?

No response

JV-X · Jan 15 '25 06:01