MiniCPM-V
Expected all tensors to be on the same device, but found at least two devices
The error occurs when the model is loaded with device_map='auto':
File "MiniCPM-Llama3-V-2_5/modeling_minicpmv.py", line 416, in chat res, vision_hidden_states = self.generate( File "MiniCPM-Llama3-V-2_5/modeling_minicpmv.py", line 326, in generate ) = self.get_vllm_embedding(model_inputs) File "MiniCPM-Llama3-V-2_5/modeling_minicpmv.py", line 148, in get_vllm_embedding cur_vllm_emb.scatter_(0, image_indices.view(-1, 1).repeat(1, cur_vllm_emb.shape[-1]), File "torch/utils/_device.py", line 78, in torch_function return func(*args, **kwargs) RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cuda:1! (when checking argument for argument src in method wrapper_CUDA_scatter__src)
I came across the same issue.
This issue does not provide a reproducible context and requires more information to help resolve it. If you still need assistance, please provide your code environment and running code to help us reproduce the issue.
Hi, could you take a look at the problem in this issue? https://github.com/OpenBMB/MiniCPM-V/issues/260
It is the same problem as described there. After LoRA fine-tuning, model inference fails. The model loading code is:
from peft import AutoPeftModelForCausalLM

model = AutoPeftModelForCausalLM.from_pretrained(
    path_to_adapter,          # path to the saved LoRA adapter
    device_map="auto",
    trust_remote_code=True,
).eval()
(1) It reports the error: File "/*******/entry/minicpm-eval-lora.py", line 57, in
(2) If device_map='cuda' is used instead, loading the model fails with an out-of-memory (OOM) error.
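As a possible workaround (a sketch under my assumptions, not a fix confirmed by the maintainers): load the base model in half precision on a single GPU, then attach the LoRA adapter with peft's PeftModel. This avoids both the multi-GPU split that device_map='auto' introduces and the full-precision footprint that triggers the OOM with device_map='cuda'. The names path_to_base and path_to_adapter are placeholders.

import torch
from transformers import AutoModel
from peft import PeftModel

# Placeholders: point these at the base checkpoint and the saved LoRA adapter.
path_to_base = "openbmb/MiniCPM-Llama3-V-2_5"
path_to_adapter = "output/minicpm-lora-adapter"

# Load the base model in fp16 on a single GPU: keeping everything on one device
# avoids the cross-device scatter_, and fp16 roughly halves memory versus fp32.
model = AutoModel.from_pretrained(
    path_to_base,
    torch_dtype=torch.float16,
    trust_remote_code=True,
).to("cuda:0")

# Attach the LoRA weights on top of the base model.
model = PeftModel.from_pretrained(model, path_to_adapter).eval()

If a single GPU still cannot hold the fp16 weights, the remaining options are along the lines of a manual device_map that keeps the vision modules and the LLM embeddings on the same device, or quantized loading; both depend on your hardware and are not shown here.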