
After installing per the README, inference fails with NotImplementedError: Cannot copy out of meta tensor; no data!

Open AnITgo opened this issue 1 year ago • 6 comments

AnITgo avatar May 24 '23 09:05 AnITgo


Traceback (most recent call last):
  File "/root/LaWGPT/utils/callbacks.py", line 47, in gentask
    ret = self.mfunc(callback=_callback, **self.kwargs)
  File "/root/LaWGPT/webui.py", line 140, in generate_with_callback
    model.generate(**kwargs)
  File "/root/anaconda3/envs/lawgpt/lib/python3.10/site-packages/peft/peft_model.py", line 627, in generate
    outputs = self.base_model.generate(**kwargs)
  File "/root/anaconda3/envs/lawgpt/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/root/anaconda3/envs/lawgpt/lib/python3.10/site-packages/transformers/generation/utils.py", line 1518, in generate
    return self.greedy_search(
  File "/root/anaconda3/envs/lawgpt/lib/python3.10/site-packages/transformers/generation/utils.py", line 2335, in greedy_search
    outputs = self(
  File "/root/anaconda3/envs/lawgpt/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/root/anaconda3/envs/lawgpt/lib/python3.10/site-packages/accelerate/hooks.py", line 165, in new_forward
    output = old_forward(*args, **kwargs)
  File "/root/anaconda3/envs/lawgpt/lib/python3.10/site-packages/transformers/models/llama/modeling_llama.py", line 687, in forward
    outputs = self.model(
  File "/root/anaconda3/envs/lawgpt/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/root/anaconda3/envs/lawgpt/lib/python3.10/site-packages/transformers/models/llama/modeling_llama.py", line 577, in forward
    layer_outputs = decoder_layer(
  File "/root/anaconda3/envs/lawgpt/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/root/anaconda3/envs/lawgpt/lib/python3.10/site-packages/accelerate/hooks.py", line 165, in new_forward
    output = old_forward(*args, **kwargs)
  File "/root/anaconda3/envs/lawgpt/lib/python3.10/site-packages/transformers/models/llama/modeling_llama.py", line 292, in forward
    hidden_states, self_attn_weights, present_key_value = self.self_attn(
  File "/root/anaconda3/envs/lawgpt/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/root/anaconda3/envs/lawgpt/lib/python3.10/site-packages/accelerate/hooks.py", line 165, in new_forward
    output = old_forward(*args, **kwargs)
  File "/root/anaconda3/envs/lawgpt/lib/python3.10/site-packages/transformers/models/llama/modeling_llama.py", line 194, in forward
    query_states = self.q_proj(hidden_states).view(bsz, q_len, self.num_heads, self.head_dim).transpose(1, 2)
  File "/root/anaconda3/envs/lawgpt/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/root/anaconda3/envs/lawgpt/lib/python3.10/site-packages/accelerate/hooks.py", line 160, in new_forward
    args, kwargs = module._hf_hook.pre_forward(module, *args, **kwargs)
  File "/root/anaconda3/envs/lawgpt/lib/python3.10/site-packages/accelerate/hooks.py", line 280, in pre_forward
    set_module_tensor_to_device(module, name, self.execution_device, value=self.weights_map[name])
  File "/root/anaconda3/envs/lawgpt/lib/python3.10/site-packages/accelerate/utils/modeling.py", line 180, in set_module_tensor_to_device
    module = module.cuda(device_index)
  File "/root/anaconda3/envs/lawgpt/lib/python3.10/site-packages/torch/nn/modules/module.py", line 905, in cuda
    return self._apply(lambda t: t.cuda(device))
  File "/root/anaconda3/envs/lawgpt/lib/python3.10/site-packages/torch/nn/modules/module.py", line 797, in _apply
    module._apply(fn)
  File "/root/anaconda3/envs/lawgpt/lib/python3.10/site-packages/torch/nn/modules/module.py", line 820, in _apply
    param_applied = fn(param)
  File "/root/anaconda3/envs/lawgpt/lib/python3.10/site-packages/torch/nn/modules/module.py", line 905, in <lambda>
    return self._apply(lambda t: t.cuda(device))
NotImplementedError: Cannot copy out of meta tensor; no data!
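
For context, a "meta" tensor carries only metadata (shape and dtype) with no underlying storage; accelerate parks weights on the meta device to defer their actual load, and the error fires when a hook tries to move a module that is still on meta onto the GPU. A minimal reproduction of the failing operation (independent of LaWGPT):

```python
import torch

# A meta tensor has shape/dtype but no data, so it cannot be copied
# to a real device. This is the same operation the traceback hits in
# set_module_tensor_to_device -> module.cuda(...).
t = torch.empty(2, 2, device="meta")
print(t.is_meta)  # True: no storage behind this tensor

try:
    t.to("cpu")   # attempting to materialize it raises the same error
except NotImplementedError as e:
    print(type(e).__name__)  # NotImplementedError, as in the report
```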


Same issue here, on WSL2.

shenyinzhe avatar May 24 '23 16:05 shenyinzhe

Same problem; I also hit it when running on Colab: my colab

JamieLee0510 avatar May 28 '23 13:05 JamieLee0510

You can refer to this for a fix: https://github.com/tloen/alpaca-lora/issues/368#issuecomment-1556214618
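
A minimal sketch of that kind of fix, for anyone who does not want to follow the link: the workaround commonly reported for this error is to pin the whole model to one device with an explicit device_map instead of "auto", so accelerate never leaves modules on the meta device. The function name and model paths below are placeholders, and this mirrors the commonly reported workaround rather than the exact patch in the linked comment:

```python
def load_pinned(base_path: str, lora_path: str, gpu: int = 0):
    """Load a LLaMA base model plus LoRA adapter pinned to a single GPU.

    Passing an explicit device_map ({"": gpu}) instead of "auto" keeps
    accelerate from offloading/sharding weights and leaving some modules
    on the "meta" device with no data to copy. Paths are placeholders.
    """
    import torch
    from peft import PeftModel
    from transformers import LlamaForCausalLM

    base = LlamaForCausalLM.from_pretrained(
        base_path,
        torch_dtype=torch.float16,
        device_map={"": gpu},  # pin every module to one real device
    )
    return PeftModel.from_pretrained(base, lora_path, device_map={"": gpu})
```

This assumes the model fits on a single GPU; if it does not, the other route people report is upgrading/downgrading peft and transformers to a compatible pair.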

prettybot avatar May 30 '23 10:05 prettybot

tloen/alpaca-lora#368 (comment) You can refer to this for a fix

Thanks, that solved it.

shenyinzhe avatar May 30 '23 12:05 shenyinzhe

Same problem; I also hit it when running on Colab: my colab

Thanks for your script; I got it running by following it.

IterNobody avatar Jun 07 '23 11:06 IterNobody