
AttributeError: 'tuple' object has no attribute 'grad_fn'

Open · davidray222 opened this issue 9 months ago · 1 comment

Hello, thank you for providing such a useful method, but I encountered some problems while pruning llama-7b.

Environment: Python 3.10, torch 2.6.0, transformers 4.49.0, accelerate 1.5.2

Command: python prune_llm.py --model huggyllama/llama-7b --pruning_ratio 0.5 --save_model "/mnt/8tb_raid/david_model/Torch-Pruning/examples/LLMs/out/"

[Build Error Details]

Traceback (most recent call last):
  File "/mnt/8tb_raid/david_model/Torch-Pruning/examples/LLMs/prune_llm.py", line 408, in <module>
    main()
  File "/mnt/8tb_raid/david_model/Torch-Pruning/examples/LLMs/prune_llm.py", line 332, in main
    pruner = tp.pruner.MetaPruner(
  File "/mnt/8tb_raid/david_model/Torch-Pruning/torch_pruning/pruner/algorithms/metapruner.py", line 134, in __init__
    self.DG = dependency.DependencyGraph().build_dependency(
  File "/mnt/8tb_raid/david_model/Torch-Pruning/torch_pruning/dependency.py", line 386, in build_dependency
    self.module2node = self._trace(
  File "/mnt/8tb_raid/david_model/Torch-Pruning/torch_pruning/dependency.py", line 799, in _trace
    module2node, o.grad_fn, gradfn2module, reused, visited=visited)
AttributeError: 'tuple' object has no attribute 'grad_fn'

Do you know how to solve this? Could it be an issue with the environment versions? Thank you!

davidray222 · Mar 18 '25

Add model.config.use_cache = False before building the pruner. The same problem was reported here and here.
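A minimal sketch of where that flag goes, assuming a setup similar to examples/LLMs/prune_llm.py: with use_cache enabled, the model's forward pass also returns past_key_values tuples, which is likely what the dependency-graph tracer trips over when it calls .grad_fn on each output. The importance criterion, the toy example_inputs, and the exact MetaPruner arguments below are placeholders, not the real script's configuration.

import torch
import torch_pruning as tp
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("huggyllama/llama-7b", torch_dtype=torch.float16)
tokenizer = AutoTokenizer.from_pretrained("huggyllama/llama-7b")

# The suggested fix: disable the KV cache so the traced forward pass
# returns tensors rather than past_key_values tuples.
model.config.use_cache = False

example_inputs = tokenizer("Hello", return_tensors="pt").input_ids

pruner = tp.pruner.MetaPruner(
    model,
    example_inputs,
    importance=tp.importance.MagnitudeImportance(),  # placeholder criterion
    pruning_ratio=0.5,
)

The actual prune_llm.py passes additional arguments (ignored layers, head pruning options, etc.); the point here is only that use_cache must be turned off before MetaPruner builds the dependency graph.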

zrrraa · Mar 19 '25