Diff-Pruning
LDM pruning bug issue
Hi, thank you for your awesome pruning work. I've been having a hard time fixing an issue.
I followed the sample code you offered in #6. After pruning, `pruned_macs, pruned_params = tp.utils.count_ops_and_params(model.model.diffusion_model, example_inputs)` raises an error. I found that the pruning, executed by `pruner.step()`, skips all layers inside the `BasicTransformerBlock` instances, which then causes an error in the feed-forward pass.
Please let me know how I can fix this.
I found a solution. For some reason, the checkpoint wrapper in `BasicTransformerBlock` causes the problem described above. By removing the wrapper, I solved the problem.
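For anyone hitting the same issue, a minimal sketch of the workaround. It assumes the LDM-style `BasicTransformerBlock`, which gates gradient checkpointing behind a boolean `checkpoint` attribute; the toy block and the helper `disable_checkpointing` below are illustrative stand-ins, not code from this repo. The idea is to turn off the checkpoint wrapper before calling `pruner.step()` so the forward trace actually sees the block's layers:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for LDM's BasicTransformerBlock: the real one in
# ldm/modules/attention.py also gates gradient checkpointing with a
# boolean `checkpoint` attribute.
class BasicTransformerBlock(nn.Module):
    def __init__(self):
        super().__init__()
        self.proj = nn.Linear(8, 8)
        self.checkpoint = True  # checkpointing on by default

    def forward(self, x):
        if self.checkpoint:
            # The checkpointed path hides the inner ops from
            # tracing-based tools such as the pruner's dependency graph.
            return torch.utils.checkpoint.checkpoint(self.proj, x)
        return self.proj(x)

def disable_checkpointing(root: nn.Module) -> int:
    """Set checkpoint=False on every BasicTransformerBlock found under
    `root`, so pruning and op counting trace through the block."""
    count = 0
    for m in root.modules():
        if type(m).__name__ == "BasicTransformerBlock" and hasattr(m, "checkpoint"):
            m.checkpoint = False
            count += 1
    return count

unet = nn.Sequential(BasicTransformerBlock(), BasicTransformerBlock())
print(disable_checkpointing(unet))  # 2
```

In practice you would call something like `disable_checkpointing(model.model.diffusion_model)` before `pruner.step()`; flipping the attribute changes only how the forward pass is executed, not the weights, so it is safe to re-enable checkpointing afterwards for fine-tuning.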