dnw
pruning
from torch import autograd


class ChooseTopEdges(autograd.Function):
    """Chooses the top edges for the forward pass but allows gradient
    flow to all edges in the backward pass."""

    @staticmethod
    def forward(ctx, weight, prune_rate):
        output = weight.clone()
        # Indices of the entries sorted by ascending magnitude.
        _, idx = weight.flatten().abs().sort()
        p = int(prune_rate * weight.numel())
        # flatten() on a contiguous tensor returns a view, so this
        # zeroes the p smallest-magnitude entries of `output` in place.
        flat_oup = output.flatten()
        flat_oup[idx[:p]] = 0
        return output

    @staticmethod
    def backward(ctx, grad_output):
        # Straight-through: pass the gradient to every edge; no gradient
        # is needed for prune_rate.
        return grad_output, None, None
Excuse me: although flat_oup is pruned by setting its smallest elements to zero, forward returns output, not flat_oup. Does this actually perform the pruning? @dirkgr @schmmd @iellenberger @danyaljj @MLatzke
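For what it's worth, `torch.Tensor.flatten` returns a view when the tensor is contiguous (as a fresh `clone()` is), so writing into `flat_oup` does modify `output`. A minimal sketch of the same view semantics, using NumPy's `ravel` as a stand-in for the analogous PyTorch behavior (the values and prune rate here are made-up illustrations, not from the repo):

```python
import numpy as np

# Stand-in for the weight tensor; clone() corresponds to copy() here.
weight = np.array([[0.5, -0.1, 2.0], [0.05, -3.0, 0.2]])
output = weight.copy()

# ravel() on a contiguous array returns a view, like torch's flatten().
flat_oup = output.ravel()
idx = np.argsort(np.abs(weight).ravel())  # ascending magnitude
p = int(0.5 * weight.size)                # prune_rate = 0.5 -> p = 3

flat_oup[idx[:p]] = 0  # writes through the view into `output`

print(output)
# -> [[ 0.5  0.   2. ]
#     [ 0.  -3.   0. ]]
```

The three smallest-magnitude entries end up zeroed in `output` itself, even though only the flattened view was assigned to, which is why returning `output` in `forward` is enough.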