pytorch-pooling
Some confusion about SoftPool
e_x = torch.sum(torch.exp(x), dim=1, keepdim=True)
return F.avg_pool2d(x.mul(e_x), kernel_size, stride=stride).mul_(sum(kernel_size)).div_(
    F.avg_pool2d(e_x, kernel_size, stride=stride).mul_(sum(kernel_size)))
Dear author, thanks for your summary of pooling techniques.
I have some confusion about the SoftPool implementation in your repo.
Why do you sum across the channel dimension (dim=1) when computing e_x?
And why do you call .mul_(sum(kernel_size)) on the pooled numerator and then .div_() by a denominator that is also multiplied by sum(kernel_size)? Since the same factor appears on both sides of the division, it looks like it cancels. Can these two operations be removed?
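For reference, here is a rough sketch of how I would implement it directly from the SoftPool formula as I understand it (the name naive_soft_pool2d is my own, not from your repo; it keeps the exponential weights per channel instead of summing over dim=1, and drops the sum(kernel_size) factors because they seem to cancel in the ratio):

import torch
import torch.nn.functional as F

def naive_soft_pool2d(x, kernel_size=2, stride=None):
    # x: (N, C, H, W); exponential weights are kept per element and per channel
    stride = stride if stride is not None else kernel_size
    e_x = torch.exp(x)                                         # softmax-style weights
    num = F.avg_pool2d(x * e_x, kernel_size, stride=stride)    # window mean of exp(x_i) * x_i
    den = F.avg_pool2d(e_x, kernel_size, stride=stride)        # window mean of exp(x_i)
    return num / den                                           # the two 1/(k*k) averaging factors cancel

Am I missing something, or would this behave the same as your snippet apart from the channel-wise summation?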
Thanks in advance for your reply.