
Some confusion about softpooling

Open c-yn opened this issue 2 years ago • 0 comments

```python
e_x = torch.sum(torch.exp(x), dim=1, keepdim=True)
return F.avg_pool2d(x.mul(e_x), kernel_size, stride=stride).mul_(sum(kernel_size)).div_(
    F.avg_pool2d(e_x, kernel_size, stride=stride).mul_(sum(kernel_size)))
```
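For comparison, the SoftPool paper (Stergiou et al.) weights each activation by its own exponential within the pooling window, pooling every channel independently with no summation across channels; the constant window-area factors then cancel between numerator and denominator. A minimal sketch under that reading, with `soft_pool2d` an illustrative name, not the repo's function:

```python
import torch
import torch.nn.functional as F

def soft_pool2d(x, kernel_size=2, stride=None):
    """SoftPool sketch: sum(e^a * a) / sum(e^a) over each pooling window,
    computed independently for every channel (no cross-channel sum)."""
    e_x = torch.exp(x)  # per-activation weights, same shape as x
    # Both avg_pool2d calls divide by the same window area, so that factor
    # cancels in the ratio, leaving sum(x * e_x) / sum(e_x) per window.
    num = F.avg_pool2d(x * e_x, kernel_size, stride=stride)
    den = F.avg_pool2d(e_x, kernel_size, stride=stride)
    return num / den
```

Because the same averaging factor appears in numerator and denominator, no explicit rescaling by the kernel size is needed in this formulation.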

Dear author, thanks for your summary of pooling techniques.

I have some questions about the softpooling implementation in your repo.

Why do you conduct summation across the channel dimension?

And why do you apply `.mul_(sum(kernel_size))` to the numerator and then divide by a denominator that is also scaled with `.mul_(sum(kernel_size))`? Can these two operations be removed?

Thanks for your reply in advance.

c-yn · Nov 23 '22