
Softmax activation

LDOUBLEV opened this issue Dec 17 '18 · 2 comments

Have you tested Softmax instead of scaling by 1/N?

LDOUBLEV · Dec 17 '18

Do you mean the dot product version? I have not tested it yet.

The paper applies 1/N directly to normalize f in order to simplify gradient computation, and its experiments show that the different versions perform similarly.

But I think normalizing with softmax may work better than with 1/N. If you are interested, you could run this experiment and share the result with me, thanks!
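For reference, here is a minimal sketch of the two normalization choices in a dot-product-style non-local block (illustrative only; the shapes and names are hypothetical, not this repo's code):

```python
import torch
import torch.nn.functional as F

# Hypothetical sizes: N spatial positions, C embedding channels.
N, C = 64, 32
theta = torch.randn(1, N, C)   # embedded query, shape (batch, N, C)
phi = torch.randn(1, C, N)     # embedded key,   shape (batch, C, N)

f = torch.matmul(theta, phi)   # (1, N, N) pairwise similarities

# Dot-product version from the paper: scale by the number of positions.
attn_scaled = f / N

# Softmax alternative (as in the embedded Gaussian version):
# each row becomes a proper distribution that sums to 1.
attn_softmax = F.softmax(f, dim=-1)

print(attn_softmax.sum(dim=-1)[0, :3])  # ~1.0 per row
```

Both attention maps would then weight g(x) in the same way; the only difference is how f is normalized.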

AlexHex7 · Dec 19 '18

@LDOUBLEV So have you tested Softmax instead of 1/N for the dot-product and concatenation versions?

JiayongO-O · May 23 '19