Add GeM layer
Closes https://github.com/keras-team/keras/issues/16702
The 4th page of the paper states:

> The pooling parameter p_k can be manually set or learned since this operation is differentiable and can be part of the back-propagation.

I think we can make the `power` a learnable parameter instead of treating it as a hyperparameter; the experiments in the paper itself demonstrate this works.
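For concreteness, here is a minimal sketch of the learnable variant, assuming channels-last 4D inputs; the names `GeMPooling2D`, `initial_power`, and `epsilon` are illustrative, not the PR's actual API:

```python
import tensorflow as tf


class GeMPooling2D(tf.keras.layers.Layer):
    """Generalized-mean pooling: (mean(x**p))**(1/p) over spatial dims."""

    def __init__(self, initial_power=3.0, epsilon=1e-6, **kwargs):
        super().__init__(**kwargs)
        self.initial_power = initial_power
        self.epsilon = epsilon

    def build(self, input_shape):
        # Scalar trainable weight, so `p` is learned by back-propagation.
        self.power = self.add_weight(
            name="power",
            shape=(),
            initializer=tf.keras.initializers.Constant(self.initial_power),
            trainable=True,
        )

    def call(self, inputs):
        # Clamp so x**p stays well-defined for non-positive activations.
        x = tf.maximum(inputs, self.epsilon)
        # Pool over the spatial axes (H, W), keeping the channel axis.
        pooled = tf.reduce_mean(tf.pow(x, self.power), axis=[1, 2])
        return tf.pow(pooled, 1.0 / self.power)
```

With `p = 1` this reduces to average pooling and as `p -> inf` it approaches max pooling, which is why letting the optimizer pick `p` is appealing.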
@old-school-kid I see your point. We can set it manually or make it a learnable parameter, but I think we need to pick one option here. I mostly followed the tf/models and tf/similarity approaches, and both pass the power manually.
IMO it seems fine to keep `power` as a static constructor argument.
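For comparison, a minimal sketch of the static variant under the same assumptions (same hypothetical names as above), where `power` is just a plain constructor argument:

```python
import tensorflow as tf


class GeMPooling2D(tf.keras.layers.Layer):
    """Static variant: `power` is an ordinary constructor argument."""

    def __init__(self, power=3.0, epsilon=1e-6, **kwargs):
        super().__init__(**kwargs)
        self.power = power
        self.epsilon = epsilon

    def call(self, inputs):
        x = tf.maximum(inputs, self.epsilon)
        pooled = tf.reduce_mean(tf.pow(x, self.power), axis=[1, 2])
        return tf.pow(pooled, 1.0 / self.power)

    def get_config(self):
        # A plain float serializes cleanly; one practical advantage of
        # the static approach over a trainable weight.
        config = super().get_config()
        config.update({"power": self.power, "epsilon": self.epsilon})
        return config
```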
Hi @qlzh727 / @fchollet, any update on this PR, please? Thank you!