
More robust testing for AlphaDropout

Open: ToucheSir opened this issue 3 years ago • 6 comments

We currently borrow from https://github.com/pytorch/pytorch/blob/v1.10.0/test/cpp/api/modules.cpp#L1337-L1338, but some sort of goodness-of-fit test would be more robust than a simple range check. Sampling the layer outputs should have a negligible impact on test runtime.
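A rough sketch of what that could look like; the sample size, significance threshold, and the choice of HypothesisTests.jl are illustrative assumptions, not settled decisions:

```julia
using Flux, HypothesisTests

# Illustrative sketch: instead of asserting a hard output range, sample the
# layer outputs and run a statistical test on them. Here, a one-sample t-test
# for H0: mean == 0, which alpha dropout is designed to preserve.
layer = AlphaDropout(0.5)
Flux.trainmode!(layer)  # keep dropout active outside of a training loop

y = vec(layer(randn(Float32, 100, 1_000)))

# A very small p-value would suggest the output mean has drifted from 0.
# In CI the RNG would need to be seeded, or this becomes a flaky test.
@assert pvalue(OneSampleTTest(y, 0.0)) > 1e-3
```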

ToucheSir, Jan 27 '22

I want to work on this issue; could you please provide me with a bit more info?

aritropc, Feb 07 '22

The best place to start would be the paper itself, https://arxiv.org/abs/1706.02515. The section "New Dropout Technique" talks about Alpha dropout and some of its unique properties. Testing for those would be a great way to ensure we're following the spec.
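For reference, paraphrasing the paper's construction (with keep probability $q$, i.e. dropout rate $1 - q$): dropped activations are set to $\alpha'$ and the result is affinely rescaled,

$$\alpha' = -\lambda\alpha \approx -1.7581, \qquad a = \bigl(q + \alpha'^2\, q\,(1-q)\bigr)^{-1/2}, \qquad b = -a\,(1-q)\,\alpha',$$

with the layer computing $a\,(x d + \alpha'(1 - d)) + b$ for a dropout mask $d$. This maps zero-mean, unit-variance inputs back to zero mean and unit variance, and those two moments are exactly what a test can check.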

ToucheSir, Feb 07 '22

Well, that's a teeny-tiny paper, lemme read the required section and I'll get back to the tests :)

aritropc, Feb 07 '22

I would also love to work on this issue and have read the required parts. Could I get some more references for this?

arcAman07, Feb 19 '22

I have highlighted the appropriate sections in the text below. We want to sample many x values and test whether the mean and variance after dropout match the paper.

[screenshot: highlighted alpha dropout section of the paper]
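A minimal sketch of that check, assuming unit-Gaussian inputs; the sample sizes and tolerances are placeholders, not vetted values:

```julia
using Flux, Statistics, Test

# Sketch of the property test: for standard-normal inputs, AlphaDropout's
# affine correction should give outputs with roughly zero mean and unit
# variance at any dropout probability p. Tolerances here are guesses.
@testset "AlphaDropout preserves mean and variance" begin
    for p in (0.2, 0.5, 0.8)
        layer = AlphaDropout(p)
        Flux.trainmode!(layer)  # keep dropout active outside of training
        y = layer(randn(Float32, 1_000, 1_000))
        @test mean(y) ≈ 0 atol = 0.05
        @test var(y) ≈ 1 atol = 0.05
    end
end
```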

darsnack, Feb 19 '22

Cool, I'll start working on this as well, as soon as I'm done with the contrastive loss function.

arcAman07, Feb 19 '22