Flux.jl
More robust testing for AlphaDropout
We currently borrow from https://github.com/pytorch/pytorch/blob/v1.10.0/test/cpp/api/modules.cpp#L1337-L1338, but some sort of goodness-of-fit test would be more robust than a simple range check. Sampling the layer outputs should have a negligible impact on test runtime.
I want to work on this issue; could you please provide a bit more info?
The best place to start would be the paper itself, https://arxiv.org/abs/1706.02515. The section "New Dropout Technique" talks about Alpha dropout and some of its unique properties. Testing for those would be a great way to ensure we're following the spec.
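One of the paper's key properties is that alpha dropout preserves zero mean and unit variance: dropped units are set to α' = −λα, and an affine map a·x + b is then applied so the moments of a standardized input are restored exactly. The sketch below (in Python for illustration; Flux itself is Julia, and the constant and function names here are my own, not Flux's API) checks that property in closed form, assuming the formulation from the paper:

```python
# SELU constants from Klambauer et al. (2017); alpha' = -lambda * alpha
LAM = 1.0507009873554805
ALPHA = 1.6732632423543772
ALPHA_P = -LAM * ALPHA  # value dropped units are set to, ~ -1.7581

def affine_params(p):
    """For drop probability p (keep probability q = 1 - p), return the
    (a, b) of the affine map a*x + b that restores zero mean and unit
    variance after dropping, per the alpha dropout paper."""
    q = 1.0 - p
    a = (q + ALPHA_P ** 2 * q * (1.0 - q)) ** -0.5
    b = -a * (1.0 - q) * ALPHA_P
    return a, b

# For zero-mean, unit-variance input, the dropped (pre-affine) signal has
# mean alpha' * (1 - q) and variance q + alpha'^2 * q * (1 - q).
for p in (0.1, 0.25, 0.5):
    q = 1.0 - p
    a, b = affine_params(p)
    pre_mean = ALPHA_P * (1.0 - q)
    pre_var = q + ALPHA_P ** 2 * q * (1.0 - q)
    assert abs(a * pre_mean + b) < 1e-12       # output mean is exactly 0
    assert abs(a * a * pre_var - 1.0) < 1e-12  # output variance is exactly 1
```

These closed-form identities are what a robust test would check empirically on the real layer's outputs.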
Well, that's a teeny-tiny paper, lemme read the required section and I'll get back to the tests :)
I would also love to work on this issue and have read the required parts. Could I get some more references for this?
I have highlighted the appropriate sections in the text below. We want to sample many x values and test whether the mean and variance after dropout match the paper.
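A sampling test along those lines could look like the sketch below (Python for illustration rather than Julia; the `alpha_dropout` here is a stand-in reimplementation of the paper's formulation, not Flux's layer). The point is the tolerances: they scale with the standard error ~1/√n, so the check tightens as the sample grows instead of relying on a fixed hard-coded range:

```python
import math
import random

# SELU constants; alpha' = -lambda * alpha
LAM = 1.0507009873554805
ALPHA = 1.6732632423543772
ALPHA_P = -LAM * ALPHA

def alpha_dropout(xs, p, rng):
    """Stand-in alpha dropout: keep each x with probability q = 1 - p,
    replace dropped entries with alpha', then apply the affine map."""
    q = 1.0 - p
    a = (q + ALPHA_P ** 2 * q * (1.0 - q)) ** -0.5
    b = -a * (1.0 - q) * ALPHA_P
    return [a * (x if rng.random() < q else ALPHA_P) + b for x in xs]

rng = random.Random(0)
n = 200_000
xs = [rng.gauss(0.0, 1.0) for _ in range(n)]  # standardized input
ys = alpha_dropout(xs, 0.25, rng)

mean = sum(ys) / n
var = sum((y - mean) ** 2 for y in ys) / n

# Several standard errors of slack: loose enough to be flake-free,
# tight enough to catch a wrong alpha' or a missing affine transform.
assert abs(mean) < 6.0 / math.sqrt(n)
assert abs(var - 1.0) < 12.0 / math.sqrt(n)
```

In the actual test suite, `ys` would instead come from calling Flux's `AlphaDropout` layer in training mode on the sampled input.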
Cool, I'll start working on it as soon as I'm done with the contrastive loss function.