Flux.jl
`Nil` doesn't understand SpecialFunctions functions
This breaks `outputsize` whenever a model uses any function from that package. For example, https://github.com/FluxML/NNlib.jl/pull/629#issuecomment-2640970794. The easiest fix is to copy https://github.com/FluxML/Flux.jl/blob/009d9841960ac15d9a02499ac6e341e777dedf34/src/outputsize.jl#L25-L30 for one or more functions in SpecialFunctions, but I'm wondering if there's a better way.
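For concreteness, a minimal sketch of that easiest fix, assuming it sits inside the module that defines `Nil` and `nil`, with an illustrative (not exhaustive) choice of SpecialFunctions names:

```julia
using SpecialFunctions

# Mirror the linked outputsize.jl pattern: make a hand-picked set of unary
# SpecialFunctions functions short-circuit to `nil` instead of computing.
for f in (:erf, :erfc, :gamma, :loggamma, :digamma)
    @eval SpecialFunctions.$f(::Nil) = nil
end
```

This scales poorly, though: every newly used function needs another hand-written method.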
For the immediate problem of `gelu`, Flux could simply add `Nil` methods for all NNlib activation functions.
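A rough sketch of that, assuming it's acceptable to rely on `NNlib.ACTIVATIONS` (the symbol list NNlib itself loops over internally, not formally documented as public API):

```julia
using NNlib

# Make every listed NNlib activation return `nil` during outputsize tracing,
# so e.g. `gelu` never reaches `erf` on a `Nil` input.
for f in NNlib.ACTIVATIONS
    @eval NNlib.$f(::Nil) = nil
end
```

Note this only covers the one-argument methods; activations that take extra parameters (e.g. `leakyrelu`'s slope) would keep their existing paths for multi-argument calls.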
I don't remember quite why we decided against `missing` for this, maybe because it doesn't subtype `Real`? But it does obey `erf(missing) === missing`.
I don't remember either, but the `Number`-subtype explanation makes sense: `Missing` subtypes neither `Number` nor `Real`, so unlike `Nil` it can't flow through code constrained to numeric types.
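For reference, a quick REPL illustration of the two points above (`missing` propagates, but sits outside the numeric hierarchy):

```julia
julia> using SpecialFunctions

julia> erf(missing)        # missing propagates through erf, as noted above
missing

julia> missing isa Number  # but Missing is not part of the Number hierarchy
false
```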