Optimisers.jl
Allow keyword arguments for optimisers
In optimisers like AdamW, the learning rate and the weight decay are often tweaked while the momentum decay values are left alone (in PyTorch, for example, the weight decay can be specified without also specifying the β values). Maybe this could be allowed here?
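For concreteness, the requested usage would look something like this (a hypothetical spelling; the keyword names eta and lambda are illustrative, not an existing API):

opt = AdamW(eta = 3f-4, lambda = 1f-2)  # β values left at their defaults, like PyTorch's weight_decay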
One option would be to add Base.@kwdef:
julia> Base.@kwdef struct Nesterov{T}
         eta::T = 1f-3
         rho::T = 9f-1
       end

julia> Nesterov(rho = 0.9)  # eta's Float32 default is passed alongside a Float64 rho, but both fields must share T
ERROR: MethodError: no method matching Nesterov(::Float32, ::Float64)

julia> Nesterov(η = 1f-3, ρ = 9f-1) = Nesterov{typeof(η)}(η, ρ);  # promoting positional method; @kwdef's keyword constructor calls this

julia> Nesterov(rho = 0.9)
Nesterov{Float32}(0.001f0, 0.9f0)
One ugly feature is that you have to type the defaults twice; is there a neat way around that? (One possibility is sketched below.)
The other drawback, which seems unavoidable, is that all these names become public API and need to be documented.
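For the repetition point, here is a minimal sketch of one possible way around it, assuming (as the session above suggests) that the keyword constructor generated by Base.@kwdef fills in the defaults and then calls the positional constructor, so the defaults can live in the @kwdef block alone. Nesterov2 is named that only to avoid clashing with the definitions above:

julia> Base.@kwdef struct Nesterov2{T}
         eta::T = 1f-3   # defaults written once, here only
         rho::T = 9f-1
       end

julia> Nesterov2(eta, rho) = Nesterov2{typeof(eta)}(eta, rho);  # no defaults repeated, just the type promotion

julia> Nesterov2(rho = 0.9)  # @kwdef supplies eta = 1f-3, then the method above converts rho
Nesterov2{Float32}(0.001f0, 0.9f0)

This doesn't help with the second point, of course: eta and rho still become public names either way.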