Optimisers.jl

Optimisers.jl defines many standard optimisers and utilities for learning loops.

Results: 56 Optimisers.jl issues, sorted by recently updated

### Motivation and description

In [other contexts](https://en.wikipedia.org/wiki/Elastic_net_regularization), combining L1 and L2 regularization can be reasonable. In Optimisers, they have the same parameter name, which, if I understand correctly, will mean...

enhancement
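
For context, a hedged sketch of how the two penalties can be combined today, assuming `SignDecay` (L1) is available in your Optimisers.jl version alongside `WeightDecay` (L2); the shared hyperparameter name is exactly what makes a blanket `adjust!` ambiguous.

```julia
using Optimisers

# Elastic-net-style setup: L1 via SignDecay plus L2 via WeightDecay, chained
# in front of the base rule. Rule names follow current docs but are assumptions.
rule  = OptimiserChain(SignDecay(1e-4), WeightDecay(1e-4), Descent(0.01))
state = Optimisers.setup(rule, [1.0, 2.0, 3.0])

# Because both decay rules share one hyperparameter name, a keyword adjustment
# like the following would (assuming the shared field is `lambda`) rescale the
# L1 and L2 strengths together rather than independently:
# Optimisers.adjust!(state; lambda = 1e-3)
```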

### Motivation and description

Can we implement L-BFGS? It's a quasi-second-order method that can converge much faster, suitable for computationally intensive models with a moderate number of parameters. I...

enhancement
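
For reference, a minimal sketch of the two-loop recursion at the heart of L-BFGS, written against plain vectors; it is not a proposal for how an Optimisers.jl rule would be structured (a real rule would also need to keep the `(s, y)` history in its state). With an empty history it reduces to steepest descent.

```julia
using LinearAlgebra: dot

# Classic L-BFGS two-loop recursion. `ss` and `ys` hold the most recent
# update pairs s = x_{k+1} - x_k and y = g_{k+1} - g_k, oldest first.
function lbfgs_direction(g, ss, ys)
    q = float.(g)
    m = length(ss)
    α = zeros(m)
    ρ = [1 / dot(ys[i], ss[i]) for i in 1:m]
    for i in m:-1:1                      # newest pair to oldest
        α[i] = ρ[i] * dot(ss[i], q)
        q .-= α[i] .* ys[i]
    end
    # initial inverse-Hessian scaling γ = sᵀy / yᵀy from the newest pair
    γ = m == 0 ? 1.0 : dot(ss[end], ys[end]) / dot(ys[end], ys[end])
    r = γ .* q
    for i in 1:m                         # oldest pair to newest
        β = ρ[i] * dot(ys[i], r)
        r .+= (α[i] - β) .* ss[i]
    end
    return -r                            # quasi-Newton descent direction
end
```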

```julia
julia> using Optimisers

julia> mutable struct Two{T}; x::T; y::T; Two(x::T) where T = new{T}(x) end

julia> Optimisers.trainable(z::Two) = (; z.x)

julia> t = Two([1,2,3.])
Two{Vector{Float64}}([1.0, 2.0, 3.0], #undef)

julia> ...
```

bug

…but rather assume that the gradient has already been accumulated. See https://github.com/FluxML/Optimisers.jl/pull/192/files#r1835058503
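
For illustration, a hedged sketch (not taken from the linked PR) of what "already accumulated" means in practice: gradients from several mini-batches are summed outside the rule, and `update!` is called once on the total. The toy model, `loss`, and data are placeholders.

```julia
using Optimisers, Functors, Zygote

# Toy model and data, only to make the sketch self-contained.
model   = (W = randn(2, 3), b = zeros(2))
batches = [randn(3, 8) for _ in 1:4]
loss(m, x) = sum(abs2, m.W * x .+ m.b)

state = Optimisers.setup(Adam(1e-3), model)

# Sum gradients over the mini-batches, then hand the rule one total gradient.
grads = [Zygote.gradient(m -> loss(m, x), model)[1] for x in batches]
total = reduce((a, b) -> Functors.fmap(+, a, b), grads)

state, model = Optimisers.update!(state, model, total)
```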

This package is quite stable; we could tag a v1.0.0 version.