Lorenzo Stella
Hi @NicolaiHarich, as you noticed, incremental training is currently not supported. I will update this issue if there is any progress on supporting that training strategy.
@sujayramaiah incremental training is not supported, unfortunately.
@ShirleyMgit there are no updates.
In that example, a plain function can be passed as the `f` argument to the proximal-gradient-based algorithms, since gradients of `f` will be taken via automatic differentiation. For other algorithms,...
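A minimal sketch of what this could look like, following the package's documented solver-call interface; the problem data (`A`, `b`) and the regularization weight are made up for illustration, and an AD backend (e.g. Zygote) is assumed to be installed so that gradients of the plain function `f` can be computed:

```julia
using ProximalAlgorithms
using ProximalOperators  # provides NormL1, which has a known proximal mapping

# Smooth least-squares term, passed as a plain function:
# the solver differentiates it automatically.
A = [1.0 2.0; 3.0 4.0; 5.0 6.0]
b = [1.0, 2.0, 3.0]
f = x -> 0.5 * sum(abs2, A * x - b)

# Nonsmooth term, handled via its prox rather than its gradient.
g = NormL1(0.1)

solver = ProximalAlgorithms.ForwardBackward()
solution, iterations = solver(x0 = zeros(2), f = f, g = g)
```

The point is the asymmetry: `f` only needs to be differentiable (AD takes care of gradients), while `g` must come with a proximal mapping, which is why it is an operator from ProximalOperators rather than a plain function.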
> It seems as if ProximalAlgorithms.DRLS(tol = 10 * TOL, directions=acc) doesn't care about the values of the tol parameter. Do you have an example? It definitely cares about the...
How big is it? Otherwise you could upload the data somewhere and share the link here, in case it’s fine to have the data publicly available.
@patwa67 looks like there is a bug in the way DRLS displays progress: https://github.com/JuliaFirstOrder/ProximalAlgorithms.jl/blob/3de04a9b2878925f8307f756d66fbe528afb2e5a/src/algorithms/drls.jl#L188-L190 Fixing it, thanks!
Hey! I think so, see also the discussion in #71, where this was suggested already. I guess one question is: how would this look from the user perspective? Some kind...
Maybe, maybe not: I see that AbstractDifferentiation depends on ReverseDiff, so maybe we could simply have that as the default backend? Meaning that, when given a generic function `f`, the `gradient` operation...
Yes. Then the package would probably not work out of the box if no AD backend is installed (like in a fresh environment): maybe there is a way to...
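A minimal sketch of the default-backend idea discussed above, using AbstractDifferentiation's ReverseDiff backend; the function `f` here is just a toy example:

```julia
import AbstractDifferentiation as AD
using ReverseDiff  # must be loaded for the backend to be available

backend = AD.ReverseDiffBackend()

f = x -> sum(abs2, x)

# AD.gradient returns a tuple, one gradient per argument.
(grad,) = AD.gradient(backend, f, [1.0, 2.0, 3.0])
# grad is 2x, i.e. [2.0, 4.0, 6.0]
```

Wrapping a generic `f` this way would let the `gradient` operation work without the user picking a backend explicitly, at the cost of the caveat raised above: in a fresh environment without ReverseDiff installed, the default would not be usable out of the box.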