Optimization.jl
Evaluate functional and gradient simultaneously
Would it be possible to combine the evaluation of the gradient and the functional in a single function? It seems that currently, `OptimizationFunction` is designed to evaluate `f` and `grad` separately. What I'm looking for is basically `Optim.only_fg!`, but for all backends.
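For reference, this is how the combined form looks in Optim.jl itself: `fg!` receives `F` and `G` arguments, either of which may be `nothing`, so the shared work is done once per iteration. A minimal sketch with a quadratic objective (the function and values here are illustrative, not from the thread):

```julia
using Optim

# fg! evaluates objective and gradient together.
# F === nothing  → only the gradient is needed
# G === nothing  → only the value is needed
function fg!(F, G, x)
    common = x .- 1.0              # shared work both quantities depend on
    if G !== nothing
        G .= 2.0 .* common         # gradient of sum((x .- 1).^2)
    end
    if F !== nothing
        return sum(abs2, common)   # objective value
    end
    return nothing
end

x0  = zeros(3)
res = Optim.optimize(Optim.only_fg!(fg!), x0, Optim.LBFGS())
```

The request is essentially to accept a callback of this shape at the `OptimizationFunction` level, so any backend that supports fused evaluation can use it.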
Yeah, we can add it. Not many optimizers can use it, but at least Optim and NLopt could.
So, internally we already create the `fg!` and use that with Optim, so even if you pass `f` and `grad` separately they get combined like that: https://github.com/SciML/GalacticOptim.jl/blob/42ce4320e3c2880ae49263bdd1c02da6f10b1233/src/solve/optim.jl#L91. Do you think having a separate field for it would add more value?
As far as I understand, the point is to avoid duplicate work when you need both the function value and the gradient in the same iteration. In my case, evaluating the gradient is generally expensive, with a runtime anywhere from milliseconds to minutes depending on the size of the problem. But evaluating the gradient yields the function value without any extra work. Evaluating the function by itself, without the gradient, is about 1/3 as expensive as evaluating the gradient, so it is not a cheap operation either.
The internal combination of function value and gradient would still evaluate them separately, no?
Yes, it does, so we would have to extend the interface here to allow the combined form, and you'd get a 25% speedup. Not the biggest thing, but something we should keep in mind.
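The 25% figure follows directly from the cost model described above, which can be checked with a couple of lines:

```julia
# Cost model from the discussion: one gradient evaluation costs g,
# one standalone function evaluation costs g/3, and a fused fg! costs g
# (the value comes for free with the gradient).
g = 1.0
separate = g + g / 3             # current per-iteration cost: 4g/3
combined = g                     # per-iteration cost with a fused fg!
savings  = 1 - combined / separate
# savings == 0.25, i.e. a 25% reduction per iteration
```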
Sounds good! Those 25% add up, though, since I may need 100,000 iterations or more to converge. Total runtimes can be a week, so the difference could be a full day of wall-clock time.