LineSearches.jl
Line search methods for optimization and root-finding
In HagerZhang and MoreThuente we throw an error if the step direction is not a descent direction (that is, if d\phi(0) \geq 0). No such checks are made in BackTracking and StrongWolfe, and...
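The condition above can be sketched as a standalone check. This is illustrative only, not the package's internal implementation; the function name `check_descent` is hypothetical.

```julia
# Sketch of a descent-direction check (illustrative, not the package's
# internal code). dphi0 = dϕ(0) = ∇f(x) ⋅ d; descent requires dϕ(0) < 0.
using LinearAlgebra

function check_descent(gx::AbstractVector, d::AbstractVector)
    dphi0 = dot(gx, d)   # directional derivative at α = 0
    dphi0 >= 0 && error("Search direction is not a descent direction (dϕ(0) = $dphi0 ≥ 0).")
    return dphi0
end

# Example: for f(x) = x⋅x at x = [1, 2] the gradient is [2, 4];
# the steepest-descent direction d = -[2, 4] passes the check.
check_descent([2.0, 4.0], [-2.0, -4.0])   # returns -20.0
```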
It would be good to have an optimal linesearch, mostly for testing purposes (it somewhat reduces the noise when comparing two methods). I took a stab at implementing this using...
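One way such an "optimal" linesearch could look is a plain univariate minimization of ϕ(α) = f(x + α·d) over a bracket, e.g. by golden-section search. A minimal sketch for testing purposes only; the name `exact_linesearch` and its bracket/tolerance parameters are assumptions, not anything from the package.

```julia
# Illustrative exact linesearch: minimize ϕ(α) over [lo, hi] by
# golden-section search. A testing aid, not a production method.
function exact_linesearch(ϕ; lo = 0.0, hi = 10.0, tol = 1e-8)
    gr = (sqrt(5) - 1) / 2                   # inverse golden ratio ≈ 0.618
    a, b = lo, hi
    c, d = b - gr * (b - a), a + gr * (b - a)
    while b - a > tol
        if ϕ(c) < ϕ(d)
            b, d = d, c                      # shrink bracket from the right
            c = b - gr * (b - a)
        else
            a, c = c, d                      # shrink bracket from the left
            d = a + gr * (b - a)
        end
    end
    return (a + b) / 2
end

# ϕ(α) = (α - 3)² has its minimizer at α = 3.
ϕ(α) = (α - 3)^2
exact_linesearch(ϕ)   # ≈ 3.0
```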
ref changes in https://github.com/JuliaNLSolvers/LineSearches.jl/pull/80
- [ ] Don't allocate vectors. We might as well have them as scalars to avoid allocations.
- [x] We should also free the deepest layer...
I'd like to see flexible logging / tracing functionality here. Currently, little information is provided when things go wrong inside the linesearches. Finiteness tests such as #101 should...
like
```julia
if vecnorm(s) == 0
    Base.error("Step direction is zero.")
end
```
- [ ] Test linesearch behaviour when alpha = NaN, Inf or negative
- [ ] Add counter tests (we can create `OnceDifferentiable` objects using NLSolversBase)
- [ ] Counter: find a situation where all the (non-trivial) linesearches require at least two evaluations (e.g. Himmelblau above?)
- [ ] Optim usage: this should be covered by...
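A counter test of the kind described above can be sketched without any package dependencies by wrapping the objective in a counting callable, in the spirit of the `OnceDifferentiable` call counters from NLSolversBase. The `Counted` type here is hypothetical, not the package's API.

```julia
# Sketch of counting objective evaluations; `Counted` is an illustrative
# stand-in for NLSolversBase's call counters, not its actual API.
mutable struct Counted{F}
    f::F
    f_calls::Int
end
Counted(f) = Counted(f, 0)
(c::Counted)(x) = (c.f_calls += 1; c.f(x))

# Himmelblau's function, a common test objective with a minimum of 0 at (3, 2).
himmelblau(x) = (x[1]^2 + x[2] - 11)^2 + (x[1] + x[2]^2 - 7)^2

cf = Counted(himmelblau)
cf([3.0, 2.0])   # returns 0.0
cf([0.0, 0.0])
cf.f_calls       # == 2
```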
Hi, sorry to create another issue with an easy question, but I was wondering whether there is a way to display each iteration of a line search algorithm. The basic...
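One workaround (a sketch, not a built-in feature of the package) is to wrap the univariate objective ϕ(α) in a closure that prints each trial step before returning the value, and pass the wrapped function to the linesearch. The `traced` helper below is hypothetical.

```julia
# Illustrative tracing wrapper: prints every ϕ(α) evaluation the
# linesearch makes. `traced` is a user-side helper, not package API.
function traced(ϕ)
    return function (α)
        val = ϕ(α)
        println("ϕ($α) = $val")
        return val
    end
end

ϕ(α) = (α - 2)^2 + 1
tϕ = traced(ϕ)
tϕ(0.5)   # prints "ϕ(0.5) = 3.25" and returns 3.25
```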
MoreThuente, HagerZhang and StrongWolfe all evaluate `df.g!` or `df.fg!`, and should thus return any step that gives `g(x+alpha*p)` within tolerance from Optim, NLsolve etc.
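The acceptance criterion described here can be sketched from the caller's side: after a linesearch returns a step α, check whether the gradient at x + α·p is already within the solver's tolerance. All names and the tolerance value below are illustrative.

```julia
# Sketch of the caller-side tolerance check: is g(x + α*p) small enough
# to declare convergence? Names and tolerance are illustrative.
using LinearAlgebra

g(x) = [2 * (x[1] - 1), 2 * (x[2] + 2)]   # gradient of (x₁-1)² + (x₂+2)²

x = [0.0, 0.0]
p = [1.0, -2.0]      # step direction
α = 1.0              # step length returned by some linesearch
gnorm = norm(g(x .+ α .* p), Inf)
gnorm <= 1e-8        # true: this step lands exactly on the minimizer
```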
I noticed that the same line search algorithm from the package (combined with gradient descent) produced very different convergence results on different systems. In particular, on Windows the iteration converged much...