optimistix
Nonlinear optimisation (root-finding, least squares, ...) in JAX+Equinox. https://docs.kidger.site/optimistix/
Fixes #33. There are some unnecessary-looking lines
```
best_f, best_aux = fn(state.best_y, args)
best_loss = self._to_loss(state.best_y, best_f)
```
where I would expect to just use `state.best_loss`, but the test...
Not sure if this is a bug or not, but `BestSoFarMinimiser` appears not to check the last step of the wrapped solver:
```
solver = optimistix.BestSoFarMinimiser(optimistix.BFGS(rtol=1e-5, atol=1e-5))
ret = optimistix.minimise(lambda...
```
Hi all, thanks for the phenomenal library. We're already using it in several statistical genetics methods in my group! I've been porting over some older code of mine to use...
Hi devs, looks like a really nice library. I've been looking for a JAX-native root-finding method that supports `vmap` for some time. Currently I am using an external call...
- [ ] Add `IndirectIterativeDual`-specific Newton safeguards (Conn, Gould, and Toint, "Trust Region Methods", section 7.3)
- [ ] Use Givens rotations to compute `diff` for different values of...
As in the paper [The Geometry of Nonlinear Least Squares, with applications to Sloppy Models and Optimization](https://arxiv.org/pdf/1010.1449.pdf) by Transtrum, Machta, and Sethna.
Right now these have been implemented as standalone solvers.
- [ ] Anderson acceleration
- [ ] LBFGS
- [ ] Affine

Powell's (unconstrained) derivative-free optimisers:
- [ ] UOBYQA
- [ ] NEWUOA

On affine solvers:...
This would allow us to use all our line searches and descents with the nonlinear CG approximate Hessian. See [Conjugate Gradient Methods with Inexact Line Search](https://www.jstor.org/stable/3689494) by Shanno.