Nonlinear optimisation (root-finding, least squares, ...) in JAX+Equinox. https://docs.kidger.site/optimistix/

19 optimistix issues
newest added

Fixes #33. There are some unnecessary-looking lines

```python
best_f, best_aux = fn(state.best_y, args)
best_loss = self._to_loss(state.best_y, best_f)
```

where I would expect to just use `state.best_loss`, but the test...

Not sure if this is a bug or not, but `BestSoFarMinimiser` appears to not check the last step of the wrapped solver:

```python
solver = optimistix.BestSoFarMinimiser(optimistix.BFGS(rtol=1e-5, atol=1e-5))
ret = optimistix.minimise(lambda...
```
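A self-contained sketch of the reported behaviour (the objective here is a stand-in quadratic, since the issue's original lambda is truncated): if the wrapper really skips the final step, `ret.value` can trail the wrapped BFGS iterate by one step.

```python
import jax.numpy as jnp
import optimistix

# Stand-in objective with minimum at y = (1, 1); illustrative only,
# as the issue's original objective is truncated above.
def fn(y, args):
    return jnp.sum((y - 1.0) ** 2)

solver = optimistix.BestSoFarMinimiser(optimistix.BFGS(rtol=1e-5, atol=1e-5))
ret = optimistix.minimise(fn, solver, jnp.zeros(2))
print(ret.value)  # if the last step is skipped, this may lag the final BFGS iterate
```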

refactor

Hi all, thanks for the phenomenal library. We're already using it in several statistical genetics methods in my group! I've been porting over some older code of mine to use...

question

Hi devs, looks like a really nice library. I've been looking for a JAX-native root-finding method that supports `vmap` for some time. Currently I am using an external call...
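For what it's worth, optimistix solvers compose with `jax.vmap` directly. A minimal sketch (the equation, solver choice, and tolerances here are illustrative, not taken from the issue):

```python
import jax
import jax.numpy as jnp
import optimistix as optx

# Batched root-finding: solve y**2 - c = 0 for each c, i.e. y = sqrt(c).
def fn(y, c):
    return y**2 - c

solver = optx.Newton(rtol=1e-8, atol=1e-8)

def solve_one(c):
    return optx.root_find(fn, solver, jnp.asarray(1.0), args=c).value

roots = jax.vmap(solve_one)(jnp.array([2.0, 3.0, 4.0]))
# roots ~ [1.4142, 1.7321, 2.0]
```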

question

- [ ] Add `IndirectIterativeDual`-specific Newton safeguards (Conn, Gould, and Toint, "Trust Region Methods", section 7.3)
- [ ] Use Givens rotations (see the sketch below) to compute `diff` for different values of...
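For reference, the Givens building block itself (a generic numerical sketch, not optimistix internals): a 2x2 rotation chosen to zero one entry, which is what makes cheap factorisation updates across different damping values possible.

```python
import jax.numpy as jnp

def givens_rotation(a, b):
    # Return (c, s) such that [[c, s], [-s, c]] @ [a, b] = [r, 0].
    # (The a == b == 0 corner case is not handled in this sketch.)
    r = jnp.hypot(a, b)
    return a / r, b / r

c, s = givens_rotation(3.0, 4.0)  # c = 0.6, s = 0.8: rotates (3, 4) onto (5, 0)
```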

refactor

As in the paper [The Geometry of Nonlinear Least Squares, with applications to Sloppy Models and Optimization](https://arxiv.org/pdf/1010.1449.pdf) by Transtrum, Machta, and Sethna.
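The paper's central algorithmic idea is a geodesic-acceleration (second-order) correction to the Levenberg–Marquardt step. A minimal JAX sketch of a single such step, assuming a dense residual function and plain identity damping (the function names are illustrative):

```python
import jax
import jax.numpy as jnp

def geodesic_lm_step(residual_fn, theta, lam):
    # First-order LM "velocity": solve (J^T J + lam I) v = -J^T r.
    r = residual_fn(theta)
    J = jax.jacfwd(residual_fn)(theta)
    A = J.T @ J + lam * jnp.eye(theta.size)
    v = jnp.linalg.solve(A, -J.T @ r)
    # Second directional derivative of the residuals along v, via nested JVPs.
    _, r_vv = jax.jvp(lambda t: jax.jvp(residual_fn, (t,), (v,))[1], (theta,), (v,))
    # Geodesic "acceleration": the same linear system with a new right-hand side.
    a = jnp.linalg.solve(A, -J.T @ r_vv)
    return theta + v + 0.5 * a
```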

feature

Right now these have been implemented as standalone solvers.

refactor

- [ ] Anderson acceleration (a minimal sketch follows this list)
- [ ] LBFGS
- [ ] Affine

Powell's (unconstrained) derivative-free optimisers:

- [ ] UOBYQA
- [ ] NEWUOA

On affine solvers: ...
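For the Anderson acceleration item, a minimal type-II sketch (the generic algorithm, not a proposed optimistix API): mix the last few fixed-point iterates using a least-squares fit over residual differences.

```python
import jax.numpy as jnp

def anderson_step(xs, gs):
    # xs, gs: (m, n) arrays holding the last m iterates x_k and
    # their images g(x_k) under the fixed-point map.
    F = gs - xs                   # residuals f_k = g(x_k) - x_k
    dF = F[1:] - F[:-1]           # residual differences
    # Mixing coefficients: minimise || f_last - dF^T @ gamma ||.
    gamma, *_ = jnp.linalg.lstsq(dF.T, F[-1])
    dG = gs[1:] - gs[:-1]
    return gs[-1] - dG.T @ gamma  # accelerated next iterate
```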

feature

This would allow us to use all our line searches and descents with the nonlinear CG approximate Hessian. See [Conjugate Gradient Methods with Inexact Line Search](https://www.jstor.org/stable/3689494) by Shanno.
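optimistix already ships a standalone `NonlinearCG` minimiser; the refactor would let the same approximate-Hessian idea plug into the shared line-search/descent machinery instead. A minimal usage sketch of the current solver (objective and tolerances are illustrative):

```python
import jax.numpy as jnp
import optimistix as optx

# Simple convex objective with minimum at y = (1, 1).
def fn(y, args):
    return jnp.sum((y - 1.0) ** 2)

solver = optx.NonlinearCG(rtol=1e-6, atol=1e-6)
ret = optx.minimise(fn, solver, jnp.zeros(2))
print(ret.value)  # ~ [1., 1.]
```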

feature