
Dense and Sparse Least Squares Optimization

8 LeastSquaresOptim.jl issues

I have had a good experience using an incomplete LU factorization as a preconditioner for some high-dimensional linear fixed-effect models. I tried to feed it to `LeastSquaresOptim`, but seem to fail....
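A minimal sketch of what I am attempting, using IncompleteLU.jl to build the factorization. The problem setup follows the README's `LeastSquaresProblem`/`optimize!` pattern; the `preconditioner` keyword in the commented line is hypothetical and, as far as I can tell, nothing like it is currently exposed, which is the point of this issue:

```julia
using LeastSquaresOptim, LinearAlgebra, SparseArrays, IncompleteLU

n = 1_000
A = sprandn(n, n, 0.01) + 10.0I        # well-conditioned sparse system
b = randn(n)

f!(out, x) = (out .= A * x .- b)       # residuals of the linear model

P = ilu(A, τ = 0.1)                    # incomplete LU preconditioner

prob = LeastSquaresProblem(x = zeros(n), f! = f!, output_length = n)
optimize!(prob, LevenbergMarquardt(LeastSquaresOptim.LSMR()))

# Hypothetical API this issue is asking for (not currently available):
# optimize!(prob, LevenbergMarquardt(LeastSquaresOptim.LSMR());
#           preconditioner = P)
```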

I just feel it would be useful to have confidence intervals in the output, without having to go to another package and compute them manually.
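For reference, the manual workaround I mean, sketched under the standard nonlinear-least-squares assumptions (Gauss-Newton covariance `σ̂²(JᵀJ)⁻¹` with i.i.d. Gaussian residuals); `res.minimizer` is the documented result accessor, the rest is generic statistics:

```julia
using LeastSquaresOptim, ForwardDiff, LinearAlgebra
using Distributions: TDist, quantile

# Toy data: exponential decay with noise
t = range(0, 5; length = 30)
y = 2.0 .* exp.(-0.7 .* t) .+ 0.05 .* randn(30)

resid(p) = p[1] .* exp.(-p[2] .* t) .- y

res = optimize(resid, [1.0, 1.0], LevenbergMarquardt())
phat = res.minimizer

# Wald standard errors: vcov ≈ sigma2 * inv(J'J), sigma2 = RSS / (n - k)
J = ForwardDiff.jacobian(resid, phat)
n, k = length(t), length(phat)
sigma2 = sum(abs2, resid(phat)) / (n - k)
se = sqrt.(diag(sigma2 * inv(J' * J)))

# 95% confidence intervals from the t distribution
q = quantile(TDist(n - k), 0.975)
ci = [(phat[i] - q * se[i], phat[i] + q * se[i]) for i in eachindex(phat)]
```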

Two suggestions that would make LeastSquaresOptim even better (in my humble opinion): 1) The `println(result)` of the optimization output could be updated to match the output that Optim.jl now uses....

It would be useful to be able to calculate the function and gradient in a single function call. This is offered by other similar libraries. Is there any workaround for...
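In the meantime, the workaround I have seen is a shared cache: do the joint evaluation once per new `x` and let both callbacks read from it. A sketch on a toy problem (the caching scheme is mine, not part of the package; `f!`, `g!`, and `output_length` are the documented `LeastSquaresProblem` keywords):

```julia
using LeastSquaresOptim

# Shared state between the residual and Jacobian callbacks,
# keyed on the last x at which the joint evaluation was done.
mutable struct SharedEval
    x::Vector{Float64}
    r::Vector{Float64}   # residuals
    J::Matrix{Float64}   # Jacobian
end

function compute!(c::SharedEval, x)
    # One expensive joint evaluation; toy model r_i = x_i^2 - 1.
    c.r .= x .^ 2 .- 1
    c.J .= 0
    for i in eachindex(x)
        c.J[i, i] = 2x[i]
    end
    copyto!(c.x, x)
    return c
end

# Recompute only when x has changed since the last evaluation.
fresh!(c, x) = (c.x == x || compute!(c, x); c)

cache = SharedEval(fill(NaN, 2), zeros(2), zeros(2, 2))
f!(out, x) = (out .= fresh!(cache, x).r)
g!(Jout, x) = (Jout .= fresh!(cache, x).J)

optimize!(LeastSquaresProblem(x = [3.0, -2.0], f! = f!, g! = g!,
                              output_length = 2), Dogleg())
```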

Please document the keyword arguments and state their default values.
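As a starting point, here are the keywords I have pieced together from the README and source; the defaults shown are my best recollection and should be verified:

```julia
using LeastSquaresOptim

rosenbrock(x) = [1 - x[1], 100 * (x[2] - x[1]^2)]

# Keyword arguments accepted by `optimize`; values shown are the
# defaults as I recall them, so treat them as unverified.
optimize(rosenbrock, zeros(2), Dogleg();
         x_tol = 1e-8,            # tolerance on the step size
         f_tol = 1e-8,            # relative tolerance on objective change
         g_tol = 1e-8,            # tolerance on the gradient norm
         iterations = 1_000,      # maximum number of iterations
         lower = fill(-10.0, 2),  # box constraints (optional)
         upper = fill(10.0, 2))
```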

Sometimes you want to train for `n` iterations, inspect the results, and then add further iterations if required. Or you may want to control iteration externally, using a package like [IterationControl.jl](https://github.com/JuliaAI/IterationControl.jl),...
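The obvious workaround is to run in chunks and warm-start from the last minimizer, but this resets the solver's internal state (e.g. the trust-region radius) every chunk, so a first-class resume API would still be welcome. A sketch of the chunked loop:

```julia
using LeastSquaresOptim

resid(x) = [1 - x[1], 100 * (x[2] - x[1]^2)]

# Run `n` iterations per chunk, inspect, then continue from the
# last minimizer. Not identical to one long run (state is reset).
function chunked_fit(resid, x0; chunks = 5, n = 20)
    x = copy(x0)
    for chunk in 1:chunks
        res = optimize(resid, x, Dogleg(); iterations = n)
        x = res.minimizer
        @info "after chunk $chunk" ssr = sum(abs2, resid(x))
        # inspect here, or hand control to IterationControl.jl
    end
    return x
end

chunked_fit(resid, [-1.2, 1.0])
```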

Hi, I just discovered your package and I wondered if you would like to try DifferentiationInterface's [sparse autodiff pipeline](https://juliadiff.org/DifferentiationInterface.jl/DifferentiationInterface/stable/tutorials/advanced/#Sparsity)?
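To make the suggestion concrete, here is roughly what the linked pipeline looks like (my reading of the current DifferentiationInterface tutorial; the resulting sparse Jacobian could then back a `g!` for `LeastSquaresProblem`):

```julia
using DifferentiationInterface, SparseConnectivityTracer, SparseMatrixColorings
import ForwardDiff

# Sparse backend: sparsity detection + coloring wrapped around ForwardDiff
backend = AutoSparse(
    AutoForwardDiff();
    sparsity_detector = TracerSparsityDetector(),
    coloring_algorithm = GreedyColoringAlgorithm(),
)

f(x) = diff(x) .^ 2                       # banded, hence sparse, Jacobian
x0 = collect(1.0:10.0)

prep = prepare_jacobian(f, backend, x0)   # detect pattern + color once
J = jacobian(f, prep, backend, x0)        # sparse Jacobian, reusing prep
```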

MWE:
```julia
using LeastSquaresOptim
func(x) = sum(x.^2)
optimize(func, [1.0])
```
yields
```
Results of Optimization Algorithm
 * Algorithm: Dogleg
 * Minimizer: [0.0004882886969070914]
 * Sum of squares at Minimum: 0.000000
 * ...
```
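For contrast, my understanding is that `optimize` treats the return value as a residual vector and minimizes the sum of its squared components, so the intended formulation of the example above would be element-wise:

```julia
using LeastSquaresOptim

# Residual-vector form: optimize minimizes sum(abs2, f(x)),
# so `x -> x` encodes the objective sum(x.^2) from the MWE.
optimize(x -> x, [1.0])
```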