Optimization.jl

Mathematical Optimization in Julia. Local, global, gradient-based and derivative-free. Linear, Quadratic, Convex, Mixed-Integer, and Nonlinear Optimization in one simple, fast, and differentiable interface.

Results: 135 Optimization.jl issues

It would be useful to be able to use `OptimizationBBO` without having all the output printed to stdout. In order to do this, we can use the `TraceMode` keyword argument,...
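A minimal sketch of what this might look like, assuming that keyword arguments to `solve` are forwarded to the underlying BlackBoxOptim.jl solver (where `TraceMode = :silent` suppresses progress printing):

```julia
using Optimization, OptimizationBBO

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]
# BBO methods require box bounds.
prob = OptimizationProblem(rosenbrock, x0, p; lb = [-1.0, -1.0], ub = [1.0, 1.0])

# Assuming solver kwargs are passed through to BlackBoxOptim,
# TraceMode = :silent should suppress the per-step output.
sol = solve(prob, BBO_adaptive_de_rand_1_bin(); TraceMode = :silent)
```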

Hello, I have lately been running the docs for parameter optimization of ODEs (https://sensitivity.sciml.ai/dev/ode_fitting/optimization_ode/). However, I tried to decrease `maxiters` to 5 for multiple algorithms, but none of them...
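A minimal way to check this, independent of the ODE-fitting example: count callback invocations and see whether `maxiters = 5` is respected. This is a sketch; the callback signature `(state, loss)` is assumed and has varied across Optimization.jl versions.

```julia
using Optimization, OptimizationOptimJL

f(u, p) = sum(abs2, u .- p)
optf = OptimizationFunction(f, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, zeros(2), [1.0, 2.0])

iters = Ref(0)
cb = (state, loss) -> (iters[] += 1; false)  # return false to keep iterating
sol = solve(prob, BFGS(); maxiters = 5, callback = cb)
@show iters[]  # should not exceed 5 if maxiters is honored
```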

The way an optimization problem with inequality constraints (written symbolically using the `OptimizationSystem` functionality of `ModelingToolkit`) is translated into an `OptimizationProblem` and conveyed to the solver wrapper (for instance `OptimizationMOI`)...
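For context, a sketch of the symbolic pipeline in question, with several API details assumed (the `≲` inequality operator, the `constraints` keyword of `OptimizationSystem`, and the `cons_j`/`cons_h` generation flags):

```julia
using ModelingToolkit, Optimization, OptimizationMOI, Ipopt

@variables x y
@parameters a b
loss = (a - x)^2 + b * (y - x^2)^2
cons = [x^2 + y^2 ≲ 1.0]  # inequality constraint (≲ assumed supported)
@named sys = OptimizationSystem(loss, [x, y], [a, b]; constraints = cons)

# Translation into an OptimizationProblem, requesting derivative functions
# for the objective and the constraints.
prob = OptimizationProblem(sys, [x => 0.0, y => 0.0], [a => 1.0, b => 100.0];
                           grad = true, hess = true, cons_j = true, cons_h = true)
sol = solve(prob, Ipopt.Optimizer())
```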

I have been trying to use Optimization.jl with Optim.jl to perform global optimization with `ParticleSwarm()`, with MethodOfLines.jl generating the data for the loss, and running the forward problem. I have...
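A stand-alone sketch of that solver combination with a toy loss in place of the MethodOfLines.jl forward problem (the keyword constructor for `ParticleSwarm` is assumed from Optim.jl):

```julia
using Optimization, OptimizationOptimJL

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
prob = OptimizationProblem(rosenbrock, zeros(2), [1.0, 100.0];
                           lb = [-1.0, -1.0], ub = [1.0, 1.0])

# ParticleSwarm is gradient-free, so no AD backend is needed.
sol = solve(prob, ParticleSwarm(lower = prob.lb, upper = prob.ub, n_particles = 20);
            maxiters = 100)
```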

(label: upstream)

The `NOMADOpt()` example in http://optimization.sciml.ai/dev/optimization_packages/nomad/ doesn't seem to work:

```julia
using Optimization
using OptimizationNOMAD
rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p ...
```
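The preview cuts off here; the docs example presumably continues along these lines (a sketch assuming the standard Rosenbrock setup from the linked page):

```julia
p = [1.0, 100.0]
f = OptimizationFunction(rosenbrock)
prob = OptimizationProblem(f, x0, p)
sol = solve(prob, NOMADOpt(); maxiters = 1000)
```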

There is currently no (little? I haven't tried all the solvers) support for scalar inputs, e.g.

```julia
julia> f = (u, p) -> u^2; u0 = 0.0; prob = OptimizationProblem(f, u0);
...
```
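A workaround sketch until scalar inputs are supported: wrap the scalar in a one-element array and index into it inside the objective.

```julia
using Optimization, OptimizationOptimJL

f = (u, p) -> u[1]^2
optf = OptimizationFunction(f, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, [1.0])  # scalar u0 = 1.0 wrapped in a vector
sol = solve(prob, BFGS())
sol.u[1]  # recover the scalar minimizer
```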

Though it only focuses on Hessians, combining a gradient call from another package with the Hessian from https://github.com/KristofferC/HyperHessians.jl would be interesting for second-order methods.
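A sketch of how mixed derivative sources could be wired up through `OptimizationFunction`'s `grad`/`hess` keywords, with ForwardDiff standing in for both here (a HyperHessians-based Hessian would slot into `hess!`):

```julia
using Optimization, OptimizationOptimJL, ForwardDiff

f(u, p) = (1.0 - u[1])^2 + 100.0 * (u[2] - u[1]^2)^2
grad!(G, u, p) = ForwardDiff.gradient!(G, x -> f(x, p), u)
# Stand-in: a HyperHessians call would replace this line.
hess!(H, u, p) = ForwardDiff.hessian!(H, x -> f(x, p), u)

optf = OptimizationFunction(f; grad = grad!, hess = hess!)
prob = OptimizationProblem(optf, zeros(2))
sol = solve(prob, Newton())  # second-order method using the supplied Hessian
```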

The instantiate function for most AD backends only populates a field if it is `nothing`. On the other hand, the instantiate function for NoAD only processes those fields that are...

Apart from `ForwardDiff.jl`, no other AD backend supports constraints. Is this just a lack of implementation, or is there a deeper issue?
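For reference, the currently working path with `AutoForwardDiff` looks roughly like this (a sketch using the documented in-place constraint signature `cons(res, u, p)`):

```julia
using Optimization, OptimizationOptimJL

f(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2
cons(res, u, p) = (res .= [u[1]^2 + u[2]^2])  # in-place constraint function

optf = OptimizationFunction(f, Optimization.AutoForwardDiff(); cons = cons)
prob = OptimizationProblem(optf, zeros(2), [1.0, 100.0]; lcons = [-Inf], ucons = [1.0])
sol = solve(prob, IPNewton())  # interior-point Newton handles the constraint
```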

The docs mention an out-of-place option, but it seems like user-provided gradients, etc. are all assumed in-place at the moment. For example,

```julia
using Optimization
using NLopt
using OptimizationNLopt
obj(x, ...
```
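For comparison, a sketch of the in-place gradient signature that the wrappers do accept, with a hypothetical `grad!` that mutates its first argument:

```julia
using Optimization, NLopt, OptimizationNLopt

obj(x, p) = sum(abs2, x)
grad!(G, x, p) = (G .= 2 .* x)  # writes into G rather than returning a value

optf = OptimizationFunction(obj; grad = grad!)
prob = OptimizationProblem(optf, ones(2))
sol = solve(prob, NLopt.LD_LBFGS())
```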