
Mathematical Optimization in Julia. Local, global, gradient-based and derivative-free. Linear, Quadratic, Convex, Mixed-Integer, and Nonlinear Optimization in one simple, fast, and differentiable interface.

135 Optimization.jl issues, sorted by most recently updated.

# Code

Optimizing a simple function without specifying autodiff.

```julia
import Pkg
Pkg.activate(temp=true)
Pkg.add([
    Pkg.PackageSpec(name="Optimization", version="3.10.0"),
    Pkg.PackageSpec(name="OptimizationMOI", version="0.1.5"),
    Pkg.PackageSpec(name="Ipopt", version="1.1.0")
], io=devnull)
import Random
import Optimization, Ipopt
using OptimizationMOI

objective_basic(params::AbstractVector, ...
```

The MOI backend has a number of performance problems:

- `findnz` is called for each evaluation of the Hessians, which allocates
- a lot of slow indexing of sparse arrays rather...
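
The `findnz` allocation issue can be illustrated with the `SparseArrays` stdlib alone (a hedged sketch, not Optimization.jl's actual backend code): `findnz` rebuilds its row, column, and value vectors on every call, whereas the sparsity pattern can be extracted once and the stored values then read in place via `nonzeros`.

```julia
using SparseArrays

H = sprand(50, 50, 0.1)   # stand-in for a sparse Hessian

# Allocating pattern: findnz builds fresh I, J, V vectors on every call,
# wasteful when it happens once per Hessian evaluation.
sum_allocating(H) = sum(findnz(H)[3])

# Cheaper pattern: extract the (row, col) structure exactly once up front...
rows, cols, _ = findnz(H)
# ...then on each evaluation read the stored values directly; nonzeros
# returns H's internal value buffer without copying.
sum_cached(H) = sum(nonzeros(H))

sum_allocating(H) ≈ sum_cached(H)   # same result, no per-call findnz
```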

This pull request changes the compat entry for the `NonconvexJuniper` package from `0.1` to `0.1, 0.3` in the OptimizationNonconvex subpackage. This keeps the compat entries for earlier versions. Note: I have...
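
For context, the resulting `[compat]` section in OptimizationNonconvex's Project.toml would look roughly like this (a sketch; surrounding entries omitted):

```toml
[compat]
NonconvexJuniper = "0.1, 0.3"   # previously "0.1"; both version ranges remain allowed
```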

When using e.g. the `ADAM` optimiser, one can use an `NCycle` iterator, i.e.

```julia
maxiters_wanted = 1000
train_loader = Flux.Data.DataLoader((0:100,); batchsize = 4, shuffle = true)
data = ncycle(train_loader, maxiters_wanted)
res...
```
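
The same iteration cap can be sketched with Base iterators alone (assuming `ncycle` above comes from IterTools.jl; `Iterators.cycle` plus `Iterators.take` is the dependency-free equivalent, with a plain vector standing in for the `DataLoader`):

```julia
batches = [(1:4,), (5:8,), (9:12,)]   # stand-in for the Flux DataLoader
maxiters_wanted = 10

# Cycle through the batches indefinitely, but stop after exactly
# maxiters_wanted draws; each draw would drive one optimiser step.
data = Iterators.take(Iterators.cycle(batches), maxiters_wanted)

n_steps = length(collect(data))
```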

Is there an equivalent of MATLAB's `DiffMinChange` that can be set manually, in order to deal with an objective function that fails to converge?
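
For reference, MATLAB's `DiffMinChange` bounds the finite-difference perturbation from below. Optimization.jl has no identically named option as far as I know (FiniteDiff.jl exposes `relstep`/`absstep` keywords for this, if memory serves), but when gradients come from finite differencing, the analogous lever is the step size itself. A minimal sketch:

```julia
# Plain central-difference gradient; h plays the role of the minimum
# perturbation that DiffMinChange enforces in MATLAB.
function fd_gradient(f, x::AbstractVector; h=1e-6)
    g = similar(float.(x))
    for i in eachindex(x)
        e = zeros(length(x)); e[i] = h
        g[i] = (f(x .+ e) - f(x .- e)) / (2h)
    end
    return g
end

f(x) = x[1]^2 + 3x[2]
g = fd_gradient(f, [2.0, 1.0])   # analytic gradient is [4.0, 3.0]
```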

Hi, developers! My question is: is there any plan to support Convex.jl here with AD, as in [cvxpylayers](https://github.com/cvxgrp/cvxpylayers)? For AD of the solution to (convex) optimization problems, I moved to Python...

Parallelization of MultistartOptimization local optimization runs currently does not appear to be supported within Optimization.jl.
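
A do-it-yourself workaround is straightforward to sketch in pure Julia (hypothetical code, not an Optimization.jl or MultistartOptimization.jl API): launch the local search from each start point on a separate thread and keep the best result. Plain fixed-step gradient descent stands in for the local optimizer here.

```julia
# Fixed-step gradient descent as a stand-in local optimizer.
function local_descent(f, grad, x0; lr=0.1, iters=200)
    x = copy(x0)
    for _ in 1:iters
        x .-= lr .* grad(x)
    end
    return x, f(x)
end

f(x) = (x[1] - 1)^2 + (x[2] + 2)^2        # minimum 0 at (1, -2)
grad(x) = [2(x[1] - 1), 2(x[2] + 2)]

starts = [4 .* randn(2) for _ in 1:8]
results = Vector{Tuple{Vector{Float64}, Float64}}(undef, length(starts))

# Each start point's local run is independent, so they parallelize trivially.
Threads.@threads for i in eachindex(starts)
    results[i] = local_descent(f, grad, starts[i])
end

xbest, fbest = argmin(last, results)
```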