Mohamed Tarek
Ah sorry, I missed that I had to replace the options.
I can reproduce all the errors and non-convergence bugs you mentioned. Time to dig in...
Please move the renaming to NonconvexSearch then update the docs here.
Please move this PR to NonconvexCore.jl.
Welcome! This is the right place. The `hvp` option is exactly for this kind of use case. Basically, the custom Hessian can also be a function that does a Hessian-vector...
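To illustrate the idea behind a Hessian-vector product without forming the Hessian, here is a minimal, self-contained sketch. The function names `f`, `grad`, and `hvp` are hypothetical stand-ins, not Nonconvex API; the exact way to register such a function is via the `hvp` option mentioned above (see the docs for the precise wrapper and signature):

```julia
# Toy objective with a known Hessian: f(x) = sum(x.^2), so H = 2I and H*v = 2v.
f(x) = sum(abs2, x)
grad(x) = 2 .* x

# Hessian-vector product via central finite differences of the gradient.
# This never builds the n-by-n Hessian, only two gradient evaluations.
function hvp(x, v; eps = 1e-6)
    (grad(x .+ eps .* v) .- grad(x .- eps .* v)) ./ (2 * eps)
end

hvp([1.0, 2.0], [1.0, 0.0])  # ≈ [2.0, 0.0], matching H*v = 2v
```

For this quadratic the finite-difference product is exact up to floating-point error; for a general objective it is an approximation, and an AD-based hvp is usually preferable.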
This is why I use `import NLopt` in the docs, not `using`. I don't think there is a better way to resolve this.
No, because we can't extend NLopt's `optimize` without depending on it, which I don't (it's an optional dependency), and NLopt can't do the same without depending on Nonconvex, which also...
It would be nice to have a warning in the docs saying that you must use `import`, not `using`.
If you use `using`, you will need to use `Nonconvex.optimize` later.
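A toy illustration of the name clash, using hypothetical modules as stand-ins for Nonconvex and NLopt (both of which export an `optimize`):

```julia
# Two modules that both export a function named `optimize`.
module ToyNonconvex
export optimize
optimize(x) = "Nonconvex result"
end

module ToyNLopt
export optimize
optimize(x) = "NLopt result"
end

using .ToyNonconvex   # `using` brings `optimize` into scope
import .ToyNLopt      # `import` does NOT bring `optimize` into scope

optimize(1)           # OK: unambiguous, resolves to ToyNonconvex.optimize
ToyNLopt.optimize(1)  # the `import`-ed module's function must be qualified
```

If both modules were loaded with `using`, an unqualified `optimize` call would be ambiguous and error, which is exactly why the docs `import` one of them.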
This should work:

```julia
using JuMP
import Nonconvex
Nonconvex.@load Ipopt

model = JuMP.Model()
@variable(model, x >= 0)
ncvx_model = DictModel(model)
optimize(ncvx_model, IpoptAlg(), Dict(:x => 0.0); options = IpoptOptions())
```

Note...