
tracing

Open HarlanH opened this issue 11 years ago • 10 comments

Hi, this package works well for something I'm working on! But I wonder if it'd be possible to get support for tracing, à la the store_trace/show_trace options in Optim.jl. I can sorta fake it by adding print statements to my evaluation function, but that's obviously not a great solution.

Skimming through the NLopt docs, I'm not actually sure if this is possible, but if it is, it'd be great to have. Thanks!

HarlanH avatar Jun 05 '14 15:06 HarlanH

The philosophy of NLopt has always been that this sort of thing is best implemented by just adding a line or two to your objective function. (Either to print things or to store them in a global or curried parameter.) What could NLopt do for you that this couldn't?
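For concreteness, here is a minimal sketch of that approach with the NLopt.jl API (the Rosenbrock-style objective and the choice of :LD_LBFGS are just placeholders; the two marked lines are the only tracing additions):

```julia
using NLopt

const trace = Float64[]   # a global, per the "global or curried parameter" idea

function myfunc(x::Vector, grad::Vector)
    if length(grad) > 0
        grad[1] = -2 * (1 - x[1]) - 400 * x[1] * (x[2] - x[1]^2)
        grad[2] = 200 * (x[2] - x[1]^2)
    end
    val = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
    push!(trace, val)      # added: record every objective value
    println("f = ", val)   # added: and/or print it as the algorithm runs
    return val
end

opt = Opt(:LD_LBFGS, 2)
min_objective!(opt, myfunc)
xtol_rel!(opt, 1e-6)
minf, minx, ret = optimize(opt, [0.5, 0.5])
```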

stevengj avatar Jun 05 '14 15:06 stevengj

Well, the package could make it as easy as just adding an option, but yes, it wouldn't be that difficult to write by hand. I'll do that for now. But I do think it's a good feature that most other optimization packages that I've seen support.

HarlanH avatar Jun 05 '14 16:06 HarlanH

I'm still struggling to see what is to be gained by supporting this. How is something like:

trace_vals!(opt, true)
....
vals = get_trace_vals(opt)

so much easier than just adding:

push!(vals, val)

to your objective function, where vals is a global or a parameter?

stevengj avatar Jun 05 '14 16:06 stevengj

It's the "global or parameter" thing and the need to change (or wrap) my objective function. Globals are problematic for a variety of reasons, of course, and parameters require writing a bit of extra code in each case. And it's trickier to turn tracing on or off.

If the goal is to make it easy for people to write an objective function and optimize it while being able to watch the performance of the algorithm, having NLopt manage the process is going to be easier than requiring users to write the boilerplate.
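To make that boilerplate concrete, here is a hypothetical sketch of the kind of wrapper each user currently has to write by hand; with_trace and its enabled flag are invented for illustration and are not part of NLopt.jl:

```julia
# Hypothetical helper (not part of NLopt.jl): wrap an NLopt-style objective
# f(x, grad) so every value is recorded, with a switch to disable tracing.
function with_trace(f; enabled::Bool = true)
    vals = Float64[]
    function traced(x::Vector, grad::Vector)
        val = f(x, grad)
        enabled && push!(vals, val)
        return val
    end
    return traced, vals
end

# Usage sketch:
#   traced_f, vals = with_trace(myfunc)
#   min_objective!(opt, traced_f)
```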

Has every other optimization system under the sun, including Optim.jl, made a mistake somehow? I don't understand where you're coming from.

HarlanH avatar Jun 05 '14 16:06 HarlanH

Another use case for this is if you're calling NLopt from JuMP. In that case, JuMP generates the objective function automatically, and the user doesn't have the ability to drop in arbitrary print statements.

mlubin avatar Aug 01 '14 23:08 mlubin

@mlubin, but in that case shouldn't tracing be added to the JuMP interface?

stevengj avatar Aug 02 '14 13:08 stevengj

NLopt is the only solver at this point that doesn't have some sort of informational printout. Also, this output typically includes relevant algorithmic information like primal/dual infeasibilities and complementarity in interior-point methods, so that's not something we could do from JuMP in a generic way.

mlubin avatar Aug 02 '14 20:08 mlubin

I'm still confused; if you are using the JuMP interface, how would you hook into an NLopt-specific method for printouts? Shouldn't there be a solver-independent JuMP interface to request "informational printouts", tracing, etc.?

stevengj avatar Aug 02 '14 22:08 stevengj

Currently, anything solver-specific is controlled by passing options to the MathProgBase solver object. So in this case it would be something like NLoptSolver(algorithm=:LD_MMA, verbose=true). It's more hairy than it seems to map options at the JuMP level in a generic way, but it's an open issue: JuliaOpt/JuMP.jl#91.
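As a rough sketch of that style (JuMP 0.x with MathProgBase, current at the time; the verbose option is the hypothetical feature under discussion, not something NLoptSolver actually accepted):

```julia
using JuMP, NLopt

# Anything solver-specific rides along as a keyword argument on the solver
# object attached to the model.
m = Model(solver = NLoptSolver(algorithm = :LD_MMA))

# A tracing/printout switch would simply be another such keyword, e.g. the
# hypothetical option mentioned above:
# m = Model(solver = NLoptSolver(algorithm = :LD_MMA, verbose = true))
```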

mlubin avatar Aug 02 '14 22:08 mlubin

Since JuMP generates the objective function for you, so you can't insert tracing yourself, it seems like this should be a JuMP issue. You'd be better off adding options to JuMP to generate an objective function that does printouts, stores a trace of the objective-function values, etcetera, in a common way across backends, rather than having each backend do something completely different.

stevengj avatar Aug 05 '14 14:08 stevengj