
Concise code for `britgas`

Open gdalle opened this issue 1 year ago • 10 comments

I am trying to reverse-engineer britgas to make it more readable instead of the current 4297-LOC mess. Not only would this ease maintenance, it would also reduce the burden on the compiler, as we figured out when britgas crashed SparseConnectivityTracer.

Needs tests

gdalle avatar Jun 14 '24 11:06 gdalle

Thanks @gdalle ! It would be a good addition. The code was automatically generated from an Ampl model. Could you provide a small Julia script that tests whether the constraint and objective function values are the same for a number of random points?

tmigot avatar Jun 14 '24 18:06 tmigot
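Such a comparison script could look something like the sketch below. This is a hypothetical outline, not tested code: `britgas_old` and `britgas_new` are placeholder names for the two model constructors (in practice both would be `NLPModel`s, e.g. built via ADNLPModels), and the tolerance is an arbitrary choice.

```julia
# Hypothetical sketch: check that two NLPModels agree in objective and
# constraint values at random points near the starting point.
# `old` and `new` are assumed to be NLPModels with the same dimensions.
using NLPModels, Random

function models_agree(old, new; ntrials = 10, rtol = 1e-10)
    Random.seed!(42)  # reproducible random points
    for _ in 1:ntrials
        x = old.meta.x0 .+ randn(old.meta.nvar)  # random perturbation of x0
        isapprox(obj(old, x), obj(new, x); rtol = rtol) || return false
        isapprox(cons(old, x), cons(new, x); rtol = rtol) || return false
    end
    return true
end

# models_agree(britgas_old(), britgas_new())  # placeholder constructors
```

Gradients and Jacobians could be compared the same way with `grad` and `jac` once the values match.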

Yes, that's what I meant by "needs tests". I haven't even run my code, so it's probably wrong as hell, hence the draft status. But tbh I think it's a lost cause.

OptimizationProblems.jl contains 120 000 lines of code for a couple hundred problems, and it took me (an optimization and Julia expert) several hours to simplify just one of them. That was partly because the code was autogenerated and partly because I couldn't find a mathematical formulation for the problem. But still, doing this for every problem is absolutely unthinkable, unless they're all much simpler than this one.

gdalle avatar Jun 14 '24 19:06 gdalle

Yes, this is difficult to do. I didn't plan to do it unless we have a very good reason. Do you have more problems like this, or is it the only one?

tmigot avatar Jun 14 '24 19:06 tmigot

No, britgas was the only misbehaving one and it's no longer an issue. We eventually figured out what was causing the compilation burden in SparseConnectivityTracer (unnecessary inlining), and sparsity detection works once again. As a result, I don't know if this PR is worth the sweat. However, when I look into the britgas code and see other issues like https://github.com/JuliaSmoothOptimizers/ADNLPModels.jl/issues/247, I do wonder whether the OptimizationProblems.jl suite is typical of normal Julia code. No use of linear algebra whatsoever, but thousands of lines of unrolled for loops. I'd be curious to know what you think.

gdalle avatar Jun 14 '24 19:06 gdalle

Keep in mind that OptimizationProblems is still WIP and "massively" under-staffed. Our first goal was to add optimization problems that are typically used in the research literature, and we try to add new problems when possible. That doesn't mean it covers a large variety of Julia functions. For instance, functions like norm or log are typically non-differentiable and were not frequent in test problems from the literature.

tmigot avatar Jun 14 '24 20:06 tmigot

> The code was automatically generated from an Ampl model.

More precisely it was automatically generated from an Ampl model that was automatically generated from a SIF model.

dpo avatar Jun 14 '24 20:06 dpo

Of course, and it's already a very valuable resource! I'm just being a little more careful about its breadth of coverage than I used to be. But if you're aware of more recent benchmark sets (typical of modern optimization but not necessarily deep learning) I'd be interested in lending a hand.

gdalle avatar Jun 14 '24 20:06 gdalle

Something like this maybe: https://github.com/benchopt/benchopt?tab=readme-ov-file#available-benchmarks

gdalle avatar Jun 14 '24 20:06 gdalle

@gdalle Feel free to open a new issue if you have suggestions of data sets or problems to be implemented here, for instance like this https://github.com/JuliaSmoothOptimizers/OptimizationProblems.jl/issues/116 .

tmigot avatar Jun 17 '24 22:06 tmigot

Done in #333; I think we can close this one.

gdalle avatar Jun 18 '24 04:06 gdalle