
Use SolverParameters

Open tmigot opened this issue 1 year ago • 10 comments

This is a first draft of integrating SolverParameters.jl in the JSO-compliant solver.

It is more constrained, but it clarifies the role of algorithmic parameters (their default values and their domain -> so far we never really had a check on the domain).

I believe this structure should be stored in the solver for one reason, well illustrated here: the solver itself may need the parameters.

This is a minimal change to the code since, at the beginning, we simply extract the parameters from the solver structure.

What does change is that we can no longer pass parameters via keyword arguments, as in solve!(solver, nlp, stats; with_some_parameters = 1). (We could update the values in the solver from the keyword arguments, but that is probably not good practice.)

This also opens new perspectives. For instance, the LbfgsParameterSet could contain parameters for a recipe to compute mem instead of just mem (if we plan something that depends on the size of the problem, for instance).

What is not done here is deciding where we store the default values. Those should be isolated, so we can modify them easily; they might also depend on the solver precision... An optimal τ₁ (below) is probably different in Float16 than in Float64.
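To make the idea concrete, here is a minimal self-contained sketch of the design described above: a parameter set with typed defaults and domain checks, stored inside the solver. All type and field names (`RealIntervalSketch`, `LbfgsParameterSetSketch`, the default value of τ₁, etc.) are hypothetical stand-ins, not the actual SolverParameters.jl API.

```julia
# Hypothetical mock-up of the SolverParameters.jl idea: each parameter
# carries a default value and a domain that is checked at construction.
struct RealIntervalSketch{T}
  lower::T
  upper::T
end

Base.in(x, I::RealIntervalSketch) = I.lower <= x <= I.upper

struct LbfgsParameterSetSketch{T}
  mem::Int
  τ₁::T
end

function LbfgsParameterSetSketch(::Type{T}; mem::Int = 5, τ₁::T = T(0.9999)) where {T}
  # domain checks replace ad-hoc keyword arguments to solve!
  mem >= 1 || error("mem must be >= 1")
  τ₁ in RealIntervalSketch(zero(T), one(T)) || error("τ₁ must lie in [0, 1]")
  return LbfgsParameterSetSketch{T}(mem, τ₁)
end

struct LbfgsSolverSketch{T}
  params::LbfgsParameterSetSketch{T}
end

solver = LbfgsSolverSketch(LbfgsParameterSetSketch(Float64; mem = 10))
solver.params.mem  # the solver extracts the parameter from its own structure
```

The point of the sketch is that the default values and domains live in one place (the parameter set), and the solver reads them from its own structure rather than receiving them through `solve!` keywords.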

tmigot avatar Feb 09 '24 13:02 tmigot

Codecov Report

Attention: Patch coverage is 90.90909% with 1 line in your changes missing coverage. Please review.

Project coverage is 88.48%. Comparing base (1e1368e) to head (cf6fd9a). Report is 6 commits behind head on main.

:exclamation: Current head cf6fd9a differs from pull request most recent head 4ebed83

Please upload reports for the commit 4ebed83 to get more accurate results.

Files Patch % Lines
src/lbfgs.jl 90.90% 1 Missing :warning:
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #254      +/-   ##
==========================================
+ Coverage   86.66%   88.48%   +1.81%     
==========================================
  Files           7        7              
  Lines        1125     1033      -92     
==========================================
- Hits          975      914      -61     
+ Misses        150      119      -31     


codecov[bot] avatar Feb 09 '24 13:02 codecov[bot]

Package name latest stable
FletcherPenaltySolver.jl
Percival.jl

github-actions[bot] avatar Feb 09 '24 14:02 github-actions[bot]

Here are a few minor comments. ~I don’t understand why the parameters should be inside the solver object?!~ (nevermind, I see).

It could, actually, but we would need a constructor that takes both nlp and params, and also a new solve! with 4 arguments (solver, nlp, params, and stats) along with all its variants. I find it more economical, in terms of modifications, to store it in the solver.

tmigot avatar Feb 10 '24 16:02 tmigot

Would it make sense to also add (not sure how) constraints on the parameters, for instance τ₁ > 0, or η1 < η2 for R2? I'm thinking of MultiPrecisionR2, where there is a bunch of constraints of this kind. We could also implement a generic test function that checks whether the user-provided parameters satisfy the constraints.
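Such a generic test function could be as simple as a list of named predicates checked against the user-provided values. A hypothetical sketch (the names `R2ConstraintsSketch` and `check_constraints`, and the NamedTuple stand-in, are invented for illustration):

```julia
# Hypothetical sketch: cross-parameter constraints as named predicates,
# checked generically against whatever holds the parameter values.
const R2ConstraintsSketch = [
  "τ₁ > 0"  => (p -> p.τ₁ > 0),
  "η1 < η2" => (p -> p.η1 < p.η2),
]

function check_constraints(params, constraints)
  violated = [name for (name, pred) in constraints if !pred(params)]
  isempty(violated) || error("violated constraints: ", join(violated, ", "))
  return true
end

params = (τ₁ = 0.5, η1 = 0.25, η2 = 0.75)  # NamedTuple stand-in for a parameter set
check_constraints(params, R2ConstraintsSketch)  # true
```

The same list of predicates could then be reused when validating parameters passed to the solver and when optimizing them in BBModels.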

d-monnet avatar Mar 05 '24 23:03 d-monnet

Everything regarding the constraints on the domain of each parameter should be modeled in the parameter set; the package SolverParameters.jl has a bunch of functionalities for this (even open sets). For constraints linking different parameters, I don't think we have a real solution. I would strongly suggest adding them as a test in the <: AbstractParameterSet constructor, and then in BBModels if that's the solution we keep for optimizing the parameters.

We reimplemented Julia's in function in https://github.com/JuliaSmoothOptimizers/SolverParameters.jl/blob/8cea36a64b2d7bf781d1c3cb161ff70237876c00/src/parameterset.jl#L45, so that should partially do the test.
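The spirit of that reimplemented `in` can be sketched with hypothetical types (this is not the actual SolverParameters.jl code): each parameter knows its domain, and a set is valid when every current value is `in` its own domain.

```julia
# Hypothetical sketch of per-parameter domain membership, in the spirit
# of the reimplemented `in` in SolverParameters.jl's parameterset.jl.
struct ParamSketch{T}
  value::T
  lower::T
  upper::T
end

# `in` checks a value against the parameter's declared domain.
Base.in(x::Real, p::ParamSketch) = p.lower <= x <= p.upper

# A set of parameters is valid if every current value lies in its domain.
is_valid(set) = all(p.value in p for p in set)

set = [ParamSketch(0.5, 0.0, 1.0), ParamSketch(2.0, 1.0, 10.0)]
is_valid(set)  # true
```

This covers per-parameter domains; constraints linking several parameters (η1 < η2) still need the separate mechanism discussed above.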

tmigot avatar Mar 06 '24 21:03 tmigot

Wouldn't constraints be modeled directly in the BBNLPModel?

dpo avatar Mar 06 '24 21:03 dpo

Wouldn't constraints be modeled directly in the BBNLPModel?

Mainly yes, but some minimal constraints are also handled here. Beyond the optimization of parameters, it is also good to make sure that the given parameters do not violate the basic constraints.

tmigot avatar Mar 06 '24 21:03 tmigot

I noticed that the trunk and tron interfaces do not allow passing parameters to the sub-solvers (e.g. the increase_threshold parameter of TRONTrustRegion), and don't have them as keyword arguments. With the current approach, I'm not sure which solver parameters we would optimize, since it is rather the sub-solvers' parameters that should be optimized.

d-monnet avatar Mar 08 '24 22:03 d-monnet

I noticed that the trunk and tron interfaces do not allow passing parameters to the sub-solvers (e.g. the increase_threshold parameter of TRONTrustRegion), and don't have them as keyword arguments. With the current approach, I'm not sure which solver parameters we would optimize, since it is rather the sub-solvers' parameters that should be optimized.

Good point. Can you create a separate issue for this? As you noticed, this optimization is somewhat new, and there are several parameters we didn't extract from the subsolvers.

tmigot avatar Mar 09 '24 00:03 tmigot

@dpo @d-monnet I finally updated this PR! The default parameters are now centralized at the beginning of each file, with their default values documented, so we no longer need to update the whole file when default values change. Moreover, there is now a mechanism that allows building a default value depending on the AbstractNLPModel.

The DefaultParameter is something I just added in SolverParameters.jl https://github.com/JuliaSmoothOptimizers/SolverParameters.jl/blob/main/src/default-parameter.jl .
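The model-dependent default can be sketched as a callable default that is resolved against the problem when the solver is built (hypothetical mock-up; `DefaultParameterSketch`, `resolve`, and `ModelSketch` are invented names, not the SolverParameters.jl implementation):

```julia
# Hypothetical mock-up: a default that is either a constant or a function
# of the model, resolved when the solver is instantiated.
struct DefaultParameterSketch{F}
  default::F
end

# Resolve a constant default as-is, and a callable default against the model.
resolve(d::DefaultParameterSketch, nlp) =
  d.default isa Function ? d.default(nlp) : d.default

struct ModelSketch
  nvar::Int
end

mem_default = DefaultParameterSketch(nlp -> min(nlp.nvar, 5))  # recipe for `mem`
resolve(mem_default, ModelSketch(3))                 # 3
resolve(DefaultParameterSketch(5), ModelSketch(100)) # 5
```

This is what makes a "recipe" default like mem-depends-on-problem-size possible while keeping constant defaults untouched.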

Right now, constraints between the parameters are handled by an @assert in the parameter set constructor. I suggest adding a smarter mechanism in a follow-up PR.
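For reference, the pattern is just an inner-constructor assertion; a minimal sketch with hypothetical names (not the actual PR code):

```julia
# Minimal sketch: enforce an inter-parameter constraint with an @assert
# in the parameter set constructor.
struct TrustRegionParamsSketch{T}
  η1::T
  η2::T
  function TrustRegionParamsSketch(η1::T, η2::T) where {T}
    @assert 0 < η1 < η2 < 1 "expected 0 < η1 < η2 < 1"
    return new{T}(η1, η2)
  end
end

TrustRegionParamsSketch(0.25, 0.75)    # ok
# TrustRegionParamsSketch(0.75, 0.25)  # throws an AssertionError
```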

Finally, I opened a PR in NLPModels.jl (https://github.com/JuliaSmoothOptimizers/NLPModels.jl/pull/481) to write eltype(nlp) instead of eltype(nlp.meta.x0). It was a bit annoying, but apparently we can't write anonymous functions with parametric types.
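The convenience proposed there amounts to a one-line method forwarding eltype to the starting point; sketched here on hypothetical stand-in types (`NLPSketch`, `MetaSketch`), not the real NLPModels.jl hierarchy:

```julia
# Hypothetical stand-ins for an NLPModel and its meta data.
struct MetaSketch{T}
  x0::Vector{T}
end
struct NLPSketch{T}
  meta::MetaSketch{T}
end

# The proposed convenience: forward eltype to the starting point's element type.
Base.eltype(nlp::NLPSketch) = eltype(nlp.meta.x0)

nlp = NLPSketch(MetaSketch([1.0, 2.0]))
eltype(nlp)  # Float64
```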

tmigot avatar Aug 15 '24 11:08 tmigot