lava-optimization
Improve SolverTuner logic and unit tests
Objective of pull request: Major update to the `SolverTuner` utility class.
Pull request checklist
Your PR fulfills the following requirements:
- [x] Tests are part of the PR (for bug fixes / features)
- [x] Docs reviewed and added / updated if needed (for bug fixes / features)
- [x] PR conforms to Coding Conventions
- [x] PR applies BSD 3-clause or LGPL2.1+ Licenses to all code files
- [x] Lint (`pyb`) passes locally
- [x] Build tests (`pyb -E unit` or `python -m unittest`) pass locally
Pull request type
Please check your PR type:
- [x] Feature
New Behavior
- The new API allows the user to specify a fitness function that evaluates the current set of hyper-parameters, based on final cost and number of steps to solution. This is the function that is maximized by the `SolverTuner`.
- The behavior of `SolverTuner` is now fully deterministic, thanks to the `seed` argument.
- `SolverTuner` now tracks all the hyper-parameters in a structured numpy array accessible with the `SolverTuner.get_results()` method.
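To make the new behavior concrete, here is a minimal sketch of the fitness-maximization and result-tracking ideas described above. The field names, the `simulate_solver` stub, and the fitness weights are all illustrative assumptions, not the actual `SolverTuner` API:

```python
import numpy as np

def simulate_solver(alpha, beta):
    # Stand-in for running a real solver: returns (final_cost, steps_to_solution).
    return (alpha + beta, int(100 * beta))

# Hypothetical fitness function: the tuner maximizes this value, so lower
# cost and fewer steps both increase fitness (the 0.01 weight is illustrative).
def fitness(cost, steps_to_solution):
    return -cost - 0.01 * steps_to_solution

# Track one record per evaluated hyper-parameter set in a structured array,
# similar in spirit to what SolverTuner.get_results() would return.
results = np.zeros(
    3, dtype=[("alpha", float), ("beta", float), ("fitness", float)]
)
for i, (alpha, beta) in enumerate([(0.1, 1.0), (0.5, 2.0), (0.9, 4.0)]):
    cost, steps = simulate_solver(alpha, beta)
    results[i] = (alpha, beta, fitness(cost, steps))

# The tuner keeps the hyper-parameters that maximize the fitness function.
best = results[np.argmax(results["fitness"])]
```

Because every evaluated set is recorded, the full search history stays inspectable after tuning, and a fixed `seed` would make repeated runs reproduce the same records.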
Does this introduce a breaking change?
- [x] Yes
One additional point that came to my attention while Chinonso and I were tuning hyperparameters: in many cases, we don't want to search through a grid of hyperparameters. Instead, we would like to search through a set of sets of hyperparameters, i.e. we would like to try the combinations (x = 1, y = 2) and (x = 5, y = 8), but not e.g. (x = 1, y = 8).
Could you add this functionality? It would be very helpful!
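For illustration, the difference between a grid search and the requested explicit combination list can be sketched in plain Python, independent of the `SolverTuner` API:

```python
from itertools import product

# Grid search: every combination of the individual parameter ranges.
grid = list(product([1, 5], [2, 8]))
# → [(1, 2), (1, 8), (5, 2), (5, 8)]

# Requested alternative: only the explicitly listed hyper-parameter sets
# are tried, so (1, 8) and (5, 2) are skipped.
explicit = [(1, 2), (5, 8)]
```

A tuner accepting such an explicit list would simply iterate over it instead of expanding the Cartesian product of the per-parameter ranges.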