mlr3mbo
feat: add asynchronous decentralized bayesian optimization
- [ ] think about introducing an async loop function that would allow `OptimizerAdbo` to behave similarly to `OptimizerMbo` (see the loop function sketch below)
- [ ] If not moving forward with an async loop function, maybe still allow `OptimizerAdbo` to be configured w.r.t. the surrogate and the acquisition function optimizer (see the configuration sketch below).
- [ ] Note that the update logic w.r.t. missing values was dropped in #146 and should be moved to `OptimizerAdbo` or the async loop functions directly.
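A rough sketch only of what such an async loop function could look like, mirroring the structure of the existing `bayesopt_ego()`: the name `bayesopt_async`, the per-worker loop, and the evaluation call on the async instance are assumptions for illustration, not the actual API.

```r
library(mlr3mbo)

bayesopt_async = function(instance, surrogate, acq_function, acq_optimizer) {
  # wire up surrogate, acquisition function, and its optimizer as bayesopt_ego() does
  surrogate$archive = instance$archive
  acq_function$surrogate = surrogate
  acq_optimizer$acq_function = acq_function

  # per-worker loop: refit the surrogate on the shared archive, propose a
  # candidate, evaluate it, and repeat until the instance is terminated
  while (!instance$is_terminated) {
    acq_function$surrogate$update()
    acq_function$update()
    candidate = acq_optimizer$optimize()
    instance$eval_batch(candidate)  # placeholder: the async instance would evaluate on this worker
  }
  invisible(instance)
}
```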
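Independent of the loop-function question, a sketch of how `OptimizerAdbo` could be configured like `OptimizerMbo`, with the surrogate, acquisition function, and acquisition function optimizer passed in generically. The `opt("adbo", ...)` key and its constructor arguments are the proposal here, not an existing interface.

```r
library(bbotk)
library(mlr3)
library(mlr3learners)
library(mlr3mbo)

# proposed interface, analogous to opt("mbo", ...)
optimizer = opt("adbo",
  surrogate     = srlrn(lrn("regr.ranger")),
  acq_function  = acqf("cb", lambda = 2),
  acq_optimizer = acqo(opt("random_search", batch_size = 1000L),
                       terminator = trm("evals", n_evals = 10000L))
)
```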
- the SurrogateLearner code was already slightly underdocumented, and the PR adds even more code here that is not documented at all. I think this is for imputing points which are currently under evaluation. I am not sure we should mix that in; at the very least this needs to be documented and commented.
- the SurrogateLearner is now tightly coupled with rush and with the imputation for async optimization. I think this is not good design. Inherit from SurrogateLearner, create a SurrogateLearnerAsync, and only do the imputation there; don't touch the existing class (see the sketch after this list).
- we have to handle failed states, but MBO should ALREADY cover that? Just use the same mechanism.
- we might slightly change the impute mechanism: we could (a) sample between min and max, or (b) instead of the mean use a fixed "quantile value" (e.g. 0.5 would always use the median). In any case, a reference should be provided (see the sketch after this list).
- make the interface of ADBO more "similar" to OptimizerMbo; the SurrogateLearner should be passable as a generic model (see the configuration sketch above).
- there are quite a few options for eps decay. Is that now also handleable in MBO in general? We should pass in an ACQF (see the decay sketch after this list).
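On the rush coupling: a rough sketch of the suggested split, keeping SurrogateLearner untouched and putting the async-specific imputation into a subclass. That the right hook is a private `.update()` override, and how pending points would be looked up, are assumptions about the internals.

```r
library(R6)
library(mlr3mbo)

SurrogateLearnerAsync = R6Class("SurrogateLearnerAsync",
  inherit = SurrogateLearner,
  private = list(
    .update = function() {
      # sketch: before fitting,
      # 1. fetch finished and still-running points from the (rush-backed) archive
      # 2. impute y for the running points, e.g. with impute_pending_y() below
      # 3. delegate the actual model fit to the parent class
      super$.update()
    }
  )
)
```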
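For the impute mechanism itself, a small self-contained sketch of both variants; the helper names are made up for illustration.

```r
# (b) impute pending points with a fixed quantile of the finished y values
# (probs = 0.5 always uses the median)
impute_pending_y = function(y_finished, n_pending, probs = 0.5) {
  rep(stats::quantile(y_finished, probs = probs, names = FALSE), n_pending)
}

# (a) alternatively, sample uniformly between the observed min and max
impute_pending_y_runif = function(y_finished, n_pending) {
  stats::runif(n_pending, min = min(y_finished), max = max(y_finished))
}

y = c(0.8, 0.3, 0.5, 0.9)
impute_pending_y(y, n_pending = 2L)        # 0.65 0.65
impute_pending_y_runif(y, n_pending = 2L)  # two draws from [0.3, 0.9]
```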
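And for eps decay with a generically passed acquisition function, a sketch that exponentially decays the lambda of a confidence bound acquisition function from the outside; that lambda is exposed via `$constants` and the concrete decay constants are assumptions.

```r
library(mlr3mbo)

acq_function = acqf("cb", lambda = 2)  # generic ACQF passed into the optimizer

lambda_0 = 2    # initial exploration weight
decay    = 0.1  # illustrative exponential decay rate

for (t in seq_len(5)) {
  # assumption: lambda is a constant of AcqFunctionCB and can be overwritten here
  acq_function$constants$values$lambda = lambda_0 * exp(-decay * t)
  # ... update surrogate and acq_function, optimize it, evaluate the candidate ...
}
```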