Bayesian Optimization - Research
Sub-issue of https://github.com/h2oai/h2o-3/issues/6516
This issue tracks papers and frameworks we might want to research (in no particular order).
Papers
- [ ] A Tutorial on Bayesian Optimization (https://arxiv.org/abs/1807.02811) [Seb]
- [ ] Cornell University wiki (https://optimization.cbe.cornell.edu/index.php?title=Bayesian_Optimization)
- [ ] Practical Bayesian Optimization of Machine Learning Algorithms (https://proceedings.neurips.cc/paper/2012/file/05311655a15b75fab86956663e1819cd-Paper.pdf)
- [ ] Bayesian Optimization Book (https://bayesoptbook.com/) [Seb]
- [ ] Gaussian Processes (not BO itself, but commonly used as the surrogate model in BO; see the sketch after this list) (https://gaussianprocess.org/gpml/chapters/RW.pdf)
- [ ] A Comparative study of Hyper-Parameter Optimization Tools (https://arxiv.org/pdf/2201.06433.pdf) [Seb]
- [ ] RoBO (https://ml.informatik.uni-freiburg.de/wp-content/uploads/papers/17-BayesOpt-RoBO.pdf)
- [ ] Fast Bayesian Optimization of Machine Learning Hyperparameters on Large Datasets (https://arxiv.org/abs/1605.07079)
- [ ] Recent Advances in Bayesian Optimization (https://arxiv.org/abs/2206.03301) [Seb]
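For context while reading, here is a minimal sketch of the GP-based BO loop most of the papers above describe: fit a Gaussian Process surrogate to the evaluations so far, maximize an Expected Improvement acquisition, evaluate the objective there, repeat. The library choice (scikit-learn's `GaussianProcessRegressor`), the toy objective, and the grid search over the acquisition are illustrative assumptions, not anything prescribed by the papers.

```python
# Sketch of a GP + Expected Improvement BO loop (minimization).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def objective(x):
    """Toy 1-D black-box objective to minimize (assumption for illustration)."""
    return np.sin(3 * x) + 0.1 * x ** 2

# Initial design: a few random evaluations.
X = rng.uniform(-3, 3, size=(4, 1))
y = objective(X).ravel()

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(15):
    gp.fit(X, y)
    # Candidate grid over the search space (a cheap stand-in for
    # properly optimizing the acquisition function).
    cand = np.linspace(-3, 3, 500).reshape(-1, 1)
    mu, sigma = gp.predict(cand, return_std=True)
    best = y.min()
    # Expected Improvement for minimization.
    with np.errstate(divide="ignore", invalid="ignore"):
        z = (best - mu) / sigma
        ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)
        ei[sigma == 0.0] = 0.0
    # Evaluate the objective at the acquisition maximizer.
    x_next = cand[np.argmax(ei)].reshape(1, -1)
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next).ravel())

print(f"best x = {X[np.argmin(y)].item():.3f}, best f = {y.min():.3f}")
```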
Frameworks
- [ ] BoTorch (http://arxiv.org/abs/1910.06403; https://ax.dev/ ; https://botorch.org/)
- [ ] NeverGrad (https://facebookresearch.github.io/nevergrad/)
- [ ] HyperOpt (https://hyperopt.github.io/hyperopt/) [Seb]
- [ ] Optuna (https://optuna.org/) [Seb]
- [ ] SkOpt (https://scikit-optimize.github.io/stable/) [Seb] (see the usage sketch after this list)
- [ ] GPyOpt (https://sheffieldml.github.io/GPyOpt/)
- [ ] "Bayesian Optimization" library (https://github.com/bayesian-optimization/BayesianOptimization)
- [ ] Ray Tune (https://docs.ray.io/en/latest/tune/index.html)
- [ ] SMAC (https://github.com/automl/SMAC3) [Seb]
- [ ] pyGPGO (https://github.com/josejimenezluna/pyGPGO)
- [ ] Implementation in Pyro (PyTorch-based probabilistic programming language) (https://pyro.ai/examples/bo.html)
- [ ] RoBO (https://github.com/automl/RoBO)
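To give a feel for the API surface we would be comparing, here is a minimal usage sketch with SkOpt's `gp_minimize` (the library's documented entry point); the objective and search space are assumptions picked for illustration.

```python
# Minimal SkOpt usage: BO over a 1-D continuous search space.
from skopt import gp_minimize

def objective(params):
    # Toy quadratic with its minimum at x = 1 (assumption for illustration).
    x = params[0]
    return (x - 1.0) ** 2

result = gp_minimize(
    objective,            # black-box function to minimize
    [(-3.0, 3.0)],        # search space: one continuous dimension
    n_calls=20,           # total number of objective evaluations
    random_state=0,
)
print(result.x, result.fun)  # best point found and best observed value
```

Most of the other frameworks listed above expose a similar loop (define an objective, declare a search space, ask for N evaluations), which should make a side-by-side comparison fairly mechanical.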
You forgot to add your thesis to the list.
@wendycwong My thesis contains a short summary of some of the papers. I don't think it would be useful as a primary source, but it can be useful for getting a general idea about BO.
It should be available here. You can skip straight to section 1.5, Hyperparameter optimization (page 23, the 39th page of the PDF).
@tomasfryda, @wendycwong I added my name next to the articles and frameworks I plan to read or look at first. Feel free to do the same: it's just for information, multiple people can look at the same paper/framework, and we can later divide up the remaining ones if they're worth investigating.