
Add TPE Optimizer

Open JonnoFTW opened this issue 5 years ago • 9 comments

There is a similar library called hyperopt, whose main optimizer is TPE (Tree-structured Parzen Estimator) as described by Bergstra et al. It also provides a fairly comprehensive set of distributions to optimize over that might serve as inspiration.

Would it be possible to get that optimizer implemented here? I've used it with Keras through another library, with very good results.
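For anyone unfamiliar with the algorithm, here is a minimal numeric sketch of the TPE idea for a single continuous hyperparameter (my own simplification, not hyperopt's implementation): split the observed trials into a "good" and "bad" group at the gamma-quantile of the losses, fit a Parzen (kernel density) estimator to each group, and propose the candidate that maximizes the density ratio l(x)/g(x).

```python
import numpy as np

rng = np.random.default_rng(0)

def tpe_suggest(xs, ys, gamma=0.25, n_candidates=100, bw=0.1):
    """One TPE step over a 1-D continuous hyperparameter.

    xs: observed hyperparameter values, ys: corresponding losses.
    The bandwidth and candidate count are illustrative choices, not
    the adaptive schemes hyperopt uses.
    """
    xs, ys = np.asarray(xs, float), np.asarray(ys, float)
    cut = np.quantile(ys, gamma)
    good, bad = xs[ys <= cut], xs[ys > cut]

    def parzen_pdf(centers, points):
        # Mixture of Gaussians: one fixed-bandwidth kernel per observation.
        d = points[:, None] - centers[None, :]
        return np.exp(-0.5 * (d / bw) ** 2).mean(axis=1) / (bw * np.sqrt(2.0 * np.pi))

    # Sample candidates from the "good" density, score them by l(x)/g(x).
    cands = rng.choice(good, n_candidates) + rng.normal(0.0, bw, n_candidates)
    score = parzen_pdf(good, cands) / (parzen_pdf(bad, cands) + 1e-12)
    return float(cands[np.argmax(score)])
```

Iterating suggest → evaluate → append concentrates the samples where the loss has been low, which is the whole trick.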

JonnoFTW avatar Jun 12 '19 04:06 JonnoFTW

I'm trying to understand how this is implemented. Is there a better implementation than this one: https://github.com/hyperopt/hyperopt/blob/master/hyperopt/tpe.py

grahamannett avatar Jun 13 '19 04:06 grahamannett

@grahamannett unfortunately, I can't find another implementation. Perhaps the paper might be more informative.

JonnoFTW avatar Jun 13 '19 05:06 JonnoFTW

This might help:

http://neupy.com/2016/12/17/hyperparameter_optimization_for_neural_networks.html

bapalto avatar Jun 22 '19 08:06 bapalto

@JonnoFTW We don't currently have plans to implement this algorithm here, but it's something we are considering for the future.

If anyone is interested in implementing this algorithm as a subclass of the Oracle class and can provide some benchmarks to show it beating the existing algorithms for some subset of NN problems, please open a PR!

Marking as "contributions welcome"
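For anyone picking this up, the shape of such a subclass might look roughly like the sketch below. The `Oracle` base here is a stand-in so the snippet runs on its own (the real one is keras-tuner's `Oracle` class); the single `lr` hyperparameter, the bookkeeping, and the return format are illustrative assumptions, not the exact keras-tuner contract.

```python
import random

class Oracle:
    """Stand-in for keras-tuner's Oracle base class, so this sketch is
    self-contained; the real class also handles trial state, saving, etc."""
    def __init__(self, objective):
        self.objective = objective
        self.finished = []  # list of (values_dict, score) pairs

class TPEOracle(Oracle):
    """Rough shape of a TPE oracle over one log-uniform 'lr' hyperparameter."""
    def __init__(self, objective, gamma=0.25, min_trials=10):
        super().__init__(objective)
        self.gamma = gamma          # fraction of trials treated as "good"
        self.min_trials = min_trials

    def populate_space(self, trial_id):
        if len(self.finished) < self.min_trials:
            # Too little data for density estimates: fall back to random search.
            values = {"lr": 10 ** random.uniform(-4, -1)}
        else:
            # TPE step: keep the best gamma-fraction of finished trials and
            # sample near one of those "good" observations (in log space).
            ranked = sorted(self.finished, key=lambda t: t[1])
            good = ranked[: max(1, int(self.gamma * len(ranked)))]
            base = random.choice(good)[0]["lr"]
            values = {"lr": base * 10 ** random.gauss(0.0, 0.1)}
        return {"status": "RUNNING", "values": values}
```

A real implementation would of course build proper Parzen estimators per hyperparameter and handle the full space, but the random-fallback-then-model structure above is the common skeleton.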

omalleyt12 avatar Oct 10 '19 00:10 omalleyt12

I might look at implementing this over the weekend. No promises or anything, but I'm definitely intrigued and have my ears perked up at attention (figuratively, obviously 😅). If anyone is interested in a team attempt then I am more than down and love working with people, so please email me or reply here or whatever. Ciao! (Literally never used that as a closer before)

tarasivashchuk avatar Nov 01 '19 16:11 tarasivashchuk

> @JonnoFTW We don't currently have plans to implement this algorithm here but it's something we are considering for the future
>
> If anyone is interested in implementing this algorithm as a subclass of the Oracle class and can provide some benchmarks to show it beating the existing algorithms for some subset of NN problems, please open a PR!
>
> Marking as "contributions welcome"

@omalleyt12 Would you suggest I wait until your team finalizes the Oracle class for 1.0, or will the changes not be major and you think I am fine to implement it now?

Thank you!

tarasivashchuk avatar Nov 01 '19 16:11 tarasivashchuk

@tarasivashchuk Great! We just released v1, and the Oracle class is now finalized and stable for subclassing.

Note that to have this algorithm merged into this repo you should provide a few real-life examples of where the TPE Oracle converges significantly faster on a good solution than the existing Oracles (for TPE, showing cases where it beats the BayesianOptimizationOracle should be good enough).

To that end, it's worth taking a look at how HyperParameters.conditional_scope works, because I suspect TPE will perform best against our existing optimizers in places where the hyperparameter space has a lot of conditional hyperparameters (i.e. hyperparameters that are known to only be relevant when another hyperparameter has a certain value).
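To make the conditional-space point concrete, here is an illustrative stand-in that mimics the semantics of `HyperParameters.conditional_scope` (this is not the real keras-tuner class; it just shows the behavior): a child hyperparameter is only "active" when its parent currently takes one of the listed values, so an oracle like TPE can skip inactive dimensions entirely.

```python
from contextlib import contextmanager

class HyperParameters:
    """Toy model of conditional hyperparameter scoping."""
    def __init__(self):
        self.values = {}   # name -> current value (active params only)
        self.active = {}   # name -> whether the param applies right now
        self._conditions = []

    @contextmanager
    def conditional_scope(self, parent, parent_values):
        # Params registered inside this block only apply when the
        # parent hyperparameter takes one of parent_values.
        self._conditions.append((parent, parent_values))
        try:
            yield
        finally:
            self._conditions.pop()

    def _register(self, name, value):
        ok = all(self.values.get(p) in vals for p, vals in self._conditions)
        self.active[name] = ok
        if ok:
            self.values[name] = value
        return value if ok else None

    def Choice(self, name, options):
        return self._register(name, options[0])  # default: first option

    def Int(self, name, lo, hi, step=1):
        return self._register(name, lo)          # default: lower bound

hp = HyperParameters()
hp.Choice("model_type", ["cnn", "mlp"])      # defaults to "cnn"
with hp.conditional_scope("model_type", ["cnn"]):
    hp.Int("filters", 32, 256, step=32)      # active: parent is "cnn"
with hp.conditional_scope("model_type", ["mlp"]):
    hp.Int("units", 64, 512, step=64)        # inactive: parent is "cnn"
```

This tree structure is exactly what the "Tree-structured" in TPE refers to: its densities are factored along these parent/child branches, whereas a plain Gaussian-process model has to treat the inactive dimensions as if they mattered.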

Feel free to reach out here with any questions or if you get stuck, happy to help 😄

omalleyt12 avatar Nov 01 '19 17:11 omalleyt12

@tarasivashchuk I am interested in participating! Is there any progress so far?

zhuyizheng avatar Nov 14 '19 07:11 zhuyizheng

I am really surprised that Keras Tuner does not have TPE optimization yet. I don't mean to be rude, but just search for the keyword on Google Scholar.

statcom avatar May 15 '20 00:05 statcom