
Neuroevolution with Elite Selection

marvin-hansen opened this issue on May 13, 2019 · 2 comments

Bayesian optimization can quickly become computationally expensive; just increase the population, the network size, or both. On the other hand, a lot of the agent optimization can be done within the evolutionary strategy itself, so why not generate a population, select the "elite" agent (the one with the highest score), and use it as a blueprint to explore the parameter space further through mutation?

Example:

https://github.com/paraschopra/deepneuroevolution/blob/master/openai-gym-cartpole-neuroevolution.ipynb

The example above converges very quickly to a global optimum (in about 50 generations), but more importantly, it requires far less computational power and thus allows very fast model re-generation.
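
For reference, a minimal sketch of the elite-selection loop (the fitness function, mutation scale, and parameter count here are placeholders, not the notebook's actual code):

```python
import numpy as np

def evaluate(params):
    # Placeholder fitness: replace with the trading agent's actual score
    # (e.g. return on a validation window).
    return -np.sum(params ** 2)

population_size = 50
n_generations = 50
sigma = 0.1        # mutation scale (assumption)
n_params = 100     # flattened network weight count (assumption)

# Start from a random elite and refine it by mutation, generation after generation.
elite = np.random.randn(n_params)
for generation in range(n_generations):
    # Spawn the population as mutated copies of the current elite.
    population = [elite + sigma * np.random.randn(n_params)
                  for _ in range(population_size)]
    population.append(elite)  # keep the elite itself so fitness never regresses
    scores = [evaluate(p) for p in population]
    elite = population[int(np.argmax(scores))]
    print(f"generation {generation}: best score {max(scores):.4f}")
```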

marvin-hansen commented on May 13, 2019

This is really good, I will implement it.

huseinzol05 commented on May 14, 2019

Thanks for the time & effort to dig deeper into this!

BTW, the Optuna optimizer seems to be faster and easier to use:

https://optuna.org/#key_features
https://github.com/pfnet/optuna/tree/master/examples
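
For reference, a minimal Optuna sketch (the objective, parameter names, and ranges are just placeholders, not taken from this repo):

```python
import optuna

def train_and_score(learning_rate, layer_size):
    # Placeholder: return the agent's validation score for these settings.
    return -(learning_rate - 0.01) ** 2 - (layer_size - 128) ** 2 / 1e4

def objective(trial):
    # Hypothetical hyperparameters to tune for the trading agent.
    learning_rate = trial.suggest_float("learning_rate", 1e-4, 1e-1, log=True)
    layer_size = trial.suggest_int("layer_size", 32, 512)
    return train_and_score(learning_rate, layer_size)

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=50)
print(study.best_params, study.best_value)
```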

marvin-hansen commented on May 14, 2019