Anand

Results: 22 issues of Anand

- [ ] Plot average performance vs iteration for all optimizers (#44)
- [ ] Each optimizer should be run for multiple starting points and average fitness reported
- [...
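A minimal sketch of the averaging described above: run an optimizer from several random starting points, record the fitness at each iteration, and average the curves. The optimizer interface (`optimize(x0, n_iters)` returning one fitness per iteration) and the toy `sphere_gd` optimizer are assumptions for illustration, not the project's actual API.

```python
import numpy as np

def average_fitness_curve(optimize, n_starts=5, n_iters=100, seed=0):
    """Run `optimize` from several random starting points and return
    the per-iteration fitness averaged over all runs."""
    rng = np.random.default_rng(seed)
    curves = []
    for _ in range(n_starts):
        x0 = rng.uniform(-5, 5, size=2)       # random starting point
        curves.append(optimize(x0, n_iters))  # one fitness value per iteration
    return np.mean(curves, axis=0)

# Toy optimizer: gradient descent on the sphere function f(x) = ||x||^2.
def sphere_gd(x0, n_iters, lr=0.1):
    x, fitnesses = np.array(x0, dtype=float), []
    for _ in range(n_iters):
        x -= lr * 2 * x                       # gradient step for ||x||^2
        fitnesses.append(float(np.sum(x ** 2)))
    return fitnesses

avg = average_fitness_curve(sphere_gd, n_starts=5, n_iters=50)
```

`avg[i]` is the mean fitness at iteration `i`; plotting it (e.g. `plt.plot(avg)`) gives the performance-vs-iteration curve per optimizer.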

Implementation [available in scikit-learn](http://scikit-learn.org/stable/modules/generated/sklearn.manifold.TSNE.html)
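For reference, the linked scikit-learn implementation can be used roughly as follows (random input data here is purely illustrative):

```python
import numpy as np
from sklearn.manifold import TSNE

X = np.random.RandomState(0).rand(100, 10)   # 100 samples, 10 features
emb = TSNE(n_components=2, random_state=0).fit_transform(X)
print(emb.shape)  # (100, 2): one 2-D embedding point per sample
```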

enhancement

This is not known in most cases. It should instead be defined as the delta change in fitness at which the optimization is stopped.
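A sketch of the proposed criterion, assuming an optimizer that exposes a single update step: iterate until the change in fitness between consecutive iterations falls below `delta`, rather than requiring the (unknown) optimal fitness up front. The function names are hypothetical.

```python
def minimize_with_delta_stop(step, f, x0, delta=1e-6, max_iters=1000):
    """Iterate x = step(x) until the change in fitness f between
    consecutive iterations drops below `delta`."""
    x, prev = x0, f(x0)
    for i in range(max_iters):
        x = step(x)
        cur = f(x)
        if abs(prev - cur) < delta:   # delta-change stopping criterion
            return x, i + 1
        prev = cur
    return x, max_iters

# Example: halve x each step toward the minimum of f(x) = x**2.
x, iters = minimize_with_delta_stop(lambda x: x / 2, lambda x: x * x, 8.0)
```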

To be consistent with the rest of the optimizers

E.g. strings, or just a list of possible values.

We want the step size to be a hyper-parameter of the optimizee rather than an optimizer hyper-parameter, since each optimizee has different requirements, and may even have different required step...
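One way the design above could look, as a hypothetical sketch (class and attribute names are assumptions, not the project's actual interface): each optimizee declares its own step size(s), and the optimizer only reads them.

```python
# Sketch: the optimizee, not the optimizer, owns the step-size
# hyper-parameter(s), so each problem can declare what it needs.
class Optimizee:
    def __init__(self, step_sizes):
        self.step_sizes = step_sizes  # e.g. one step size per dimension

class QuadraticOptimizee(Optimizee):
    def __init__(self):
        super().__init__(step_sizes=[0.1, 0.01])  # problem-specific choice
    def fitness(self, x):
        return sum(xi * xi for xi in x)

opt = QuadraticOptimizee()
print(opt.step_sizes)  # read by the optimizer, defined by the optimizee
```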

How do we construct such functions? (references)

Should it be a weighted sum (as it is currently) or a lexicographic ordering?
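The difference between the two options can be sketched as follows (assuming minimization; the helper names are hypothetical): a weighted sum collapses all objectives into one scalar, while a lexicographic ordering compares objectives in priority order and only consults the next objective on a tie.

```python
def weighted_score(objs, weights):
    # Weighted sum: one scalar, trade-offs controlled by the weights.
    return sum(w * o for w, o in zip(weights, objs))

def lex_better(a, b, tol=1e-9):
    # Lexicographic: compare objectives in priority order;
    # later objectives only matter when earlier ones tie.
    for oa, ob in zip(a, b):
        if abs(oa - ob) > tol:
            return oa < ob          # first decisive objective wins
    return False                    # equal on all objectives

a, b = (1.0, 9.0), (1.0, 2.0)       # tie on the first objective
print(lex_better(b, a))             # True: decided by the second objective
```

With weights (0.9, 0.1) the weighted sum also prefers `b` here, but unlike the lexicographic rule it would let a large improvement in the second objective outweigh a small loss in the first.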

low-priority