Michael Clerx
Perfect! I'm thinking `method_hyper_parameters` could eventually be a dict rather than an array, but that's a [separate issue](#1320)
Related, for optimisers:
- https://www.sciencedirect.com/science/article/abs/pii/S156849462030675X
- https://numbbo.github.io/coco/
From a quick glance it seems the main difference is that single-output problems return data of the shape ``(n_times,)``, rather than ``(n_times, n_outputs)``. For sensitivities it's ``(n_times, n_parameters)`` instead...
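A minimal sketch of how those two shapes could be normalised before passing data on to shape-agnostic code (the helper name and its signature are assumptions for illustration, not part of any library's API):

```python
import numpy as np


def as_2d_output(values, n_outputs):
    """Normalise simulated values to shape ``(n_times, n_outputs)``.

    Single-output problems may return an array of shape ``(n_times,)``;
    multi-output problems return ``(n_times, n_outputs)``. This
    hypothetical helper makes both cases look the same downstream.
    """
    values = np.asarray(values)
    if values.ndim == 1:
        # Treat a 1-d array as a single output column
        values = values.reshape(-1, 1)
    if values.shape[1] != n_outputs:
        raise ValueError(
            f'Expected {n_outputs} outputs, got shape {values.shape}')
    return values
```

The same trick would apply to sensitivities, reshaping ``(n_times, n_parameters)`` to ``(n_times, 1, n_parameters)`` for the single-output case.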
Great! There's a bunch more optimisation methods if you allow derivatives as well, but I don't think we should aim for completeness (just usefulness for time-series problems!)
Not really! There's meant to be a good book involving kangaroo metaphors somewhere, but I've never found out who the author is :D I've got 2 books in my office...
I've added a few methods here: https://en.wikipedia.org/wiki/Derivative-free_optimization because a few years ago this page was just 5 notices about "stubs" :-)
https://en.wikipedia.org/wiki/Mathematical_optimization#Heuristics Palpable disdain :-p
This is another big name: https://en.wikipedia.org/wiki/Powell%27s_method And here's his posthumous software page: http://mat.uc.pt/~zhang/software.html#powell_software
In my experience so far, most mathematically sound methods fail on the kind of problems we're working on, unless you start very close to the solution. One strategy then is...
@ben18785 should this still be a ticket? Or shall we save this idea for future student projects, papers, etc?