Visualising the relationships between samplers and optimisers
As I have banged on about, our job in Pints is partly educational. I created this D3 visualisation of our wish list of samplers and optimisers, which, to me at least, helped clarify things a bit (I know this information is here too).
Let me know what you think of this. After a bit of work, it'd be good to try to include a link to this on the Pints homepage (and possibly a static, less comprehensive representation on the homepage itself).
One thing that it does make clear is the work ahead of us!
Great! There's a bunch more optimisation methods if you allow derivatives as well, but I don't think we should aim for completeness (just usefulness for time-series problems!)
Do you have a reference for those optimisation methods? I’m trying to get my head around them all...
Or a book recommendation for optimisation in general?
Not really! There's meant to be a good book involving kangaroo metaphors somewhere, but I've never found out who the author is :D
I've got two books in my office you could have a look at if you like. One is a classic old-school derivative-free book by Brent (a pretty famous name in optimisation). The other is an introduction to derivative-free methods, but doesn't really touch on anything beyond Nelder-Mead and trust-region reflective methods. Both books are by authors who like having convergence proofs and such, and so don't mention evolutionary methods.
I've added a few methods here: https://en.wikipedia.org/wiki/Derivative-free_optimization because a few years ago this page was just 5 notices about "stubs" :-)
https://en.wikipedia.org/wiki/Mathematical_optimization#Heuristics
Palpable disdain :-p
This is another big name: https://en.wikipedia.org/wiki/Powell%27s_method And here's his posthumous software page: http://mat.uc.pt/~zhang/software.html#powell_software
In my experience so far, most mathematically sound methods fail on the kind of problems we're working on unless you start very close to the solution. One strategy is to use a 'dirty' optimiser like PSO for a global optimisation, and then refine using e.g. Nelder-Mead. However, CMA-ES (and the other ES methods) does a great job of both global and local optimisation, so we don't really need to add these traditional optimisers unless we want to do a major comparison.
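In case it helps make the two-stage idea concrete, here's a minimal sketch (not Pints code, and not one of our time-series problems): a 'dirty' global pass, using SciPy's differential evolution as a stand-in for PSO since SciPy doesn't ship a PSO routine, followed by local Nelder-Mead refinement on a toy Rosenbrock error surface.

```python
import numpy as np
from scipy.optimize import differential_evolution, minimize

# Toy error surface (Rosenbrock): awkward enough that a purely local
# method started far from the optimum can struggle.
def error(x):
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

bounds = [(-5, 5), (-5, 5)]

# Stage 1: a 'dirty' global search (differential evolution here;
# PSO would play the same role).
global_result = differential_evolution(error, bounds, seed=1)

# Stage 2: refine the global result with a local, derivative-free method.
local_result = minimize(error, global_result.x, method='Nelder-Mead')

print('Global stage:', global_result.x, global_result.fun)
print('After Nelder-Mead refinement:', local_result.x, local_result.fun)
```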
Cheers for all of these -- just going through the references now!
@ben18785 should this still be a ticket? Or shall we save this idea for future student projects, papers, etc?