Add Model wrapper that uses Gaussian processes to evaluate
Possibly based on GPy or GPflow
I can see three options here:
1. Use GPflow or GPy (I'm leaning towards GPy, since it doesn't have TensorFlow as a dependency). This is the easiest option, but adds another dependency. Probably all that would be required is an example showing how to use GPy/GPflow to emulate a pints model.
2. Write code to implement a GP model in pints. @sanmitraghosh and I are planning to do this for a paper, but the code would be (partially) in C++, so it would require adding C++ compilation functionality back into pints.
3. Same as (2), but split this code into a separate repository in the pints organisation, which the main pints code can then add as a dependency. This might make it a pain to develop both code-bases in parallel.
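To make option (1) concrete, here is a rough numpy-only sketch of what a GP emulator of an expensive log-posterior would do. The toy `log_posterior` target, the random design points, and the hand-rolled squared-exponential kernel are all illustrative assumptions; in practice GPy's `GPRegression` (or GPflow's `GPR`) would replace the linear algebra below, with the pints model supplying the training evaluations.

```python
import numpy as np

# Toy stand-in for an expensive pints log-posterior (hypothetical).
def log_posterior(theta):
    return -0.5 * np.sum(theta ** 2)

# Training data: evaluate the expensive model at a few design points.
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(30, 2))
y = np.array([log_posterior(x) for x in X])

# Squared-exponential kernel (GPy would supply this via GPy.kern.RBF).
def kernel(A, B, lengthscale=1.0, variance=1.0):
    d2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

# GP posterior mean at new points (zero prior mean, small jitter for stability).
K = kernel(X, X) + 1e-6 * np.eye(len(X))
alpha = np.linalg.solve(K, y)

def emulate(theta):
    """Cheap surrogate for log_posterior."""
    return kernel(np.atleast_2d(theta), X) @ alpha

print(emulate(np.array([0.5, -0.5])))
```

Once trained, `emulate` can be called in place of the expensive model inside a sampler or optimiser.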
I'm leaning towards (1) for the short term, moving towards (2) in the longer term if the paper with @sanmitraghosh shows good results for our approach. Any other thoughts?
I've got some good results at using Aboria to solve for the GP, so I'm going to experiment with fitting a GP using this.
@martinjrobins Great news!
I'll let you know how it goes. I'm going to stick to the Matérn 3/2 kernel for the moment, as that seems to be the easiest to solve (I haven't managed to solve the Gaussian kernel efficiently for dimensions higher than 3).
I don't think you need to worry about the Gaussian kernel; the Matérn kernel is much more useful in practical applications.
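For reference, the Matérn 3/2 covariance discussed above can be written as a plain numpy function of the distance r (the default lengthscale and variance here are illustrative, not values from the discussion):

```python
import numpy as np

def matern32(r, lengthscale=1.0, variance=1.0):
    """Matern 3/2 covariance as a function of distance r:
    k(r) = variance * (1 + sqrt(3) r / l) * exp(-sqrt(3) r / l)."""
    s = np.sqrt(3.0) * np.abs(r) / lengthscale
    return variance * (1.0 + s) * np.exp(-s)

print(matern32(np.array([0.0, 0.5, 1.0])))
```

Unlike the Gaussian (squared-exponential) kernel, this gives sample paths that are only once differentiable, which is often a better match for real data.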
A couple of implementation points; I'm planning to:
- implement the Adam stochastic gradient descent algorithm in pints, to use for fitting the GP (might be related to #764)
- derive the GP itself from pints.log_pdf. It would be nice to use it to fit a posterior from MCMC samples, then use that as a prior for subsequent inference :)
- for the GP fitting (using Adam), I don't want to evaluate the function to be minimised, only its sensitivities, so I'm implementing Adam so that it doesn't need to evaluate the function. This might be a bit unusual in the pints framework (e.g. I'm using abs(sensitivities) instead of fbest).
- also for the GP fitting, the gradients of the function are noise-corrupted, so the stopping criterion shouldn't be based on the minimum abs(gradient) achieved, but on when the change in hyper-parameters has converged
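A minimal sketch of the last two points, assuming the standard Adam update rule: the optimiser only ever calls a gradient function (never the objective), and it stops when the parameter update, rather than the gradient norm, falls below a tolerance. The learning rate, tolerance, and the noisy toy gradient are illustrative, not the actual pints implementation.

```python
import numpy as np

def adam(grad, theta0, lr=0.05, beta1=0.9, beta2=0.999, eps=1e-8,
         tol=1e-6, max_iter=10000):
    """Adam that only needs gradients; stops when the parameter
    update (not the gradient) falls below tol."""
    theta = np.asarray(theta0, dtype=float)
    m = np.zeros_like(theta)  # first-moment estimate
    v = np.zeros_like(theta)  # second-moment estimate
    for t in range(1, max_iter + 1):
        g = grad(theta)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g ** 2
        mhat = m / (1 - beta1 ** t)   # bias correction
        vhat = v / (1 - beta2 ** t)
        step = lr * mhat / (np.sqrt(vhat) + eps)
        theta = theta - step
        # Convergence test on the change in parameters, which is robust
        # to noise-corrupted gradients that never reach zero.
        if np.max(np.abs(step)) < tol:
            break
    return theta

# Noisy gradient of f(theta) = |theta|^2 / 2 (gradient = theta + noise).
rng = np.random.default_rng(1)
noisy_grad = lambda th: th + 0.01 * rng.standard_normal(th.shape)
print(adam(noisy_grad, np.array([3.0, -2.0])))
```

Because Adam normalises each step by the running gradient scale, the step size itself is a natural convergence quantity even when the raw gradients are noisy.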
Hey @martinjrobins what's the status of this? Is it still within scope for PINTS?
@martinjrobins can we close this? It seems like it would be a project using PINTS, rather than a part of PINTS. It could even be something that lives in another pints-team repo?