
Add Model wrapper that uses Gaussian processes to evaluate

Open · MichaelClerx opened this issue 7 years ago · 8 comments

Possibly based on GPy or GPflow

MichaelClerx · Oct 05 '17 13:10

I can see three options here:

  1. Use GPflow or GPy (I'm leaning towards GPy, since it doesn't have tensorflow as a dependency). This is the easiest option, but adds another dependency. All that would probably be required is an example showing how to use GPy/GPflow to emulate a pints model (a rough sketch of what this could look like is given below, after this list).
  2. Write code to implement a GP model in pints. @sanmitraghosh and I are planning to do this for a paper, but the code would be (partially) in C++, so it would require adding C++ compilation functionality back into pints.
  3. Same as 2, but split this code into a separate repository in the pints organisation, which the main pints code can then add as a dependency. Developing both code bases in parallel might be a pain, though.
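To make option 1 concrete, here is a minimal sketch of how a GPy regression model could be wrapped as a `pints.LogPDF` emulator. The class name, kernel choice and constructor arguments are illustrative assumptions, not a proposed API:

```python
import numpy as np
import GPy
import pints


class GPyLogPDFEmulator(pints.LogPDF):
    """Illustrative wrapper: emulates an expensive pints.LogPDF with a GPy
    Gaussian process trained on pre-computed (parameters, log-pdf) points."""

    def __init__(self, design_x, design_f):
        design_x = np.asarray(design_x, dtype=float)    # (n_points, n_parameters)
        design_f = np.asarray(design_f, dtype=float).reshape(-1, 1)
        self._n_parameters = design_x.shape[1]
        kernel = GPy.kern.Matern32(input_dim=self._n_parameters)
        self._gp = GPy.models.GPRegression(design_x, design_f, kernel)
        self._gp.optimize()  # maximum-likelihood fit of the GP hyperparameters

    def n_parameters(self):
        return self._n_parameters

    def __call__(self, x):
        # Return the GP posterior mean as the emulated log-pdf value
        mean, _ = self._gp.predict(np.asarray(x, dtype=float).reshape(1, -1))
        return float(mean)
```

The emulator could then be handed to a pints sampler or optimiser in place of the original (expensive) log-pdf.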

I'm leaning towards (1) for the short term, moving towards (2) in the longer term if the paper with @sanmitraghosh shows good results for our approach. Any other thoughts?

martinjrobins · Jan 09 '18 10:01

I've got some good results using Aboria to solve for the GP, so I'm going to experiment with fitting a GP using this.

martinjrobins · Mar 19 '19 04:03

@martinjrobins Great news!

sanmitraghosh · Mar 19 '19 11:03

I'll let you know how it goes. I'm going to stick to the Matérn 3/2 kernel for the moment, as that seems to be the easiest to solve (the Gaussian kernel I haven't managed to do efficiently for dimensions higher than 3).
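For reference, the Matérn 3/2 covariance mentioned above, written as a plain-numpy sketch (the function name and default hyperparameters are just for illustration):

```python
import numpy as np

def matern32(r, lengthscale=1.0, variance=1.0):
    """Matern 3/2 covariance as a function of the distance r between points."""
    s = np.sqrt(3.0) * np.abs(r) / lengthscale
    return variance * (1.0 + s) * np.exp(-s)
```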

martinjrobins · Mar 19 '19 11:03

I think you need not worry about the Gaussian kernel; the Matérn kernel is much more useful in practical applications.

sanmitraghosh · Mar 19 '19 12:03

A couple of implementation points. I'm planning to:

  1. implement the Adam stochastic gradient descent algorithm in pints, to use for fitting the GP (might be related to #764); a rough sketch follows this list
  2. derive the GP itself from pints.LogPDF. It would be nice to use it to fit a posterior from MCMC samples, then use that as a prior for subsequent inference :)
  3. for the GP fitting (using Adam), I don't want to evaluate the function to be minimised, only its sensitivities, so I'm implementing Adam so that the function itself never needs to be evaluated. This might be a bit unusual in the pints framework (e.g. I'm using abs(sensitivities) instead of fbest)
  4. also for the GP fitting, the gradients of the function are noise-corrupted, so the stopping criterion shouldn't be based on the minimum abs(gradient) achieved, but on whether the changes in the hyper-parameters have converged
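Here is a rough sketch of what points 1, 3 and 4 could look like together, in plain numpy (the function name, defaults and stopping tolerance are illustrative, not the eventual pints API):

```python
import numpy as np

def adam_minimise(gradient, x0, alpha=1e-3, beta1=0.9, beta2=0.999,
                  eps=1e-8, tol=1e-6, max_iter=10000):
    """Adam that only ever calls the (possibly noisy) gradient, never the
    objective itself, and stops when the parameter updates have converged."""
    x = np.array(x0, dtype=float)
    m = np.zeros_like(x)  # first-moment (mean) estimate of the gradient
    v = np.zeros_like(x)  # second-moment (uncentred variance) estimate
    for t in range(1, max_iter + 1):
        g = np.asarray(gradient(x), dtype=float)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g ** 2
        m_hat = m / (1 - beta1 ** t)   # bias-corrected moment estimates
        v_hat = v / (1 - beta2 ** t)
        step = alpha * m_hat / (np.sqrt(v_hat) + eps)
        x -= step
        # Stop on the size of the parameter update, not on abs(gradient),
        # since the gradients are noise-corrupted
        if np.max(np.abs(step)) < tol:
            break
    return x
```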

martinjrobins · Mar 20 '19 05:03

Hey @martinjrobins, what's the status of this? Is it still within scope for PINTS?

MichaelClerx · Mar 31 '20 16:03

@martinjrobins can we close this? Seems like it would be a project using PINTS, rather than a part of PINTS? Could even be something that lives in another pints-team repo?

MichaelClerx · Oct 31 '23 10:10