
Integrating out variables via Laplace approximation (or other methods)

Open ElOceanografo opened this issue 3 years ago • 3 comments

This is an idea/feature request I've been thinking about. It relates to this conversation on Discourse and to https://github.com/TuringLang/Turing.jl/issues/1340; see also this paper. How hard would it be to implement an interface for integrating out variables in a Turing model, using, for example, the Laplace approximation? TMB uses this approach to efficiently fit models with mixed effects or other latent variables/fields, and I think it would be a great addition.

I'm picturing being able to do something like this:

@model function latent_model(x, z)
    a ~ SomeDistribution(1, 2) # parameter
    mu = x * a
    y ~ AnotherDistribution(mu) # latent variable
    z ~ YetAnotherDistribution(y) # likelihood
end

mod = latent_model(xdata, zdata)
mod_marginal = marginalize(mod, LaplaceApprox(:y))
# ...
chn = sample(mod, sampler, nsamples)
opt = optimize(mod_marginal, MLE()) # fit `a` while integrating out `y`

Does this seem in principle desirable/doable? If so I'd be willing to do some work on it, given a couple pointers on where to get started...
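For concreteness, here is a rough sketch of what a `LaplaceApprox` marginalization would have to compute under the hood. This is not an existing Turing API: `logjoint` and `laplace_marginal` are hypothetical names, only Optim.jl and ForwardDiff.jl are assumed, and the toy model is fully Gaussian so the Laplace approximation happens to be exact.

```julia
using Optim, ForwardDiff, LinearAlgebra

# Toy model: y ~ Normal(a, 1), z_i ~ Normal(y, 1).
# Joint log-density log p(z, y | a), dropping additive constants.
logjoint(a, y, z) = -0.5 * (y - a)^2 - 0.5 * sum(abs2, z .- y)

# Laplace approximation to log p(z | a): maximize the joint over the
# latent y, then correct with the curvature at the mode ŷ:
#   log p(z | a) ≈ log p(z, ŷ | a) + (d/2) log 2π − (1/2) logdet H
function laplace_marginal(a, z)
    nlj(yv) = -logjoint(a, yv[1], z)            # negative joint, vector argument
    res = optimize(nlj, [0.0], BFGS(); autodiff = :forward)
    ŷ = Optim.minimizer(res)
    H = ForwardDiff.hessian(nlj, ŷ)             # Hessian of −log p at the mode
    logjoint(a, ŷ[1], z) + 0.5 * length(ŷ) * log(2π) - 0.5 * logdet(H)
end

z = [0.9, 1.1, 1.3]
laplace_marginal(0.0, z)   # approximate (here exact) marginal log-likelihood of a = 0
```

The hard part for a general implementation is doing the inner optimization and Hessian efficiently (sparsity, reuse of the mode across outer iterations), which is essentially what TMB automates.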

ElOceanografo avatar Aug 20 '20 02:08 ElOceanografo

Also related to https://github.com/TuringLang/Turing.jl/issues/976

devmotion avatar Aug 20 '20 06:08 devmotion

And the proposal in https://github.com/TuringLang/DynamicPPL.jl/issues/94

devmotion avatar Aug 20 '20 13:08 devmotion

Alternatively, the interface could look something like this, more along the lines of how Gibbs works for samplers:

scheme = MarginalOptimizer(LBFGS(:a), LaplaceApprox(:y))
opt = optimize(mod, MLE(), scheme)
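To illustrate what such a scheme would compute, here is a self-contained nested-optimization sketch on a toy conjugate model (y ~ Normal(a, 1), z_i ~ Normal(y, 1)), where integrating out y is available in closed form (the Laplace approximation is exact for Gaussians). `logmarg` is a hypothetical helper and only Optim.jl is assumed:

```julia
using Optim

z = [0.9, 1.1, 1.3]; n = length(z)

# Toy model y ~ Normal(a, 1), z_i ~ Normal(y, 1): integrating y out
# gives, up to additive constants,
#   log p(z | a) = (a + Σz)² / 2(n+1) − (a² + Σz²)/2 − log(n+1)/2
logmarg(a) = 0.5 * (a + sum(z))^2 / (n + 1) -
             0.5 * (a^2 + sum(abs2, z)) - 0.5 * log(n + 1)

# Outer optimization: fit a while y stays integrated out,
# i.e. the MLE() step of the proposed MarginalOptimizer scheme.
res = optimize(a -> -logmarg(only(a)), [0.0], BFGS())
â = only(Optim.minimizer(res))   # analytically, â = mean(z)
```

In a real implementation the inner (latent) problem would be solved numerically at every outer step rather than in closed form, which is why caching the latent mode between outer iterations matters.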

ElOceanografo avatar Oct 07 '20 19:10 ElOceanografo

This looks quite hard to do well in general, so it might make sense to implement it in a separate library that depends on DynamicPPL/Turing.

yebai avatar Nov 12 '22 20:11 yebai