AdvancedMH.jl

Adaptive proposals in Turing composite Gibbs sampler

arzwa opened this issue 3 years ago • 8 comments

Hi there, a question related to this PR

I'm currently facing an application where I would really like to use adaptive proposals like those defined in this PR in a Metropolis-within-Gibbs setting (i.e. we have a parameter vector x, each parameter has its own adaptive univariate proposal, and in each iteration of the MCMC sampler we update each component of the parameter vector conditional on the others using a Metropolis-Hastings step). The Turing way to go would seem to be to use what is implemented in AdvancedMH inside a Turing composite Gibbs sampler, something roughly like Gibbs(AdaptiveMH(:p1), AdaptiveMH(:p2), ...), where p1, p2, ... are the components of the parameter vector. I think this is worthwhile in general for low-dimensional applications where the gradient of the log-likelihood is very costly or unavailable. What would be the best way to proceed to make this possible? Thanks for any hints!
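
To make the scheme concrete, here is a rough sketch of the kind of sweep I mean, in plain Julia. None of this is AdvancedMH API; the function and the adaptation rule are made up purely for illustration:

using Distributions

# One Metropolis-within-Gibbs sweep: each component gets its own Gaussian
# random-walk proposal, and each proposal scale adapts toward a target
# acceptance rate (diminishing adaptation).
function mwg_sweep!(x, logdensity, scales, accepts, iter; target = 0.44)
    for i in eachindex(x)
        x_prop = copy(x)
        x_prop[i] += rand(Normal(0, scales[i]))   # univariate proposal for x[i]
        if log(rand()) < logdensity(x_prop) - logdensity(x)
            x[i] = x_prop[i]                      # accept the component update
            accepts[i] += 1
        end
        scales[i] *= exp((accepts[i] / iter - target) / sqrt(iter))
    end
    return x
end

# e.g. sample a standard bivariate normal:
x, scales, accepts = zeros(2), ones(2), zeros(Int, 2)
for t in 1:1000
    mwg_sweep!(x, v -> -sum(abs2, v) / 2, scales, accepts, t)
end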

arzwa • Dec 08 '20 16:12

I've got a couple of thoughts here:

  1. It's not clear to me that any Gibbs stuff should live in AdvancedMH -- it's kind of a scope overreach. That said, we do have emcee in here, which isn't quite MH but does use an MH step, as many algorithms do. I would imagine that a better solution is AdvancedGibbs or something, which would be Turing-free Gibbs sampling using the AbstractMCMC interface.
  2. Is there a reason you can't use Turing? Do you already have the likelihood function written up (a common case, just asking)?

If we were to support it, I don't think it would be easy, since you'd have to be a little more specific about how all the proposals are generated. The issue is that AdvancedMH supports all kinds of proposal styles, and it doesn't actually know what counts as a "parameter" or have any notion of updating individual parameters. For example, if all you're doing is sampling an n-dimensional vector with a multivariate normal proposal, you know you can just go through each element. It's far less clear how to do this with NamedTuple proposals (for instance), but I suspect there might be an easy recursive solution here.
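
To illustrate what such a component-wise solution might look like, here is a hypothetical sketch (none of these helpers exist in AdvancedMH) of redrawing a single component of a NamedTuple draw while holding the rest fixed:

using Distributions

# Hypothetical helper: redraw one named component from its own proposal
# distribution, keeping all other components of the draw fixed.
function propose_component(draw::NamedTuple, proposals::NamedTuple, name::Symbol)
    proposed = NamedTuple{(name,)}((rand(getproperty(proposals, name)),))
    return merge(draw, proposed)
end

proposals = (a = Normal(0, 1), b = Gamma(2, 3))
draw = (a = 0.5, b = 1.2)
draw_a = propose_component(draw, proposals, :a)   # only :a is redrawn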

cpfiffer • Dec 08 '20 16:12

Hmm, either I don't understand, or you misunderstood my question. I am interested precisely in using Turing and the Gibbs sampler implemented there. My question is rather about what I would need to implement to be able to use the MH sampler with an adaptive proposal as a Gibbs component in a Turing model.

arzwa • Dec 08 '20 16:12

OK, I had some trouble finding the right docs in Turing. It seems that once that PR is merged, it should be possible to do something like the following, right?

using Turing, AdvancedMH

sampler = Gibbs(
    MH(:v1 => AdvancedMH.AdaptiveProposal()),
    MH(:v2 => AdvancedMH.AdaptiveProposal()))
chain = sample(model, sampler, 1000)  # model: a Turing model with parameters v1 and v2

which is what I have in mind, hope that makes it clearer.

arzwa • Dec 08 '20 17:12

Ah, I understand, sorry.

There'll need to be some changes on Turing's side before that will work for variables in a constrained space (i.e. with support narrower than -Inf to Inf), but it should function reasonably well right off the bat if v1 and v2 are unconstrained and continuous.
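
For example, assuming the AdaptiveProposal interface from that PR, something along these lines should work out of the box, since Normal priors put v1 and v2 on the whole real line (the model itself is just a made-up illustration):

using Turing, AdvancedMH

@model function demo(y)
    v1 ~ Normal(0, 1)        # unconstrained and continuous
    v2 ~ Normal(0, 1)        # unconstrained and continuous
    for i in eachindex(y)
        y[i] ~ Normal(v1, exp(v2))
    end
end

sampler = Gibbs(
    MH(:v1 => AdvancedMH.AdaptiveProposal()),
    MH(:v2 => AdvancedMH.AdaptiveProposal()))
chain = sample(demo(randn(20)), sampler, 1000)

A prior like v1 ~ InverseGamma(2, 3), by contrast, lives on (0, Inf), and that is the case that needs the link/invlink changes first.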

cpfiffer • Dec 09 '20 03:12

What is the status of adaptive Metropolis proposals? Happy to help if needed.

fipelle • Oct 25 '22 13:10

As far as I am aware, there isn't any active development. Would love a push to get it in! There are two PRs for adaptive methods (#39 and #57). Both are really great but just need someone to revisit them and wrap them up.

cpfiffer • Oct 25 '22 16:10

@cpfiffer: thanks for the reply! I will take a look. I am working on an ABC implementation for Turing and it would be handy to have this.

fipelle • Oct 25 '22 17:10

Awesome! Let me know if you need any pointers.

cpfiffer • Oct 25 '22 18:10