edward
importance sampling and sequential monte carlo
Hi @dustinvtran, I've been going through the codebase to figure out how to best implement SMC, and was looking for a little validation. Would subclassing from MonteCarlo and drawing inspiration from the Metropolis-Hastings implementation be the right way to go about it?
Yep, I think that makes sense. The simplest version to get started would follow Metropolis-Hastings' `build_update()`, but without the Metropolis correction. This would implement vanilla importance sampling.
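To make the "vanilla importance sampling" idea concrete, here is a minimal self-contained sketch of the underlying computation (plain NumPy, not Edward's `MonteCarlo` API): draw samples from a proposal, weight them by the likelihood ratio, and self-normalize. The function and variable names are my own, chosen for illustration.

```python
import numpy as np

def importance_sampling(log_p, log_q, sample_q, f, n=100_000, seed=0):
    """Estimate E_p[f(x)] via self-normalized importance sampling."""
    rng = np.random.default_rng(seed)
    x = sample_q(rng, n)                 # draw n samples from the proposal q
    log_w = log_p(x) - log_q(x)          # log importance weights (likelihood ratio)
    w = np.exp(log_w - log_w.max())      # subtract max for numerical stability
    w /= w.sum()                         # self-normalize so weights sum to 1
    return np.sum(w * f(x))

# Example: target p = N(2, 1), proposal q = N(0, 2); true E_p[x] = 2.
# Normalizing constants cancel in the self-normalized estimator.
log_p = lambda x: -0.5 * (x - 2.0) ** 2
log_q = lambda x: -0.5 * (x / 2.0) ** 2 - np.log(2.0)
sample_q = lambda rng, n: rng.normal(0.0, 2.0, size=n)

est = importance_sampling(log_p, log_q, sample_q, f=lambda x: x)
```

An Edward `build_update()` for this would compute one sample and its log-weight per iteration rather than all at once, but the weighting and normalization logic is the same.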
Hi, I'm working on an implementation of IS (a first step before SMC) and I was thinking of the following two options:

- Implement an `ImportanceSampling` class inheriting from `MonteCarlo`. The `build_update()` method would simply compute one sample from the prior and its importance weight (likelihood ratio). Each iteration would store its result internally, and the user's `Empirical` would only be populated in the `finalize()` method, by sampling from the auto-normalized weighted approximation.
- Implement a `WeightedEmpirical` distribution, make `Empirical` inherit from it (with default weights of `1/N`), and replace the `Empirical` requirement in `MonteCarlo` with `WeightedEmpirical`. Then at each iteration directly populate the user's `WeightedEmpirical`, and auto-normalize the distribution in the `finalize()` method.

Which of these two options do you think makes more sense? Or would you have any other suggestions / comments?
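The second option could be sketched roughly as follows. This is a hypothetical illustration in plain Python: the class and method names mirror the proposal above (`WeightedEmpirical`, `finalize()`), not Edward's actual classes, and the real version would hold TensorFlow variables rather than Python lists.

```python
import numpy as np

class WeightedEmpirical:
    """Accumulates (sample, log-weight) pairs; normalizes once at the end."""

    def __init__(self):
        self.samples = []
        self.log_weights = []

    def update(self, sample, log_weight):
        # Each inference iteration appends one sample and its log-weight.
        self.samples.append(sample)
        self.log_weights.append(log_weight)

    def finalize(self):
        # Auto-normalize: exponentiate stably, then rescale to sum to 1.
        lw = np.asarray(self.log_weights, dtype=float)
        w = np.exp(lw - lw.max())
        return np.asarray(self.samples), w / w.sum()

    def mean(self):
        x, w = self.finalize()
        return float(np.sum(w * x))

# Toy usage: two samples with unnormalized weights 1 and 3.
we = WeightedEmpirical()
we.update(0.0, 0.0)            # log(1)
we.update(1.0, np.log(3.0))    # log(3)
```

An ordinary `Empirical` would then be the special case where every stored log-weight is equal, so `finalize()` returns uniform weights of `1/N`.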
I'm not a contributor, so take this with a grain of salt, but a `WeightedEmpirical` distribution makes more sense to me than keeping the weights internal to `ImportanceSampling`. It's convenient to be able to access the underlying samples and weights for approximating integrals without incurring another round of sampling error.
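The point about avoiding a second round of sampling error can be shown with a small assumed example (plain NumPy, not Edward code): given normalized weights, an expectation can be read off the weighted pairs exactly, whereas resampling into an unweighted empirical approximation adds extra Monte Carlo noise.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.array([0.0, 1.0, 2.0, 3.0])
w = np.array([0.1, 0.2, 0.3, 0.4])   # normalized importance weights

# Direct weighted estimate of E[x]: deterministic given (x, w).
direct = np.sum(w * x)               # = 2.0 exactly

# Resampling into an unweighted Empirical first, then averaging:
# unbiased, but only approximately 2.0 due to resampling noise.
idx = rng.choice(len(x), size=1000, p=w)
resampled = x[idx].mean()
```

This is why exposing the weights (option 2) is attractive: resampling would still be available when an unweighted `Empirical` is needed, but it would be opt-in rather than forced by `finalize()`.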