
MALA implementation results in the wrong covariance

Red-Portal opened this issue on Feb 13, 2024

import AdvancedMH, LogDensityProblems, Distributions
using LinearAlgebra, Statistics  # for `I` and `cov`; `stack` requires Julia >= 1.9

# Target: zero-mean Gaussian with precision matrix A (i.e. covariance inv(A))
struct TheNormalLogDensity{M}
    A::M
end

# Declare first-order capability: the gradient of the log density is available
LogDensityProblems.capabilities(::Type{<:TheNormalLogDensity}) = LogDensityProblems.LogDensityOrder{1}()

LogDensityProblems.dimension(d::TheNormalLogDensity) = size(d.A, 1)
LogDensityProblems.logdensity(d::TheNormalLogDensity, x) = -x' * d.A * x / 2

function LogDensityProblems.logdensity_and_gradient(d::TheNormalLogDensity, x)
    return -x' * d.A * x / 2, -d.A * x
end

Σ = [1.5 0.35; 0.35 1.0]  # target covariance
σ² = 0.5                  # MALA step size
# MALA proposal: given the gradient g at the current state, propose an
# increment drawn from N((σ²/2) * g, σ² * I)
spl = AdvancedMH.MALA(g -> Distributions.MvNormal((σ² / 2) .* g, σ² * I))
#spl = AdvancedMH.RWMH(Distributions.MvNormal(zeros(2), I))

chain = AdvancedMH.sample(TheNormalLogDensity(inv(Σ)), spl, 500000; initial_params=ones(2))
data = stack([c.params for c in chain])  # 2×500000 matrix, one column per draw
cov(data, dims=2)
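
For reference, the proposal the MALA line above is meant to implement is the standard Langevin proposal

    x′ ∼ N(x + (σ²/2) ∇log π(x), σ² I),

and for the target π = N(0, Σ) the gradient is ∇log π(x) = −Σ⁻¹ x. Since the Metropolis adjustment leaves π invariant regardless of the step size, the long-run sample covariance should converge to Σ = [1.5 0.35; 0.35 1.0].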

Someone pointed out that MALA produces samples with the wrong covariance. It is not a mixing/variance issue, since I'm hitting an acceptance rate of 0.57. For the above example, the sample covariance should be close to Σ, but instead I get (curiously, close to Σ/3):

julia> cov(data, dims=2)
2×2 Matrix{Float64}:
 0.500508  0.115434
 0.115434  0.332796
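
For completeness, this is how the acceptance rate was estimated — a minimal sketch, assuming the chain's Transition objects carry an `accepted` flag, as in recent AdvancedMH releases:

using Statistics

# fraction of proposals that were accepted (≈ 0.57 here)
mean(t.accepted for t in chain)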

I didn't have the time to look into this in much detail, but RWMH seems to return the correct covariance, while MALA does not.
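For instance, here is the random-walk variant — a minimal sketch using the commented-out RWMH line from the reproducer above, with everything else unchanged:

import AdvancedMH, Distributions
using LinearAlgebra

# isotropic Gaussian random-walk proposal instead of MALA
spl = AdvancedMH.RWMH(Distributions.MvNormal(zeros(2), I))
chain = AdvancedMH.sample(TheNormalLogDensity(inv(Σ)), spl, 500000; initial_params=ones(2))
data = stack([c.params for c in chain])
cov(data, dims=2)

This yields a sample covariance close to the target Σ: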

julia> cov(data, dims=2)
2×2 Matrix{Float64}:
 1.47741   0.344062
 0.344062  0.995041
