
Pushforward problems

Open · cscherrer opened this issue · 4 comments

Say we have

using MeasureTheory, MeasureBase, BenchmarkTools, AffineMaps
μ = Normal(3.0, 2.0)
ν = pushfwd(MulAdd(2.0, 3.0), StdNormal())
x = rand(Normal(3.0, 2.0) ^ 100);

Then benchmarking gives

julia> @btime logdensityof($μ  ^ 100, $x)
  60.594 ns (0 allocations: 0 bytes)
-222.1504504107993

julia> @btime logdensityof($ν ^ 100, $x)
  12.734 μs (12 allocations: 192 bytes)
-222.15045041079918

I'm also confused by the basemeasure sequence here:

julia> basemeasure_sequence(ν) .|> println;
PushforwardMeasure(MulAdd{Float64, Float64}(2.0, 3.0), StdNormal())
PushforwardMeasure(MulAdd{Float64, Float64}(2.0, 3.0), 0.3989 * MeasureBase.LebesgueBase())
PushforwardMeasure(MulAdd{Float64, Float64}(2.0, 3.0), MeasureBase.LebesgueBase())

I'd think a couple of laws should always hold for these things:

logdensity_def(μ, x) == logdensity_rel(μ, basemeasure(μ), x)

logdensity_rel(pushfwd(f, μ), pushfwd(f, ν), x) == logdensity_rel(μ, ν, inverse(f)(x))

But then we'd expect logdensityof(ν, x) == logdensityof(StdNormal(), inverse(f)(x)) (with f = MulAdd(2.0, 3.0)), which is not the case.
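
For concreteness, here's a numerical spot check of Law 2 for the affine example above. Just a sketch: it assumes AffineMaps supports InverseFunctions.inverse for MulAdd, and it should only pass once Law 2 is actually enforced.

using MeasureBase, AffineMaps, InverseFunctions

f = MulAdd(2.0, 3.0)
y = 4.2
# Law 2: relative log-densities are invariant under a common pushforward
lhs = logdensity_rel(pushfwd(f, StdNormal()), pushfwd(f, StdLogistic()), y)
rhs = logdensity_rel(StdNormal(), StdLogistic(), inverse(f)(y))
lhs ≈ rhs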

@oschulz any idea what's going on here?

cscherrer · Aug 17 '23 21:08

I think I have at least a partial approach that may help. First, we enforce two laws:

# Law 1
logdensity_def(μ, x) == logdensity_rel(μ, basemeasure(μ), x)

# Law 2
logdensity_rel(pushfwd(f, μ), pushfwd(f, ν), x) == logdensity_rel(μ, ν, inverse(f)(x))

Then we add some new smart constructors:

  • The pushforward of a weighted measure is a weighted pushforward
  • A scalar affine pushforward of Lebesgue measure is a weighted Lebesgue measure
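
Roughly, as a sketch (the real constructors would need care with dispatch; WeightedMeasure's fields and MulAdd's A field are what I'm assuming here):

# Pushforward of a weighted measure becomes a weighted pushforward
pushfwd(f, μ::WeightedMeasure) = weightedmeasure(μ.logweight, pushfwd(f, μ.base))

# Scalar affine pushforward of Lebesgue becomes a weighted Lebesgue measure:
# the Jacobian of x -> A*x + b is A, so Lebesgue picks up a factor of 1/abs(A)
pushfwd(f::MulAdd{<:Real,<:Real}, ::LebesgueBase) = inv(abs(f.A)) * LebesgueBase()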

Then in the above basemeasure sequence, instead of

pushfwd(MulAdd(2.0, 3.0), 0.3989 * LebesgueBase())

we'd get

0.3989 * pushfwd(MulAdd(2.0, 3.0), LebesgueBase())

which, since pushing Lebesgue measure through MulAdd(2.0, 3.0) rescales it by a factor of 1/2, further reduces to

0.19945 * LebesgueBase()

So the full basemeasure sequence of the pushforward would be

pushfwd(MulAdd(2.0, 3.0), StdNormal())
0.19945 * LebesgueBase()
LebesgueBase()

I especially like that this makes it easier (I think) to work with the logjac: we need only compute it once, at the same step where we take the log-density relative to LebesgueBase().
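
If I've done the algebra right, the example above would then work out as (writing z = (x - 3)/2, i.e. z = inverse(MulAdd(2.0, 3.0))(x)):

logdensity_def(pushfwd(MulAdd(2.0, 3.0), StdNormal()), x) == -z^2 / 2      # relative to 0.19945 * LebesgueBase()
logdensity_def(0.19945 * LebesgueBase(), x)                == log(0.19945)
logdensity_def(LebesgueBase(), x)                          == 0.0

which sums to -z^2 / 2 + log(0.3989) - log(2), i.e. logdensityof(Normal(3.0, 2.0), x), with the -log(2) logjac appearing exactly once.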

cscherrer · Aug 21 '23 21:08

Let me think this over for a day :-)

oschulz · Aug 21 '23 23:08

For later reference: https://math.stackexchange.com/questions/4412857/radon-nikodym-derivative-and-push-forward-of-measures

cscherrer · Aug 22 '23 20:08

I think for the general case (opaque f without known mathematical properties) we need to define

@inline basemeasure(ν::PushforwardMeasure) = basemeasure(parent(ν))

And require that x and y have the same shape, i.e. that f is an automorphism, and check that in logdensity_def.

For specific kinds of f, like affine transformations, we can specialize.
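
For the scalar affine case the specialization could be as small as this (a sketch with a hypothetical helper name; it assumes basemeasure(μ) is a constant-weighted Lebesgue measure, so the only Jacobian term is the constant log(abs(a))):

# Log-density of pushfwd(MulAdd(a, b), μ) at y, relative to basemeasure(μ)
function affine_pushfwd_logdensity_def(a::Real, b::Real, μ, y::Real)
    x = (y - b) / a                 # x = inverse(MulAdd(a, b))(y)
    return logdensity_def(μ, x) - log(abs(a))
end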

oschulz · Sep 04 '23 20:09