DistributionsAD.jl
Gradients of logpdf with TuringDiagMvNormal return nothing
This puzzles me a bit
using DistributionsAD, Distributions, Flux
using DistributionsAD: TuringDiagMvNormal
Flux.@functor TuringDiagMvNormal
m = [1.0]
S = [0.1]
f = TuringDiagMvNormal(m,S)
x = 5 .+ randn(1,100)
ps = Flux.params(f)
# loss() = sum(logpdf(f, x)) # this returns nothings in the loop below
loss() = loglikelihood(f,x) # this works fine
gs = Flux.gradient(loss, ps)
for p in ps
    display(gs[p])
end
I think the logpdf is not hit during the gradient computation somehow; adding an error() in there still just returns nothings...
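One way to see what is going on (just a sketch, assuming the same f and x as above) is to call Zygote.pullback with the distribution passed explicitly and inspect the gradient returned for the struct itself, instead of going through Flux.params:
using Zygote
# differentiate with f as an explicit argument and look at the pullback output
y, back = Zygote.pullback(d -> sum(logpdf(d, x)), f)
back(1.0)  # a tuple holding a NamedTuple gradient for f (one entry per field),
           # which suggests the gradient exists but never reaches the arrays in ps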
This puzzles me a bit
Welcome to the club! It could be a Zygote bug.
Try making an MWE (minimal working example).
I think this is related to https://github.com/FluxML/Zygote.jl/issues/692 / https://github.com/FluxML/Zygote.jl/issues/522, because this works:
using Zygote
loss(m, S) = sum(logpdf(TuringDiagMvNormal(m, S), x))
gs = Zygote.gradient(loss, m, S)
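Until that is resolved, a possible workaround (just a sketch along the lines of the explicit version above, using the same m, S, and x as in the first snippet; not tested on every Zygote/Flux version) is to put the raw arrays into params and rebuild the distribution inside the loss, so the gradients reach m and S directly:
# track the parameter arrays themselves rather than the struct's fields
ps = Flux.params(m, S)
# rebuild the distribution inside the loss so m and S are used directly
# in the differentiated code, mirroring the explicit two-argument loss above
loss() = sum(logpdf(TuringDiagMvNormal(m, S), x))
gs = Flux.gradient(loss, ps)
gs[m], gs[S]  # should now be populated instead of nothing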