Distributions.jl
Generic numerical fallback for `entropy`
I noticed that the numerical fallback for entropy is currently commented out.
Since a numerical integration procedure already exists and works for expectation, I wonder whether this fallback should be activated.
For instance, we cannot currently do entropy(NegativeBinomial(3, 0.3)) at all, but
-Distributions.expectation(x -> logpdf(NegativeBinomial(3, 0.3), x), NegativeBinomial(3, 0.3))
works fine. I would therefore propose the generic fallback:
function entropy(p::Distribution)
    return -Distributions.expectation(x -> logpdf(p, x), p)
end
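As a rough sanity check of the numerical route (only an illustration, comparing against a distribution whose entropy is already implemented analytically):

```julia
using Distributions

d = Normal(2.0, 3.0)

# Numerical route: E[-log p(X)] via the generic expectation machinery,
# the same call the proposed fallback would make.
h_numeric = -Distributions.expectation(x -> logpdf(d, x), d)

# Analytic route already provided by Distributions.jl for Normal.
h_analytic = entropy(d)

# The two should agree up to the integration tolerance.
@show h_numeric h_analytic
```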
Any thoughts on this?
There's a general worry, expressed across several issues, that methods such as expectation silently return incorrect values, because numerical integration is inherently difficult and the current implementation is simple and not specialized to particular distributions.
Maybe a way around these concerns would be to add a keyword argument analytic=true to the fallback, so that an error (or warning?) is thrown whenever the fallback is hit unless the caller explicitly passes analytic=false.
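A minimal sketch of what that guarded fallback could look like, written as it might appear inside the package (the `analytic` keyword and the error message are hypothetical, not existing API):

```julia
# Sketch only: `analytic` is a hypothetical keyword argument.
function entropy(p::Distribution; analytic::Bool = true)
    if analytic
        # No distribution-specific method exists; refuse to silently fall
        # back to numerical integration unless the caller opts in.
        error("no analytic `entropy` method for $(typeof(p)); ",
              "pass `analytic = false` to use the numerical fallback")
    end
    return -Distributions.expectation(x -> logpdf(p, x), p)
end
```

With this, entropy(NegativeBinomial(3, 0.3)) would raise a clear error, while entropy(NegativeBinomial(3, 0.3); analytic = false) would run the numerical fallback.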
Wouldn't the same concern apply to kldivergence? Yet that function does not have such provisions.
We missed this, and it actually already caused problems: https://github.com/JuliaStats/Distributions.jl/issues/1443
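For comparison, a numerical kldivergence fallback in the same guarded style could look like the sketch below (again an illustration of the pattern under discussion, not the library's actual implementation; it uses KL(p || q) = E_p[log p(X) - log q(X)]):

```julia
# Sketch only: a guarded numerical KL divergence fallback, assuming the
# same hypothetical `analytic` keyword convention as above.
function kldivergence(p::Distribution, q::Distribution; analytic::Bool = true)
    if analytic
        error("no analytic `kldivergence` method for $(typeof(p)) and $(typeof(q)); ",
              "pass `analytic = false` to use the numerical fallback")
    end
    # KL(p || q) = E_p[logpdf(p, X) - logpdf(q, X)]
    return Distributions.expectation(x -> logpdf(p, x) - logpdf(q, x), p)
end
```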