Define fallback implementations for mean, var, and entropy
We have the internal `expectation` function, which uses `quadgk` to compute integrals, as well as a fallback implementation of `kldivergence` built on `expectation`, so it seems reasonable to define similar fallbacks for other quantities trivially computable via `expectation`: `mean`, `var`, and `entropy`. We could probably do `skewness` and `kurtosis` as well.
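For concreteness, here's roughly the shape I have in mind (a minimal sketch only: the internal `expectation` has its own signature, so `_expectation` and the underscored fallbacks below are stand-ins, not proposed method names):

```julia
using Distributions, QuadGK

# Stand-in for the internal `expectation` (the real signature may differ):
# integrate f against the density of d over its support; quadgk returns
# (integral, error_estimate), and we keep only the value here.
_expectation(f, d::ContinuousUnivariateDistribution) =
    first(quadgk(x -> f(x) * pdf(d, x), minimum(d), maximum(d)))

# Fallbacks for the quantities mentioned above.
_mean(d) = _expectation(identity, d)
_var(d) = (m = _mean(d); _expectation(x -> abs2(x - m), d))
_entropy(d) = _expectation(x -> -logpdf(d, x), d)

# Skewness (and analogously kurtosis) would follow the same pattern:
_skewness(d) = (m = _mean(d); s = sqrt(_var(d));
                _expectation(x -> ((x - m) / s)^3, d))
```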
To do:
- [ ] Determine the set of functions that should support expectation-based fallbacks
- [ ] Add tests (a possible shape is sketched after this list)
- [ ] Document that the fallbacks exist
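To make the testing item concrete, tests could compare the fallbacks against distributions with closed-form answers; a sketch using the hypothetical helpers from above:

```julia
using Test, Distributions

@testset "expectation-based fallbacks" begin
    d = Gamma(2.0, 3.0)  # closed-form mean, var, and entropy to compare against
    @test _mean(d) ≈ mean(d) rtol=1e-6
    @test _var(d) ≈ var(d) rtol=1e-6
    @test _entropy(d) ≈ entropy(d) rtol=1e-6
end
```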
(N.b. I purposely skipped CI on the initial commit, as I haven't given this a ton of thought yet and wanted to see whether anybody was strongly for or against it before doing meaningful work here.)
Copied from https://github.com/JuliaStats/Distributions.jl/pull/1874#issuecomment-2191052752:
My worry (also expressed in issues such as https://github.com/JuliaStats/Distributions.jl/issues/968) is that numerical integration is generally challenging and a fallback might lead to silently incorrect results. Such a fallback would be wrong (or at least problematic) if the moments are not finite, e.g. for Cauchy.
So my general feeling is that numerical integration should maybe be restricted to a smaller subset of distributions, or maybe even only be available as a separate function. If we want to use it more broadly, I think it would also be safer to error if the integration error estimate is too large, to reduce the probability of silently incorrect results.
Generally agree with @devmotion. I've hit a similar issue for `kldivergence` here. In general, it's not a good idea to have a silent approximation method, and it's better to let the user decide.
Yep, fair enough.
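For reference, the "error if the integration error estimate is too large" safeguard could look roughly like this (a sketch; the name, tolerance, and message are illustrative, not proposed API):

```julia
using Distributions, QuadGK

# Guarded variant: surface quadgk's error estimate instead of silently
# returning a dubious value. `rtol` here is an illustrative default.
function checked_expectation(f, d::ContinuousUnivariateDistribution; rtol=1e-6)
    val, err = quadgk(x -> f(x) * pdf(d, x), minimum(d), maximum(d))
    if err > rtol * max(abs(val), one(err))
        error("expectation fallback did not converge (error estimate $err); ",
              "define the quantity analytically for $(typeof(d)) instead")
    end
    return val
end
```

Even with the check, this is only a mitigation: for something like Cauchy the mean integral diverges outright, and quadrature can't be trusted to flag that reliably, so restricting the fallbacks to a smaller subset of distributions (as suggested above) still seems like the safer route.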