Hong Ge
There are some excellent packages for estimating Bayesian evidence for Turing models. Supporting them would allow us to perform model comparisons across various priors and model choices. We should consider supporting...
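For context, the comparison step itself only needs log-evidence estimates; a minimal sketch with made-up numbers:

```julia
# Made-up numbers: given log-evidence estimates for two models, the Bayes
# factor is the ratio of evidences, which is what enables the model
# comparison described above.
logZ_m1 = -12.3                         # hypothetical log evidence, model 1
logZ_m2 = -14.7                         # hypothetical log evidence, model 2
bayes_factor = exp(logZ_m1 - logZ_m2)   # evidence ratio in favour of model 1
```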
`SimpleVarInfo` in `DynamicPPL` provides a significant speedup over the current implementation. We should consider switching to it once https://github.com/TuringLang/DynamicPPL.jl/pull/360 is merged.
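A minimal sketch of what the switch might look like, assuming a recent `DynamicPPL` version where `SimpleVarInfo` and `evaluate!!` are available (exact calls may differ between versions):

```julia
using DynamicPPL, Distributions

@model function demo(x)
    m ~ Normal(0, 1)
    x ~ Normal(m, 1)
end

model = demo(1.0)

# Evaluate the log joint with a NamedTuple-backed SimpleVarInfo instead of
# the default VarInfo.
svi = SimpleVarInfo((m = 0.5,))
_, svi = DynamicPPL.evaluate!!(model, svi, DefaultContext())
getlogp(svi)   # log joint density at m = 0.5, x = 1.0
```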
- The Python ensemble sampling toolkit for affine-invariant MCMC: https://github.com/dfm/emcee
- A Julia implementation of the RAM algorithm (Vihola, 2012): https://github.com/anthofflab/RobustAdaptiveMetropolisSampler.jl
- Keep it simple, stupid, MCMC: https://github.com/mauro3/KissMCMC.jl
- Affine...
The current plan for this package is to provide a robust random-walk MH sampler and its adaptive variants, but this could be relaxed to include slice samplers (e.g. Radford Neal's...
See https://github.com/yebai/Turing.jl/issues/373#issuecomment-345555122
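For reference, a sketch of the kind of random-walk MH setup described above, using `AdvancedMH` as an example (following its README; names and defaults may differ across versions):

```julia
using AdvancedMH, Distributions, LinearAlgebra, MCMCChains

# Toy target: standard normal prior on θ and a single observation at 2.0.
density(θ) = logpdf(Normal(0, 1), θ[1]) + logpdf(Normal(θ[1], 1), 2.0)

model   = DensityModel(density)
sampler = RWMH(MvNormal(zeros(1), I))   # random-walk proposal
chain   = sample(model, sampler, 10_000; chain_type = Chains)
```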
This PR limits warning messages to 10. Is that a good default? Solution 1: we can probably disable these numerical messages and instead print a summary message about the total number of divergent...
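One possible mechanism (a sketch, not necessarily what the PR does): Julia's logging macros accept `maxlog`, which caps how often a given message is printed:

```julia
# `maxlog` suppresses a log message after it has been printed the given
# number of times; the divergence check below is only a stand-in.
for iter in 1:1_000
    if rand() < 0.05
        @warn "Numerical error: divergent transition detected." maxlog = 10
    end
end
```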
It seems the `essential/ad.jl` file
- doesn't depend on any Turing-specific code
- fits better with DynamicPPL, since it provides autodiff for `LogDensityFunction`

so I suggest that we transfer `ad.jl`...
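A sketch of the kind of glue `ad.jl` provides, assuming the `LogDensityProblemsAD` interface and a `DynamicPPL` version that exports `LogDensityFunction` (details may differ):

```julia
using DynamicPPL, Distributions, LogDensityProblems, LogDensityProblemsAD, ForwardDiff

@model function gauss(x)
    m ~ Normal(0, 1)
    x ~ Normal(m, 1)
end

# Wrap the model as a log-density problem and attach a ForwardDiff gradient.
ℓ  = DynamicPPL.LogDensityFunction(gauss(1.5))
∇ℓ = ADgradient(:ForwardDiff, ℓ)
LogDensityProblems.logdensity_and_gradient(∇ℓ, [0.0])
```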
The following line https://github.com/TuringLang/Turing.jl/blob/6649f10c48917a27531214f02777408d2ab82928/src/optimisation/Optimisation.jl#L27 provides a very readable way to specify which space model parameters live in. We use somewhat ad-hoc ways of specifying this information in `DynamicPPL` and `Turing`...
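A purely hypothetical sketch (not the actual code at the linked line) of what such an explicit, type-level specification could look like:

```julia
# Hypothetical: type-level flags for whether parameter values live in
# constrained (model) space or unconstrained (transformed) space, so that
# code can dispatch on the space explicitly instead of passing booleans.
struct ConstrainedSpace end
struct UnconstrainedSpace end

describe(::ConstrainedSpace)   = "values live in the model's original support"
describe(::UnconstrainedSpace) = "values live in unconstrained space after a bijective transform"

describe(ConstrainedSpace())
```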
See [here](https://github.com/JuliaDocs/Documenter.jl/pull/2239) for an example.