Xianda Sun
@zeyus this looks good, thanks a lot for the contribution. Just for ease of future maintenance, maybe we can move the function `ind_from_string` out of `ParetoSmooth.pointwise_log_likelihoods` and write some...
@devmotion if I understand your point correctly, are you suggesting we should define some kind of ordering on `VarName`s? Without an `isless`, `OrderedDict` should still work (it just uses the...
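A minimal sketch of the point above: `OrderedDict` keeps insertion order and only needs `hash`/`isequal` on its keys, so a key type with no `isless` method (the `Key` struct here is a hypothetical stand-in for `VarName`) still works fine.

```julia
using OrderedCollections

# Hypothetical stand-in for a VarName-like key type; note it defines
# no `isless`, so the keys cannot be sorted.
struct Key
    name::Symbol
end

d = OrderedDict{Key,Int}()
d[Key(:b)] = 1
d[Key(:a)] = 2

# Iteration follows insertion order, not sorted order.
@assert collect(keys(d)) == [Key(:b), Key(:a)]

# Sorting, by contrast, would require an ordering and fails here.
@assert !hasmethod(isless, Tuple{Key,Key})
```

So an explicit `isless` on `VarName` would only be needed if we wanted sorted (rather than insertion-ordered) iteration.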
Closing this; I will start similar work after https://github.com/TuringLang/DynamicPPL.jl/pull/716 and https://github.com/TuringLang/DynamicPPL.jl/pull/710
> I don't think applying STL would help in the case of the score gradient. May I ask what "STL" is? > Also, unfortunately, we probably can't handle things automatically...
DynamicPPL aims to enable users to write generic Julia code for model definition. Graph-oriented DSLs like GraphPPL (and JuliaBUGS), by contrast, are bound to have some restrictions on syntax, although...
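To illustrate the contrast, here is a small sketch of a DynamicPPL model that uses ordinary Julia control flow (a loop over an arbitrary length `N`), the kind of construct a graph-oriented DSL would typically have to restrict or special-case:

```julia
using DynamicPPL, Distributions

# A random-walk model written with plain Julia control flow;
# the loop bound N is an ordinary function argument.
@model function random_walk(N)
    x = Vector{Float64}(undef, N)
    x[1] ~ Normal(0, 1)
    for i in 2:N
        x[i] ~ Normal(x[i - 1], 1)
    end
    return x
end

model = random_walk(5)
@assert model isa DynamicPPL.Model
```

This is only a sketch; the point is that the model body is generic Julia code rather than a declarative graph specification.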
Very good work @arnauqb @Red-Portal, thanks for pushing this through
Sorry for letting this slip under my radar. From the look of it, I don't think `setmodel` is directly useful, but we can modify `setvarinfo` to mirror the implementation of...
@willtebbutt @mhauru I think the `setvarinfo` related errors are gone (at least from the look of it)
I am not certain. @willtebbutt, will `ADgradient(AutoTapir(), ::DynamicPPL.LogDensityFunction)` trigger re-derivation of rules?
I am not 100% certain about this, but for the old Gibbs, maybe https://github.com/TuringLang/Turing.jl/blob/803d2f5672b7483c768b9987bcec0dc20257ebda/src/mcmc/gibbs.jl#L241C27-L241C87 will lead to a call to `ADgradient`? @torfjelde