Xianda Sun

87 comments

It is a problem if [`setmodel`](https://github.com/TuringLang/DynamicPPL.jl/blob/fe32d948c0338749e85c1ca682feaedb52f72fbb/src/logdensityfunction.jl#L98-L117) is called at every Gibbs step. This is conservative for correctness's sake, but it would be better to be able to reuse the derived AD rule...
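The kind of reuse being suggested could look like the following sketch. Names like `CachedGradient` and `build_gradient` are hypothetical stand-ins, not Turing/DynamicPPL API:

```julia
# Hypothetical sketch: re-derive the AD gradient wrapper only when the
# model actually changes, instead of unconditionally on every Gibbs step.
mutable struct CachedGradient
    model::Any
    grad::Any
end

# Stand-in for the expensive `ADgradient`-style setup done in `setmodel`.
build_gradient(model) = x -> 2 .* x

function get_gradient!(cache::CachedGradient, model)
    if cache.model !== model   # cheap staleness check before rebuilding
        cache.model = model
        cache.grad = build_gradient(model)
    end
    return cache.grad
end
```

Whether `!==` is the right staleness check depends on how models are rebuilt between Gibbs steps.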

Yeah, Will and I just talked about this; I am going to address it. It might be annoying, though, because we can't dispatch on individual `ADGradientWrapper` subtypes (ref https://github.com/tpapp/LogDensityProblemsAD.jl/pull/33). But maybe...

> Can we not use the same approach as we used for Turing.Experimental.Gibbs? Pass the ADType to setmodel (and start using setmodel in Turing.Inference.Gibbs too)?

Good thought, I think we...

Looks like `Aqua` is not happy with the `==` for `ComposedOptic`, since that essentially defines `==` for Base's `ComposedFunction`. Not sure what to do here.
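For the record, a minimal sketch of why Aqua flags this, assuming `ComposedOptic` is (or aliases) `Base.ComposedFunction`:

```julia
# Assumption: `ComposedOptic` aliases `Base.ComposedFunction`, so this
# method extends `Base.:(==)` purely on Base-owned types -- exactly the
# pattern Aqua's piracy check reports.
const ComposedOptic = Base.ComposedFunction

Base.:(==)(a::ComposedOptic, b::ComposedOptic) =
    a.outer == b.outer && a.inner == b.inner

(sin ∘ cos) == (sin ∘ cos)  # true, via the pirated method
```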

Also, I just realized that adding equality for `DynamicIndexLens` is tricky, because its `f` field is an anonymous function like `collection -> firstindex(collection)`, which makes equality tests generally impossible. `Recursive` is similar...
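A quick demonstration of why this is hopeless: each anonymous function literal lowers to its own closure type, so two textually identical lambdas never compare equal:

```julia
f1 = collection -> firstindex(collection)
f2 = collection -> firstindex(collection)

# Each lambda gets a distinct closure type, so the default `==`
# (which falls back to `===`) is false despite identical source text.
f1 == f2                  # false
typeof(f1) == typeof(f2)  # false
```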

`AccessorsImpl` is defined in `BangBang.jl`. The DynamicPPL `@model` macro uses it to signal mutation by reference. Ref: https://github.com/TuringLang/DynamicPPL.jl/blob/c9410de91cfdeffeb939022fabdf042c72c71690/src/compiler.jl#L464 and https://github.com/JuliaFolds2/BangBang.jl/blob/7f61170ec6e4b883f5ece892225d61b9e7b04f8e/src/accessors.jl#L1.

I don't think it would help, but we can definitely try it later.

To echo David's point, I think a default `Transition` type defined in AbstractMCMC would be good. But for now, for the sake of resolving ambiguities, I am okay with this...

Unfortunately, we don't have a score-function-gradient-based BBVI implemented for Turing yet (@Red-Portal @torfjelde correct me if I am wrong). But I think [AdvancedVI.jl](https://github.com/TuringLang/AdvancedVI.jl) has a good...

Thanks @arnauqb, great effort! Let me take a look.