Rémi Louf
In this PR I add an example that uses [Aesara](https://github.com/aesara-devs/aesara) and [Aeppl](https://github.com/aesara-devs/aeppl) to build the model's logprob and compile it to JAX. Closes #173

```python
import aesara
import aesara.tensor as...
```
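The snippet above is truncated; as a rough illustration of the pattern, here is a minimal, hedged sketch (not the PR's actual example) that writes a Gaussian log-density as an Aesara graph by hand and compiles it to JAX with `mode="JAX"`. In the PR itself the log-probability graph is built with Aeppl rather than by hand.

```python
# Minimal sketch, not the PR's example: a log-density written directly as an
# Aesara graph, then compiled to JAX via mode="JAX".
import aesara
import aesara.tensor as at
import numpy as np

x = at.vector("x")
logprob = -0.5 * at.sum(x**2) - 0.5 * x.shape[0] * np.log(2 * np.pi)

# mode="JAX" tells Aesara to compile the graph with its JAX backend.
logprob_fn = aesara.function([x], logprob, mode="JAX")
logprob_value = logprob_fn(np.zeros(3))  # ≈ -2.76 for a 3-dimensional standard normal
```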
The library has a history of not working entirely with PyTrees; see #216 for instance. We should make sure that it does, by adding tests, before the first stable release.
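A sketch of the kind of test we could add (the sampler call is left out; only the structural check is shown): the position is a PyTree rather than a flat array, and everything downstream should preserve its structure.

```python
# Sketch of a test for PyTree support: the position is a dict of arrays, and
# the gradient (and any new position a kernel returns) must keep its structure.
import jax
import jax.numpy as jnp

def logdensity_fn(position):
    return -0.5 * position["loc"] ** 2 - 0.5 * jnp.sum(position["scale"] ** 2)

def test_pytree_position():
    position = {"loc": jnp.array(0.0), "scale": jnp.zeros(3)}
    grad = jax.grad(logdensity_fn)(position)
    # The kernel's internals must not flatten the position into an array.
    assert jax.tree_util.tree_structure(grad) == jax.tree_util.tree_structure(position)
```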
In particular, the implementation should be general enough that we can shard a large dataset across several machines, compute the partial gradient on each machine, and combine the partial gradients before...
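As an illustration of that pattern, here is a hedged sketch in plain JAX (not Blackjax code): each device holds a shard of the data, computes the gradient of the log-likelihood on its shard, and the partial gradients are summed across devices with `psum` before being used.

```python
# Hedged sketch of the sharded-gradient pattern described above.
from functools import partial

import jax
import jax.numpy as jnp

def loglikelihood_fn(theta, data_shard):
    # Per-shard log-likelihood; theta is replicated on every device.
    return -0.5 * jnp.sum((data_shard - theta) ** 2)

@partial(jax.pmap, axis_name="shards", in_axes=(None, 0))
def sharded_grad(theta, data_shard):
    partial_grad = jax.grad(loglikelihood_fn)(theta, data_shard)
    # Combine the per-device partial gradients into the full gradient.
    return jax.lax.psum(partial_grad, axis_name="shards")

n_devices = jax.local_device_count()
data = jnp.arange(8.0 * n_devices).reshape((n_devices, 8))
full_grad = sharded_grad(jnp.array(1.0), data)  # same summed value on every device
```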
The full test suite currently takes more than 10 minutes to run, mainly because of the tests for the SMC algorithms. We should aim to bring this down to under two minutes.
The RMH acceptance step is used by many MCMC algorithms across the library, so we should be able to import it from a separate module. It should probably go in...
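The logic in question is small; here is a hedged sketch of what a standalone version could look like (the function name and its eventual module are hypothetical, only the Metropolis-Hastings accept/reject rule is standard).

```python
# Hedged sketch of a standalone RMH acceptance step.
import jax
import jax.numpy as jnp

def rmh_acceptance(rng_key, logdensity_new, logdensity_old):
    """Accept with probability min(1, exp(logdensity_new - logdensity_old))."""
    log_p_accept = jnp.minimum(0.0, logdensity_new - logdensity_old)
    log_u = jnp.log(jax.random.uniform(rng_key))
    do_accept = log_u < log_p_accept
    return do_accept, jnp.exp(log_p_accept)
```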
It currently works with the old `kernel_factories` design; see the discussion in #246.
Blackjax has implementations of two stochastic gradient algorithms, but to use them one needs to specify hyperparameters manually, namely the batch size, step size, and schedule for SGLD, plus the number...
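To make the hyperparameters concrete, here is a hand-written SGLD update in plain JAX (a sketch, not Blackjax's interface): the batch size, the step size, and its schedule below are the values a user currently has to pick by hand.

```python
# Hand-written SGLD update, as an illustration of the hyperparameters involved.
# `grad_estimator` stands for a minibatch estimate of the log-posterior gradient.
import jax
import jax.numpy as jnp

def sgld_step(rng_key, position, minibatch, grad_estimator, step_size):
    grad = grad_estimator(position, minibatch)
    noise = jax.random.normal(rng_key, jnp.shape(position))
    # Half a gradient step plus Gaussian noise scaled by the step size.
    return position + 0.5 * step_size * grad + jnp.sqrt(step_size) * noise

# Values the user currently has to choose manually.
batch_size = 128
step_size_schedule = lambda step: 1e-3 / (1.0 + step) ** 0.55
```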
Hey :wave: You are here because you considered contributing to `blackjax` for at least a split second. Thank you! But sometimes we are just not quite sure what to work...
The window adaptation is currently only available via a function that implements and jit-compiles the loop. While this is convenient and should be kept, we should also expose the window...
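To illustrate what exposing the adaptation could mean in practice, here is a toy `init`/`update` pair that a user could drive with their own loop instead of calling the pre-built one. The names and the simplistic Robbins-Monro step-size rule are hypothetical; this is not Blackjax code.

```python
# Toy illustration of an exposed adaptation step, as opposed to a single
# function that runs and jit-compiles the whole loop for the user.
import jax.numpy as jnp

def adaptation_init(step_size):
    return {"log_step_size": jnp.log(step_size)}

def adaptation_update(state, acceptance_rate, target=0.8, learning_rate=0.05):
    # Simplistic step-size update toward a target acceptance rate; the real
    # window adaptation does much more than this.
    new_log_step_size = state["log_step_size"] + learning_rate * (acceptance_rate - target)
    return {"log_step_size": new_log_step_size}

# A user-controlled loop; the acceptance rates would come from the sampler.
state = adaptation_init(1.0)
for acceptance_rate in (0.2, 0.5, 0.9):
    state = adaptation_update(state, acceptance_rate)
step_size = float(jnp.exp(state["log_step_size"]))
```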
https://github.com/WayneDW/Contour-Stochastic-Gradient-Langevin-Dynamics