dpsimpson
This is so cool that this is possible now! But this is the wrong Laplace branch. The good one is here https://github.com/stan-dev/math/tree/try-laplace_approximation2/stan/math/laplace
Let me know if you need help! On Mon, Dec 6, 2021 at 11:47 Charles Margossian wrote: > Todo list for myself: > [ ] write a primer design doc...
It's hard to work out how to test this as it currently is - the function does 3 things: computes the Laplace approximation (tested elsewhere), computes the mean and covariance matrix...
The other option is to do a probabilistic test that runs the rng 1k+++ times and computes the mean and covariances, but that feels like it would be a very...
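A minimal sketch of what that probabilistic test could look like (the `draw_one(rng)` callable is a hypothetical stand-in for the rng overload under test, and the draw count and tolerance are arbitrary, not anything in the PR):

```cpp
#include <cassert>
#include <Eigen/Dense>
#include <boost/random/mersenne_twister.hpp>

// Draw n_draws samples from `draw_one`, then check the empirical mean and
// covariance against the analytic ones with a loose tolerance.
template <typename Sampler>
void check_moments(Sampler draw_one, const Eigen::VectorXd& mu,
                   const Eigen::MatrixXd& Sigma, int n_draws = 10000,
                   double tol = 0.05) {
  boost::random::mt19937 rng(1234);
  Eigen::MatrixXd draws(n_draws, mu.size());
  for (int i = 0; i < n_draws; ++i) {
    draws.row(i) = draw_one(rng).transpose();
  }
  Eigen::VectorXd mean_hat = draws.colwise().mean().transpose();
  Eigen::MatrixXd centered = draws.rowwise() - mean_hat.transpose();
  Eigen::MatrixXd cov_hat = centered.transpose() * centered / (n_draws - 1.0);

  // Monte Carlo error only shrinks like 1/sqrt(n_draws), so the tolerance
  // has to stay loose, which is exactly why this test feels heavy.
  assert((mean_hat - mu).lpNorm<Eigen::Infinity>() < tol);
  assert((cov_hat - Sigma).lpNorm<Eigen::Infinity>() < tol);
}
```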
Actually, there is one other possibility, which depends on how the `rng` argument works. If you can control its state, you could compare the output from `laplace_base_rng` to the `multi_normal_rng`...
I think that’s too heavy for such a simple function. Equality of a matrix and a vector OR equality of a draw made with the same random seed is enough...
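To make the same-seed comparison concrete, here is a rough sketch. The exact `laplace_base_rng` signature is a guess on my part, so that call is left commented out, and the mean and covariance are toy values:

```cpp
#include <Eigen/Dense>
#include <boost/random/mersenne_twister.hpp>
#include <stan/math.hpp>

int main() {
  // Toy mean and covariance standing in for the Laplace approximation output.
  Eigen::VectorXd mu(2);
  mu << 0.5, -1.0;
  Eigen::MatrixXd Sigma(2, 2);
  Sigma << 1.0, 0.3,
           0.3, 2.0;

  // Two generators constructed with the same seed start in the same state.
  boost::random::mt19937 rng1(2020);
  boost::random::mt19937 rng2(2020);

  // Reference draw taken directly from multi_normal_rng.
  Eigen::VectorXd expected = stan::math::multi_normal_rng(mu, Sigma, rng1);

  // If laplace_base_rng ends up calling multi_normal_rng with the same mean,
  // covariance, and rng state, its draw should match the reference:
  //   Eigen::VectorXd actual = laplace_base_rng(/* ..., */ rng2);
  //   EXPECT_TRUE(expected.isApprox(actual));
  return 0;
}
```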
I got confused for a moment by `nnz_rows`, which I interpreted as "number of non-zero rows", not the rows that have non-zeros in them (it's the first n that's the...
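For the record, the two readings side by side (variable names here are illustrative only, not from the PR):

```cpp
#include <vector>

// The two readings of `nnz_rows`: an index set versus a count.
std::vector<int> rows_with_nonzeros = {0, 3, 7};  // the rows that contain non-zeros
int n_nonzero_rows =
    static_cast<int>(rows_with_nonzeros.size());  // "number of non-zero rows"
```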
It makes sense to call `1/aux` "dispersion". Maybe call the inverse square root "dispersion deviation"? Or is that too weird?
Yes! I’ve been clobbered by other work. But I’m happy to chat with you if you want a sounding board / rubber duck / or general INLA stuff.
If it’s not a type, it might as well not be done. It would be extremely difficult to use. Akin to not having a dense matrix type and making users...