Ryan
It might be interesting to compare this to Dustin's idea of expressing the FFT of the pre-convolution MOG in closed form. My intuition is that this will work better, but...
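For reference (my notation, not from the thread): the Fourier transform of a mixture of Gaussians is available in closed form, since each Gaussian component transforms to a Gaussian-shaped factor. With the convention $\hat f(\omega) = \int f(x)\, e^{i \omega^\top x}\, dx$,

$$
f(x) = \sum_k w_k\, \mathcal{N}(x \mid \mu_k, \Sigma_k)
\quad\Longrightarrow\quad
\hat f(\omega) = \sum_k w_k \exp\!\Big(i\, \omega^\top \mu_k - \tfrac{1}{2}\, \omega^\top \Sigma_k\, \omega\Big),
$$

which is presumably what expressing the pre-convolution MOG's FFT in closed form would exploit: a convolution of two MOGs becomes a pointwise product of two such sums.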
One somewhat invasive idea would be to make each element of a ```SensitiveFloat```'s Hessian matrix itself be a matrix of complex numbers (rather than a real, as it is now)....
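For concreteness, a minimal sketch of that layout, written in C++ purely for illustration and not reflecting the actual ```SensitiveFloat``` implementation, might look like this:

```cpp
#include <complex>
#include <vector>

// Each "entry" of the Hessian is itself a small matrix of complex numbers,
// rather than a single real value. Hypothetical names throughout.
using ComplexMat = std::vector<std::vector<std::complex<double>>>;

struct BlockComplexHessian {
  int n_params;    // number of parameters
  int block_dim;   // dimension of each complex block
  // h[i][j] is a block_dim x block_dim complex matrix, replacing what would
  // otherwise be a single real Hessian entry.
  std::vector<std::vector<ComplexMat>> h;

  BlockComplexHessian(int n_params_, int block_dim_)
      : n_params(n_params_),
        block_dim(block_dim_),
        h(n_params_,
          std::vector<ComplexMat>(
              n_params_,
              ComplexMat(block_dim_,
                         std::vector<std::complex<double>>(block_dim_)))) {}
};
```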
I am not an epidemiologist (it would be great if a professional epidemiologist would weigh in here), but it seems that the key classes of outcomes are asymptomatic, symptomatic...
I probably didn't express myself clearly enough. As I understand it, the gradient is exposed in ```stan::model::log_prob_grad```. I propose adding corresponding functions ```stan::model::log_prob_hessian_times_vector``` and ```stan::model::log_prob_hessian```. @avehtari requested that...
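To make the proposal concrete, here is a sketch of what the declarations might look like, patterned on the existing ```log_prob_grad``` signature (the argument types are my assumptions, not a final interface):

```cpp
#include <ostream>
#include <vector>

namespace stan {
namespace model {

// Proposed Hessian-vector product: returns log_prob at params_r, fills
// gradient, and fills hess_times_v with (d^2 log_prob / d params_r^2) * v.
template <bool propto, bool jacobian_adjust_transform, class Model>
double log_prob_hessian_times_vector(const Model& model,
                                     std::vector<double>& params_r,
                                     std::vector<int>& params_i,
                                     const std::vector<double>& v,
                                     std::vector<double>& gradient,
                                     std::vector<double>& hess_times_v,
                                     std::ostream* msgs = nullptr);

// Proposed dense Hessian: returns log_prob, fills gradient, and fills
// hessian (row-major, size N * N for N unconstrained parameters).
template <bool propto, bool jacobian_adjust_transform, class Model>
double log_prob_hessian(const Model& model,
                        std::vector<double>& params_r,
                        std::vector<int>& params_i,
                        std::vector<double>& gradient,
                        std::vector<double>& hessian,
                        std::ostream* msgs = nullptr);

}  // namespace model
}  // namespace stan
```

The Hessian-vector product avoids materializing the full N×N Hessian, which is why it seems worth exposing separately from the dense version.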
In light of https://github.com/stan-dev/rstan/issues/588, I wonder whether I should base this change on the ```develop``` branch or the ```master``` branch of ```stan```. If it's based on ```develop```, do...
Just to be clear, I'd need a branch of ```rstan``` that works with ```stan-dev/stan/develop```, not with ```stan-dev/math/develop```. How would I do that?
Hopefully any ambiguity in this issue will be cleared up by the PR: https://github.com/stan-dev/stan/pull/2701
@bob-carpenter, I'd be interested in helping out testing forward-mode autodiff if it's something an outsider can handle. Where would one go to get started?
> We can't add new higher-order methods to our base class without dramatically increasing compile times, at least in any way that I know how to do.

@bob-carpenter, now that...
I too could really use the `autograd_` functionals for [zaminfluence](https://github.com/rgiordan/zaminfluence). I thought I'd check in on the status of this issue and offer to help if I can!