Brian Ward
@SteveBronder mind looking at https://github.com/stan-dev/stanc3/pull/1521? It needs to be merged at the same time as this
@SteveBronder I think you misread Aki's post: the data you've hardcoded in here is what _works_. The other data, like datat
Changing the data to
```cpp
const std::vector y{183};
const std::vector mu{0.5};
const double sigmaz = 2.5;
```
does indeed lead to the test failing with an exception thrown:
```
C++...
```
By adding some prints to the likelihood, it seems the issue here is that `theta` is shooting off to massive values in the optimization:
```
[0]
[0]
[100.265]
[100.265]
[100.265]...
```
> the step should be halved.

Reading the [current code](https://github.com/stan-dev/math/blob/develop/stan/math/mix/functor/laplace_marginal_density.hpp#L275), it looks like the stepsize is always 0.5 -- is that what you were referring to @avehtari? There used to...
When you say "halve the step size and keep trying", from where do you keep going? The original point, or the point immediately before failure?
I would expect it to perform better if it retried from the original point, because then you're not having to compensate for the initial steps that were too large but not yet failing. Is that...
> Just to be clear, you mean completely restarting the algorithm with a smaller stepsize?

More or less, yeah. What it sounds like is your attempt leads to trajectories like...
@avehtari are you referring to the `mean` arguments to the helpers? It should. The instructions Steve posted should also download the nightly build of stanc, which will know about them.
Yes, we currently allow lpdfs in `reduce_sum`, and I don't think cdfs would require any different code generation than what we already have.