stan
Stan development repository. The master branch contains the current release. The develop branch contains the latest stable development. See the Developer Process Wiki for details.
#### Summary: From @bob-carpenter:

> Our loggers should just take these, as in:
>
> ```
> logger.info();
> logger.info("Gradient evaluation took ", deltaT, " seconds");
> ```

#### Description: Our loggers were introduced...
Revert revert of #3048
#### Summary: Stan writes to the output csv file of each chain something like

```
# Elapsed Time:
# 0.008 seconds (Warm-up)
# 0.004 seconds (Sampling)
# 0.012 seconds (Total)
```
...
Idea from Aki Vehtari on stan-dev: In Stan 2.8, log density functions like `normal_log()` output the sum of the densities of their components (with appropriate broadcasting of scalars). It would...
#### Summary: Allow `_lupdf` and `_lupmf` in the transformed parameters block. #### Description: For implementing prior sensitivity checks via power scaling (https://arxiv.org/abs/2107.14054; a paper together with @avehtari), we need to...
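For context, power scaling perturbs the prior p(θ) to p(θ)^α and reweights existing posterior draws; normalizing constants cancel under self-normalization, which is why the unnormalized `_lupdf`/`_lupmf` values suffice. A rough sketch of that reweighting step in plain C++ (the helper name is invented for illustration; this is not Stan code):

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <limits>
#include <vector>

// Self-normalized importance weights for the power-scaled posterior: draws
// from the original posterior get weights proportional to
// exp((alpha - 1) * log_prior). Only unnormalized log-prior values are
// needed, since constants cancel when the weights are normalized.
std::vector<double> power_scale_weights(const std::vector<double>& log_prior,
                                        double alpha) {
  std::vector<double> lw(log_prior.size());
  double max_lw = -std::numeric_limits<double>::infinity();
  for (std::size_t i = 0; i < log_prior.size(); ++i) {
    lw[i] = (alpha - 1.0) * log_prior[i];
    max_lw = std::max(max_lw, lw[i]);
  }
  double total = 0.0;
  for (double& w : lw) {  // exponentiate stably, then self-normalize
    w = std::exp(w - max_lw);
    total += w;
  }
  for (double& w : lw) w /= total;
  return lw;
}
```

At α = 1 the perturbation vanishes and every draw gets the same weight, which is a useful sanity check.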
#### Summary: CRAN's clang sanitiser tests (see https://cran.csiro.au/doc/manuals/r-patched/R-exts.html#Using-Address-Sanitizer) identified issues in one of my Stan models on CRAN. Tracing through the log file, it appears that...
#### Summary: In a non-centered normal distribution, if the standard deviation is initialized to a very low value, it collapses to an extremely low value and never recovers. Manual non-centering...
Hello, I translated a Stan model using stanc3 (CmdStan 2.28.1) and included the generated .hpp file in Xcode (macOS Mojave). I have Eigen 3.4 installed, along with SUNDIALS, Boost, and TBB 2020.3. I am getting...
#### Description: We need some easy-to-follow instructions on how to use core Stan inside a user-written C++ program. See https://github.com/stan-dev/stan/issues/3085 for an example. The instructions can simply guide...
#### Summary: The number of log prob gradient evaluations per sample is one greater than the reported `n_leapfrog` for that sample. This does not need to be so; the gradient...