combine the stochastic volatility example into a single file
TODO
- [ ] write an introduction (is this indirect inference?)
- [ ] document the motivation and how the model works (is it basically the one in the Corsi-Folvio paper?)
- [ ] benchmark, profile, and optimize the code, aiming to minimize allocations
- [ ] replace `lags` and `ma` with library functions if possible
- [ ] figure out what would constitute a test
@mcreel, here is the PR, currently a draft. I just moved everything to a single file, but I lack context on the model; if you could contribute explanations, that would be great.
This needs to be much faster to be included as an example; currently it takes ages in CI.
Thanks for looking at this.
To make the example run, I needed to add `using TransformedLogDensities`.
About adding explanation, I suggest adding the following:
# We estimate a simple discrete time stochastic volatility model, using
# several simulated moment conditions. The likelihood used to create the
# posterior is the asymptotic Gaussian likelihood of the moment conditions.
# Thus, this implements a Bayesian version of the method of simulated
# moments. Theoretical support for the method is
# Chernozhukov, Victor, and Han Hong. "An MCMC approach to classical estimation."
# Journal of Econometrics 115, no. 2 (2003): 293-346.
# An example of MH MCMC sampling for the same model and moment conditions is in
# Creel, Michael. 2021. "Inference Using Simulated Neural Moments"
# Econometrics 9, no. 4: 35. https://doi.org/10.3390/econometrics9040035
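To make the description above concrete, here is a minimal sketch of the asymptotic Gaussian quasi-log-likelihood of the moment conditions (the function name and signature are illustrative, not the code in the example): given the data moments and a matrix of simulated moment vectors at a candidate parameter, the quasi-log-likelihood is a Gaussian quadratic form in the discrepancy between the two.

```julia
# Sketch of the method-of-simulated-moments quasi-log-likelihood.
# m_data : k-vector of moments computed from the observed data
# m_sim  : S×k matrix, one row of simulated moments per simulation draw
# The quasi-log-likelihood is -0.5 * g' * Σ⁻¹ * g, where g is the
# difference between data moments and mean simulated moments, and Σ is
# the covariance of the simulated moments.
using Statistics, LinearAlgebra

function msm_loglik(m_data::Vector{Float64}, m_sim::Matrix{Float64})
    m̄ = vec(mean(m_sim, dims=1))   # mean simulated moments
    Σ = cov(m_sim)                  # moment covariance across simulations
    g = m_data .- m̄
    return -0.5 * dot(g, Σ \ g)
end
```

The quadratic form is maximized (at zero) when the data moments match the mean simulated moments, which is what drives the posterior toward parameters that reproduce the observed moments.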
Regarding the run time, I agree that it is way too long for a useful example. I think this used to run considerably faster, but it has been a long time since I wrote the code, and I may not remember correctly. It would be nice to be able to do the method of simulated moments using Hamiltonian MC, but the need to fix the random draws to get continuity seems problematic for run time. No doubt reducing allocations would help, but I don't know if the effort is worth it for an example.
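To illustrate the "fix random draws to get continuity" point: if the shocks are drawn once, outside the sampler, and reused at every parameter value, the simulator becomes a deterministic, continuous function of the parameters, which is what HMC needs to differentiate through. A minimal sketch, assuming a standard discrete-time SV form (h_t = ρ h_{t-1} + σ u_t, y_t = exp(h_t/2) ε_t — an assumption on my part, not necessarily the exact model in the example):

```julia
using Random

# Simulate a return series from an assumed simple SV model, reusing
# pre-drawn shocks so the output is a deterministic function of θ.
function simulate_sv(θ, shocks_u, shocks_ε)
    ρ, σ = θ
    T = length(shocks_u)
    h = 0.0
    y = Vector{Float64}(undef, T)
    for t in 1:T
        h = ρ * h + σ * shocks_u[t]       # latent log-volatility
        y[t] = exp(h / 2) * shocks_ε[t]   # observed return
    end
    return y
end

rng = MersenneTwister(42)        # shocks are drawn once, outside the sampler
u = randn(rng, 500)
ε = randn(rng, 500)
y1 = simulate_sv((0.9, 0.3), u, ε)
y2 = simulate_sv((0.9, 0.3), u, ε)   # same θ, same shocks → identical path
```

The cost is that every evaluation of the quasi-likelihood re-runs the full simulation, which is one reason run time grows quickly with the number of simulation draws.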
For reference, code that estimates the same model, and uses almost identical raw moments, but which filters the moments through a neural net, is at https://github.com/mcreel/SimulatedNeuralMoments.jl/blob/main/examples/SV/SVexample.jl. That code uses Turing and advanced MH, and runs in about 30 seconds on 4 threads, for 5000 draws from the posterior. The comparison is not exactly fair, because that method does not require a differentiable likelihood, and because of the use of the neural net to reduce the dimension of the moments.
Thanks, I will incorporate the explanation, and do some optimization when time permits.
> The comparison is not exactly fair
The relevant comparison is effective sample size (ESS) per unit of time.
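For anyone unfamiliar with the metric: ESS discounts the raw draw count by the chain's autocorrelation, so ESS/time is comparable across samplers with different per-draw costs. A naive sketch (for real analyses a proper estimator, e.g. from MCMCDiagnosticTools.jl, should be used):

```julia
using Statistics

# Naive ESS estimate: N / (1 + 2 Σ ρ_k), summing lag autocorrelations
# until the first non-positive one (a simplified Geyer-style truncation).
function naive_ess(x::Vector{Float64})
    N = length(x)
    μ, v = mean(x), var(x)
    v == 0 && return Float64(N)
    s = 0.0
    for k in 1:(N - 1)
        ρ = sum((x[t] - μ) * (x[t + k] - μ) for t in 1:(N - k)) / ((N - 1) * v)
        ρ <= 0 && break
        s += ρ
    end
    return N / (1 + 2s)
end
```

A strongly autocorrelated chain (e.g. a slow random walk) can have an ESS orders of magnitude below its draw count, which is why raw draws-per-second is misleading.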
About ESS, I agree. For the other method, the ESS for the worst parameter is 269 in 23 seconds, for 5000 draws. That improved over the last few releases, but I forgot to update the README.