pymc-experimental
Add ETS model
Adds `BayesianETS` as an available statespace model.
According to this book, it is a good baseline model for forecasting. I don't have much experience with it, but statsmodels implements it, so I did too.
Todo:
- [ ] Write an example notebook
- [ ] Add a class method utility to decompose the time series into level, trend, and seasonal components
- [ ] Test time series generation against statsmodels
- [ ] Check whether the first state is correctly interpreted in the Kalman filter
The last point is a bit subtle. From the statsmodels implementation:
> One consequence is that the "initial state" corresponds to the "filtered" state at time t=0, but this is different from the usual state space initialization used in Statsmodels, which initializes the model with the "predicted" state at time t=1.
I have no idea whether this matters for the present implementation; I need to look at it more closely.
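For reference, the gap between the two conventions is just one prediction step: a "filtered" state at $t=0$ maps to the "predicted" state at $t=1$ through the transition equation, $x_{1|0} = T x_{0|0} + c$ and $P_{1|0} = T P_{0|0} T^\top + R Q R^\top$. A minimal sketch of that conversion (my own illustration with made-up numbers, not code from this PR):

```python
import numpy as np

# Local-level ETS transition matrices (see the derivation below);
# alpha and the initial state values here are arbitrary examples.
alpha = 0.3
T = np.array([[0.0, 0.0], [0.0, 1.0]])
R = np.array([[1.0], [alpha]])
Q = np.array([[1.0]])  # innovation variance, set to 1 for illustration

# Suppose the model hands us the "filtered" state at t=0 ...
x_filtered_0 = np.array([0.0, 10.0])  # [eps_0, level_0]
P_filtered_0 = np.eye(2)

# ... then the "predicted" state at t=1, which the usual statsmodels
# initialization expects, is one Kalman prediction step ahead:
x_pred_1 = T @ x_filtered_0
P_pred_1 = T @ P_filtered_0 @ T.T + R @ Q @ R.T
```

If the filter is initialized with the filtered-at-$t=0$ convention but interpreted as predicted-at-$t=1$ (or vice versa), the first-step likelihood contribution would be off by exactly this transformation.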
Some comments: In general I just follow the statsmodels implementation, and I'm testing that all the statespace matrices match those of statsmodels. The one difference is in the selection matrix. Looking at the book equations, for example:
$$ \begin{align} y_t &= \ell_t + \epsilon_t \\ \ell_t &= \ell_{t-1} + \alpha \epsilon_t \end{align} $$
The general statespace equation is:
$$ \begin{align} x_t &= T x_{t-1} + c + R \epsilon_t \\ y_t &= Z x_t + d + H \end{align} $$
This implies states:
$$x_t = \begin{bmatrix} \epsilon_t \\ \ell_t \end{bmatrix}$$
So:
$$T = \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix}, \qquad R = \begin{bmatrix} 1 \\ \alpha \end{bmatrix} $$
This all seems straightforward, but statsmodels sets the (0, 0) element of $R$ to be $1 - \alpha$. That really doesn't seem right, so I didn't do it.
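As a quick sanity check (my own sketch, not part of the PR), simulating the local-level recursion directly and through the matrices above with $R = [1, \alpha]^\top$ produces identical series. Here I assume $Z = [1, 1]$ so that $y_t = \epsilon_t + \ell_t$, matching the book equations as written:

```python
import numpy as np

rng = np.random.default_rng(42)
alpha = 0.3
eps = rng.normal(size=50)

# Direct ETS recursion: l_t = l_{t-1} + alpha * eps_t, y_t = l_t + eps_t
level = 5.0
y_direct = []
for e in eps:
    level = level + alpha * e
    y_direct.append(level + e)

# Statespace form: x_t = T x_{t-1} + R eps_t, y_t = Z x_t
T = np.array([[0.0, 0.0], [0.0, 1.0]])
R = np.array([[1.0], [alpha]])
Z = np.array([[1.0, 1.0]])  # assumed design vector picking eps_t + l_t
x = np.array([0.0, 5.0])    # [eps_0, level_0], matching the direct run
y_ss = []
for e in eps:
    x = T @ x + (R * e).ravel()
    y_ss.append((Z @ x).item())

assert np.allclose(y_direct, y_ss)
```

So the $[1, \alpha]^\top$ selection matrix does reproduce the book recursion exactly as stated; whatever statsmodels is doing with $1 - \alpha$ must reflect a different parameterization of the same model.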