tensorflow-cnn-time-series
Feeding images of time series to Conv Nets! (TensorFlow + Keras)
Using CNN on 2D Images of Time Series
Because too often, time series are fed as 1-D vectors to Recurrent Neural Networks (SimpleRNN, LSTM, GRU...).
Will this time series go up or down in the next time frame?
Which plot contains highly correlated time series?
Possible advantages/drawbacks of such an approach:
Advantages
- Almost no pre-processing: feed raw pixels (be mindful of the image resolution, though)!
- Several time series can share the same plot, or be drawn on separate plots whose images are concatenated.
- Conv Nets have a reputation for being more stable than Recurrent Neural Networks on many tasks (WaveNet is just one example).
- No vanishing/exploding gradients (although LSTMs already mitigate this issue).
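To make the "feed raw pixels" idea concrete, here is a minimal NumPy-only sketch that rasterizes a 1-D series into a 2-D image (the repository itself renders plots with matplotlib; `series_to_image` and its sizes are assumptions for illustration). It also shows how sparse such an input is: one lit pixel per column.

```python
import numpy as np

def series_to_image(series, height=64, width=64):
    """Rasterize a 1-D series into a binary 2-D image of raw pixels.

    Hypothetical stand-in for saving a matplotlib plot as an image.
    """
    series = np.asarray(series, dtype=float)
    # Resample to one value per pixel column.
    cols = np.interp(np.linspace(0, len(series) - 1, width),
                     np.arange(len(series)), series)
    # Map values to row indices; larger values sit higher in the image.
    lo, hi = cols.min(), cols.max()
    if hi == lo:
        rows = np.zeros(width, dtype=int)
    else:
        rows = ((cols - lo) / (hi - lo) * (height - 1)).astype(int)
    img = np.zeros((height, width), dtype=np.float32)
    img[height - 1 - rows, np.arange(width)] = 1.0  # one pixel per column
    return img

img = series_to_image(np.cumsum(np.random.randn(200)))
print(img.shape)  # (64, 64)
print(img.sum())  # 64.0 -> only 64 of 4096 pixels are non-zero
```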
Drawbacks
- The input is much bigger than a 1-D vector, and very sparse.
- Training will undoubtedly be slower.
- Very big conv nets can also be hard to train (VGG19 is one example).
Let's get started!
Fake data generation
git clone https://github.com/philipperemy/tensorflow-cnn-time-series.git
cd tensorflow-cnn-time-series/
sudo pip3 install -r requirements.txt
python3 generate_data.py
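For intuition, here is a hedged sketch of what a generator like generate_data.py could do for the UP/DOWN task: draw a random walk and label it by whether the next step moves up or down (the function name and lengths are assumptions; the actual script may differ).

```python
import numpy as np

def make_example(length=100, rng=np.random):
    """Generate one random walk and its UP/DOWN label.

    Hypothetical sketch: the label says whether the series moves up
    or down in the next time frame.
    """
    walk = np.cumsum(rng.randn(length + 1))
    series, next_value = walk[:-1], walk[-1]
    label = 'UP' if next_value > series[-1] else 'DOWN'
    return series, label

series, label = make_example()
print(len(series), label)
```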
Start the training of the CNN (AlexNet is used here)
python3 alexnet_run.py
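The repository trains an AlexNet; as a hedged, much smaller stand-in, a Keras conv net mapping a plot image to the two classes might look like this (layer sizes and the 64x64 grayscale input are assumptions, not the repository's actual architecture):

```python
from tensorflow.keras import layers, models

def build_cnn(input_shape=(64, 64, 1), num_classes=2):
    """Tiny conv-net sketch: image of a time series in, UP/DOWN out."""
    return models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(16, 3, activation='relu'),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation='relu'),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation='relu'),
        layers.Dense(num_classes, activation='softmax'),
    ])

model = build_cnn()
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
print(model.output_shape)  # (None, 2)
```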
Toy example: Binary classification of images of time series
We consider the following binary classification problem of time series:
- UP: if the time series goes up in the next time frame.
- DOWN: if the time series goes down in the next time frame.
Because a pure random walk cannot be classified into two distinct classes, we expect about 50% accuracy on the test set, and the model to overfit the training set. Here are some examples fed to the conv net:


Keep in mind that LSTM is also good!
python3 lstm_keras.py # on correlation classification task
[...]
[test] loss= 0.021, acc= 100.00
[test] loss= 0.004, acc= 100.00
[test] loss= 0.004, acc= 100.00
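For comparison with lstm_keras.py, a minimal Keras LSTM baseline on the raw 1-D representation might look like the sketch below (the 100-step, 2-series input shape is an assumption; on the correlation task, two series are fed side by side as features instead of being rendered into one image):

```python
from tensorflow.keras import layers, models

def build_lstm(timesteps=100, features=2, num_classes=2):
    """LSTM baseline sketch: raw (timesteps, features) sequence in."""
    return models.Sequential([
        layers.Input(shape=(timesteps, features)),
        layers.LSTM(32),
        layers.Dense(num_classes, activation='softmax'),
    ])

model = build_lstm()
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
print(model.output_shape)  # (None, 2)
```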