keras-tcn
exchange_rate.txt
What do the 8 columns in exchange_rate.txt represent? Where did this data come from? Thank you for your response.
@JonathanHuangC Good question. It comes from this repo. I've transferred your question there:
https://github.com/laiguokun/multivariate-time-series-data/issues/7
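As a quick sanity check (not from the thread), the file from that repo can be loaded and inspected as sketched below. I'm assuming exchange_rate.txt is a comma-separated text file with no header row, one row per day and one column per series.

import numpy as np

# Assumption: exchange_rate.txt is comma-separated, no header, one row per day.
data = np.loadtxt('exchange_rate.txt', delimiter=',')
print(data.shape)  # expected: (n_days, 8)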
@philipperemy Is there any example of using the TCN to extract features? For example, compressing an exchange-rate series of shape [1000 * 8] into [1 * 8], where 1000 is the number of days and 8 is the number of features. Thank you very much for your reply.
You mean some form of auto-encoding? You can just search for LSTM feature extraction, swap the LSTM class for the TCN class, and it should work.
You mean some form of auto-encoding? -> Yes, the idea is to use the TCN to implement the autoencoder.
Do you mean that it is enough to change LSTM to TCN? Sorry for asking so many questions.
Thank you again for your reply.
LSTM autoencoder

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, RepeatVector, TimeDistributed, Dense

# Define the model
model = Sequential()
# Encoder step: compress each sequence into a single 15-dimensional vector
model.add(LSTM(15, input_shape=(X_train.shape[1], X_train.shape[2]), activation='relu'))
model.add(RepeatVector(X_train.shape[1]))
# Decoder step: reconstruct the full sequence from the repeated vector
model.add(LSTM(15, activation='relu', return_sequences=True))
model.add(TimeDistributed(Dense(X_train.shape[2])))

model.compile(optimizer='adam', loss='mse')
history = model.fit(X_train, X_train, epochs=_epochs, batch_size=_batch_size, validation_split=_validation_split, callbacks=callback)
TCN autoencoder

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import RepeatVector, TimeDistributed, Dense
from tcn import TCN

# Define the model
model = Sequential()
# Encoder step: TCN with return_sequences=False yields one 15-dim vector per sequence
model.add(TCN(15, input_shape=(X_train.shape[1], X_train.shape[2]), activation='relu'))
model.add(RepeatVector(X_train.shape[1]))
# Decoder step: TCN with return_sequences=True yields one vector per timestep
model.add(TCN(15, activation='relu', return_sequences=True))
model.add(TimeDistributed(Dense(X_train.shape[2])))

model.compile(optimizer='adam', loss='mse')
history = model.fit(X_train, X_train, epochs=_epochs, batch_size=_batch_size, validation_split=_validation_split, callbacks=callback)
Yeah, it's as easy as swapping the class.
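To actually obtain the compressed representation the original question asked about, one way (not shown in the thread) is to build a second model that stops at the trained encoder layer. The names `encoder` and `features` below are illustrative; to get a [1 * 8] vector per sequence, the first TCN's filter count would be set to 8 instead of 15.

from tensorflow.keras.models import Model

# The first layer (the TCN with return_sequences=False) is the bottleneck:
# it maps each (timesteps, n_features) sequence to a single vector of size nb_filters.
encoder = Model(inputs=model.inputs, outputs=model.layers[0].output)

# X_train has shape (n_samples, timesteps, n_features), e.g. (N, 1000, 8).
features = encoder.predict(X_train)  # shape: (n_samples, nb_filters)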