
exchange_rate.txt

JonathanHuangC opened this issue · 5 comments

What do the 8 columns in exchange_rate.txt represent? Where did this data come from? Thank you for your response.

JonathanHuangC avatar Oct 09 '23 16:10 JonathanHuangC

@JonathanHuangC Good question. It comes from this repo. I've transferred your question.

https://github.com/laiguokun/multivariate-time-series-data/issues/7

philipperemy avatar Oct 13 '23 01:10 philipperemy

@philipperemy Is there any example of using a TCN to extract features? For example, compressing an exchange-rate series of shape [1000 * 8] into [1 * 8], where 1000 is the number of days and 8 is the number of features. Thank you very much for your reply.

JonathanHuangC avatar Oct 20 '23 04:10 JonathanHuangC

You mean some form of auto encoding? You can just search for LSTM feature extraction, swap the LSTM class with the TCN class, and it should work.

philipperemy avatar Oct 20 '23 08:10 philipperemy
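
For reference, a minimal sketch of that kind of feature extraction with the TCN layer. Everything here (the random data, the shapes, and nb_filters=8) is a placeholder chosen only to match the [1000 * 8] -> [1 * 8] example above, not code from this repo:

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tcn import TCN

# Hypothetical data: one sample of 1000 days x 8 exchange-rate columns.
x = np.random.rand(1, 1000, 8).astype("float32")

# With return_sequences=False (the default), the TCN layer collapses the
# time axis, so the output is one vector per sequence: (1, nb_filters).
extractor = Sequential([
    TCN(nb_filters=8, input_shape=(1000, 8), return_sequences=False),
])

features = extractor.predict(x)
print(features.shape)  # (1, 8)
```

Here nb_filters=8 is picked only so the output matches the [1 * 8] shape in the question; in practice you would choose it like any other bottleneck size.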

You mean some form of auto encoding? -> Yes, I mean using a TCN to implement the autoencoder.

Do you mean that it is enough to change LSTM to TCN? Sorry for asking so many questions.

Thank you again for your reply.

LSTM autoencoder

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, RepeatVector, TimeDistributed, Dense

# define model
model = Sequential()

# Encoder step: compress the input sequence into a single hidden vector
model.add(LSTM(15, input_shape=(X_train.shape[1], X_train.shape[2]), activation='relu'))
model.add(RepeatVector(X_train.shape[1]))

# Decoder step: reconstruct the original sequence from the hidden vector
model.add(LSTM(15, activation='relu', return_sequences=True))
model.add(TimeDistributed(Dense(X_train.shape[2])))

model.compile(optimizer='adam', loss='mse')

history = model.fit(X_train, X_train, epochs=_epochs, batch_size=_batch_size,
                    validation_split=_validation_split, callbacks=callback)
```

TCN autoencoder

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import RepeatVector, TimeDistributed, Dense
from tcn import TCN

# define model (same structure, with the LSTM layers swapped for TCN)
model = Sequential()

# Encoder step
model.add(TCN(15, input_shape=(X_train.shape[1], X_train.shape[2]), activation='relu'))
model.add(RepeatVector(X_train.shape[1]))

# Decoder step
model.add(TCN(15, activation='relu', return_sequences=True))
model.add(TimeDistributed(Dense(X_train.shape[2])))

model.compile(optimizer='adam', loss='mse')

history = model.fit(X_train, X_train, epochs=_epochs, batch_size=_batch_size,
                    validation_split=_validation_split, callbacks=callback)
```

JonathanHuangC avatar Oct 20 '23 08:10 JonathanHuangC

Yeah, it's as easy as swapping the class.

philipperemy avatar Oct 20 '23 08:10 philipperemy
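
For completeness, a short sketch of how the compressed representation could be read out of the autoencoder above once it is trained. This assumes the Sequential TCN model from the previous comment and a standard TF2/Keras setup:

```python
from tensorflow.keras.models import Model

# Assuming `model` is the trained TCN autoencoder from above, the encoder
# output (one compressed vector per input sequence) can be taken from the
# first layer: shape (n_samples, nb_filters), i.e. (n, 15) here.
encoder = Model(inputs=model.inputs, outputs=model.layers[0].output)
compressed = encoder.predict(X_train)
```

Depending on the Keras version, you may need to rebuild the encoder explicitly from an Input layer rather than slicing the Sequential model like this.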