
Training on variable-length sequences

abedidev opened this issue 3 years ago · 1 comment

Is there any way to handle variable-length sequences and train TCN models on them? (Both the LSTM in PyTorch and the Keras implementation of TCN can handle variable-length sequences.)

abedidev avatar Jun 26 '21 21:06 abedidev

Outside of TCN itself, you can try Dynamic Time Warping (DTW): warp each sequence onto a common reference so that all of them end up the same length before training.
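A minimal sketch of that idea in plain NumPy, assuming 1-D sequences (the `dtw_path` and `warp_to_reference` names are just for illustration, not part of any library): compute the DTW alignment between a sequence and a reference, then project the sequence onto the reference's time axis so every example has the same length.

```python
import numpy as np

def dtw_path(a, b):
    """Classic O(len(a) * len(b)) dynamic-programming DTW.
    Returns the optimal warping path as a list of (i, j) index pairs."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j - 1],   # match
                                 cost[i - 1, j],       # insertion
                                 cost[i, j - 1])       # deletion
    # Backtrack from (n, m) to (0, 0), preferring the diagonal on ties.
    path, i, j = [], n, m
    while i > 0 or j > 0:
        path.append((i - 1, j - 1))
        moves = {(i - 1, j - 1): cost[i - 1, j - 1],
                 (i - 1, j): cost[i - 1, j],
                 (i, j - 1): cost[i, j - 1]}
        i, j = min(moves, key=moves.get)
    return path[::-1]

def warp_to_reference(seq, ref):
    """Resample `seq` onto the time axis of `ref`, so the result
    has exactly len(ref) samples regardless of len(seq)."""
    out = np.empty(len(ref))
    for i, j in dtw_path(seq, ref):
        out[j] = seq[i]  # later pairs overwrite earlier ones per ref index
    return out
```

Once every sequence is warped to the reference length, the batch can be stacked into a single fixed-shape tensor and fed to the TCN as usual. Note this is a sketch: for multivariate series you would swap `abs(a[i-1] - b[j-1])` for a vector distance, and for long sequences an O(n·m) DP table may be too slow without a windowing constraint.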

jacobf18 avatar Jul 06 '21 19:07 jacobf18