TCN
Training on variable-length sequences
Is there any way to handle variable-length sequences and train TCN models on them? (Both the LSTM in PyTorch and the Keras implementation of TCN can handle variable-length sequences.)
Outside of TCN itself, you can try Dynamic Time Warping (DTW) to align the sequences to a common length before training.
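The DTW suggestion above can be sketched as follows: compute the optimal warping path between a reference sequence and a shorter one, then resample the shorter sequence onto the reference's time axis so both have the same length. This is a minimal pure-NumPy illustration; `dtw_path`, `warp_to_reference`, and the toy sequences are hypothetical names for this sketch, not part of any TCN library.

```python
import numpy as np

def dtw_path(a, b):
    """Classic DTW dynamic program: return the optimal warping path
    between 1-D sequences a and b as a list of (i, j) index pairs."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j - 1],
                                 cost[i - 1, j],
                                 cost[i, j - 1])
    # Backtrack from (n, m) to recover the alignment.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]

def warp_to_reference(seq, ref):
    """Resample `seq` to len(ref) by averaging the values that DTW
    aligns to each index of the reference sequence."""
    path = dtw_path(ref, seq)
    out = np.zeros(len(ref))
    counts = np.zeros(len(ref))
    for i, j in path:
        out[i] += seq[j]
        counts[i] += 1
    return out / counts  # every ref index appears in the path, so counts > 0

ref = np.array([0.0, 1.0, 2.0, 3.0, 2.0, 1.0, 0.0])
short = np.array([0.0, 2.0, 3.0, 1.0])
aligned = warp_to_reference(short, ref)
print(len(aligned))  # same length as ref: 7
```

Once all sequences are warped to a common length, they can be stacked into a fixed-shape batch for the TCN. Note that warping distorts the time axis, so simple zero-padding (with a mask or a sample-weighted loss) is often the less invasive alternative.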