skiprnn_pytorch
Can this code deal with variable-length sequence inputs?
Hi,
Thanks for this good work!
I am using Skip-LSTM in my experiments now, and it seems to work well.
However, I am wondering how this code can deal with variable-length sequence inputs.
When using RNN/LSTM in PyTorch, we can use torch.nn.utils.rnn.pack_padded_sequence and torch.nn.utils.rnn.pad_packed_sequence so that the model does not treat the padding vectors as real input.
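For reference, this is the standard pattern I mean with the built-in nn.LSTM (a minimal sketch; the shapes and lengths are just illustrative):

```python
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

# Built-in nn.LSTM accepts a PackedSequence directly
lstm = nn.LSTM(input_size=8, hidden_size=16, batch_first=True)

# Batch of 3 sequences padded to length 5; true lengths are 5, 3, 2
x = torch.randn(3, 5, 8)
lengths = torch.tensor([5, 3, 2])

# Pack so the LSTM skips the padded timesteps entirely
packed = pack_padded_sequence(x, lengths, batch_first=True, enforce_sorted=True)
packed_out, (h, c) = lstm(packed)

# Unpack back to a padded tensor (padded positions are zeros)
out, out_lengths = pad_packed_sequence(packed_out, batch_first=True)
print(out.shape)  # torch.Size([3, 5, 16])
```

Since the Skip-LSTM here is implemented as a custom cell rather than nn.LSTM, I assume it cannot consume a PackedSequence directly, which is why I am asking about alternatives.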
Are there any alternative ways to do the same thing? Thanks a lot!