a-PyTorch-Tutorial-to-Sequence-Labeling
Sorting before pack_padded_sequence is unnecessary
As of PyTorch 1.1.0, the inputs to pack_padded_sequence no longer need to be sorted by length, provided enforce_sorted=False is passed:
https://pytorch.org/docs/master/nn.html#torch.nn.utils.rnn.pack_padded_sequence
enforce_sorted (bool, optional) – if True, the input is expected to contain sequences sorted by length in a decreasing order. If False, this condition is not checked. Default: True.
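As a minimal sketch of what this looks like (the tensor shapes, lengths, and LSTM sizes here are made up for illustration, not taken from the tutorial), packing an unsorted batch and unpacking it back in the original order could be done like this:

```python
import torch
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

# Toy batch: 3 padded sequences of feature size 5, true lengths 4, 2, 3 (not sorted).
padded = torch.randn(3, 4, 5)          # (batch, max_len, features), batch_first layout
lengths = torch.tensor([4, 2, 3])      # decreasing order NOT required

# With enforce_sorted=False (PyTorch >= 1.1.0), pack_padded_sequence sorts internally,
# so the manual sort-by-length step can be dropped.
packed = pack_padded_sequence(padded, lengths, batch_first=True, enforce_sorted=False)

lstm = torch.nn.LSTM(input_size=5, hidden_size=8, batch_first=True)
packed_out, _ = lstm(packed)

# pad_packed_sequence restores the original (unsorted) batch order.
output, out_lengths = pad_packed_sequence(packed_out, batch_first=True)
print(output.shape)    # torch.Size([3, 4, 8])
print(out_lengths)     # tensor([4, 2, 3])
```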
Thanks for the great tutorial!