a-PyTorch-Tutorial-to-Sequence-Labeling

Sorting before pack_padded_sequence is unnecessary

Open · eridgd opened this issue · 0 comments

As of PyTorch 1.1.0, the inputs to pack_padded_sequence do not need to be sorted by length if enforce_sorted=False:

https://pytorch.org/docs/master/nn.html#torch.nn.utils.rnn.pack_padded_sequence

enforce_sorted (bool, optional) – if True, the input is expected to contain sequences sorted by length in a decreasing order. If False, this condition is not checked. Default: True.
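A minimal sketch of the simplification, using toy shapes and an LSTM for illustration (the layer and dimensions are placeholders, not the tutorial's actual model):

```python
import torch
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

# A batch of 3 padded sequences (batch_first=True), deliberately NOT sorted by length.
padded = torch.randn(3, 5, 4)        # (batch, max_len, features)
lengths = torch.tensor([3, 5, 2])    # unsorted lengths

# With enforce_sorted=False, PyTorch sorts internally and remembers the
# permutation, so no manual sorting (or un-sorting afterwards) is needed.
packed = pack_padded_sequence(padded, lengths, batch_first=True,
                              enforce_sorted=False)

lstm = torch.nn.LSTM(input_size=4, hidden_size=8, batch_first=True)
out, _ = lstm(packed)

# Unpacking restores the original (unsorted) batch order and lengths.
unpacked, out_lengths = pad_packed_sequence(out, batch_first=True)
print(unpacked.shape)   # torch.Size([3, 5, 8])
print(out_lengths)      # tensor([3, 5, 2])
```

The only caveat is that ONNX export still requires sorted input, which is why `enforce_sorted` defaults to `True`.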

Thanks for the great tutorial!

eridgd · Jun 08 '19 21:06