a-PyTorch-Tutorial-to-Sequence-Labeling
the dimension of cmap_lengths
In models.py, line 184, the docstring says:
:param cmap_lengths: character sequence lengths, a tensor of dimensions (batch_size, word_pad_len)
Shouldn't the shape of cmap_lengths be (batch_size)?
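
For context, here is a minimal sketch of what I would expect, assuming cmap_lengths stores a single character-sequence length for each sentence in the batch (the values below are made up, not taken from the repo):

```python
import torch

# Hypothetical illustration: if cmap_lengths holds one character-sequence length
# per sentence, it is a 1-D tensor of shape (batch_size),
# not (batch_size, word_pad_len).
batch_size = 4
cmap_lengths = torch.tensor([17, 23, 9, 31])  # one length per sentence in the batch
assert cmap_lengths.shape == (batch_size,)
print(cmap_lengths.shape)  # torch.Size([4])
```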