Pytorch-seq2seq-Beam-Search

PyTorch implementation for Seq2Seq model with attention and Greedy Search / Beam Search for neural machine translation
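The repository's decoding strategies can be illustrated with a minimal, framework-free sketch of beam search. This is a hypothetical helper, not the repo's actual API: `step_fn`, `sos`, and `eos` are assumed names, and `step_fn(prefix)` is assumed to return a mapping from next token to probability (in the real model this would come from the decoder's softmax).

```python
import math

def beam_search(step_fn, sos, eos, beam_width=3, max_len=10):
    """Minimal beam search sketch (hypothetical helper, not the repo's API).

    step_fn(prefix) -> dict mapping next token to its probability.
    Returns the highest-scoring sequence, preferring ones ending in `eos`.
    """
    # Each beam entry: (cumulative log-probability, token sequence)
    beams = [(0.0, [sos])]
    finished = []
    for _ in range(max_len):
        candidates = []
        for logp, seq in beams:
            for tok, p in step_fn(seq).items():
                cand = (logp + math.log(p), seq + [tok])
                # Hypotheses that emit EOS move to the finished pool
                (finished if tok == eos else candidates).append(cand)
        if not candidates:
            break
        # Keep only the top-k partial hypotheses
        beams = sorted(candidates, reverse=True)[:beam_width]
    finished.extend(beams)
    return max(finished)[1]
```

Greedy search is the special case `beam_width=1`; a wider beam explores more hypotheses per step at proportionally higher cost.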

Issues (2)

As shown in the figure above, I set the length of my target sentence (including SOS, EOS, and padding) to 8, but at test time the model keeps returning large numbers of tokens of random length. How can I fix this?
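One common cause of variable-length outputs at test time is a decoding loop that always runs to `max_len` instead of halting once the model emits EOS. A minimal sketch of a greedy decoder that stops at EOS, under the same assumptions as above (`step_fn` is a hypothetical stand-in for the decoder's per-step distribution):

```python
def greedy_decode(step_fn, sos, eos, max_len=8):
    """Greedy decoding that halts at the first EOS token (sketch).

    step_fn(prefix) -> dict mapping next token to its probability.
    """
    seq = [sos]
    for _ in range(max_len - 1):
        probs = step_fn(seq)
        # Pick the single most probable next token
        tok = max(probs, key=probs.get)
        seq.append(tok)
        if tok == eos:
            break
    return seq
```

If the model still fails to emit EOS reliably, that usually points to training (e.g. the loss not covering the EOS position) rather than decoding.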

Hi 312shan, great work! I was analyzing your code and was curious about parts of the forward pass for the Decoder and Attention modules. In the forward function of the...