practical-pytorch

Error during "Visualizing attention" in the tutorial

Open · KyonP opened this issue 7 years ago · 1 comment

I'm working through your seq2seq tutorial code.

The error pops up at these lines:

```python
output_words, attentions = evaluate("je suis trop froid .")
plt.matshow(attentions.numpy())
```

```
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
in ()
----> 1 output_words, attentions = evaluate("je suis trop froid .")
      2 plt.matshow(attentions.numpy())

in evaluate(sentence, max_length)
      1 def evaluate(sentence, max_length=MAX_LENGTH):
----> 2     input_variable = variable_from_sentence(input_lang, sentence)
      3     input_length = input_variable.size()[0]
      4
      5     # Run through encoder

in variable_from_sentence(lang, sentence)
      4
      5 def variable_from_sentence(lang, sentence):
----> 6     indexes = indexes_from_sentence(lang, sentence)
      7     indexes.append(EOS_token)
      8     var = Variable(torch.LongTensor(indexes).view(-1, 1))

in indexes_from_sentence(lang, sentence)
      1 # Return a list of indexes, one for each word in the sentence
      2 def indexes_from_sentence(lang, sentence):
----> 3     return [lang.word2index[word] for word in sentence.split(' ')]
      4
      5 def variable_from_sentence(lang, sentence):

KeyError: 'trop'
```

I haven't modified the code from the original version, and this error hasn't been reported yet, so I'm posting it here.
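A quick way to confirm whether the word is simply missing from the built vocabulary (assuming the tutorial's `input_lang` object is still in scope) is:

```python
# Check whether the offending word made it into the input-language vocabulary
print('trop' in input_lang.word2index)
```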

KyonP · Jan 15 '18 07:01

Maybe your 'eng-fra.txt' was modified by you or someone else, because that text file does not contain the sentence 'je suis trop froid .'
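If you want evaluation to tolerate words that never made it into the training vocabulary, a minimal sketch (assuming the tutorial's `Lang` class with its `word2index` dict) is to filter out unknown words instead of indexing them directly:

```python
# Sketch: skip out-of-vocabulary words rather than raising KeyError.
# Alternatively, map them to a dedicated UNK index if your Lang defines one.
def indexes_from_sentence(lang, sentence):
    return [lang.word2index[word]
            for word in sentence.split(' ')
            if word in lang.word2index]
```

Note that silently dropping words changes the input the model sees, so it is only a workaround; the underlying issue is that the word is missing from the data used to build the vocabulary.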

Michi-123 · Aug 14 '19 04:08