seq2seq
What's the exact Pytorch and Torchtext version for your code? I am trying to downgrade to a previous version in order to avoid the Multi30k.split() problem but failed.
I can't remember, sorry. I need to re-write this in PyTorch v1.0.
Here is another bug I hit in the code; do you have any idea how to fix it?
/workspace/seq2seq/model.py:47: UserWarning: Implicit dimension choice for softmax has been deprecated. Change the call to include dim=X as an argument.
energy = F.softmax(self.attn(torch.cat([hidden, encoder_outputs], 2)))
Traceback (most recent call last):
File "train.py", line 112, in
@yaoyiran you might want to remove the dim keyword arg from relu, add dim=2 in the softmax, and see if that resolves the issue. What version of PyTorch are you using?
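The fix above boils down to making the softmax's reduction axis explicit instead of relying on the deprecated implicit choice. A minimal sketch of what dim=2 means, using NumPy here rather than PyTorch and hypothetical tensor shapes (the real code operates on torch tensors of shape roughly (seq_len, batch, hidden)):

```python
import numpy as np

def softmax(x, axis):
    # Subtract the max for numerical stability, then normalize along `axis`.
    shifted = x - x.max(axis=axis, keepdims=True)
    exp = np.exp(shifted)
    return exp / exp.sum(axis=axis, keepdims=True)

# Hypothetical attention "energy" tensor: (seq_len, batch, hidden) = (5, 2, 3).
energy = np.random.rand(5, 2, 3)

# Equivalent of F.softmax(..., dim=2): normalize over the last axis,
# so every (seq, batch) slice sums to 1.
attn = softmax(energy, axis=2)
assert np.allclose(attn.sum(axis=2), 1.0)
```

Specifying the axis explicitly is exactly what the UserWarning is asking for; without it, older and newer PyTorch versions may pick different default dimensions, silently changing what the attention weights normalize over.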
After upgrading torchtext to 0.3.x you can solve this problem. I used to use 0.2.3 and ran into the same problem as you.
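If you want to pin that upgrade, something like the following should work (the exact version bounds are an assumption; check PyPI for the torchtext release that pairs with your PyTorch build):

```shell
# Upgrade torchtext into the 0.3.x series (version range is an assumption).
pip install --upgrade "torchtext>=0.3,<0.4"
```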
@pskrunner14 why not update your code in the repo? I also use your attention model and get the same error.
I use PyTorch 1.0.