Pytorch-Sketch-RNN
Version-related, syntax, and hyperparameter changes
- Specifying the dropout parameter to nn.LSTM with one layer will not apply any dropout. You have to apply dropout separately using nn.Dropout on the hidden state, since the paper talks only about recurrent dropout, not input or output dropout (see the sketch after this list).
- The original implementation calls for a 0.9 keep probability, but yours uses 0.9 as the dropout probability. You need to change that in the hyperparameters class (also shown in the sketch below).
- F.softmax requires the dimension parameter dim to be specified (example below).
- Removed the unnecessary t() and squeeze() operations and replaced them with view() directly (see the sketch after this list).
- Fixed a missing closing parenthesis.
- Indexing a tensor that holds a single data element with [0] raises an error from PyTorch version 0.5 onwards, so replaced those accesses with item() (sketch below).
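A minimal sketch of how the dropout and keep-probability changes could look. The HParams/EncoderRNN names, the field name dropout, and the sizes (5-dimensional stroke input, 256 hidden units) are assumptions for illustration, not necessarily the repo's exact code:

```python
import torch
import torch.nn as nn

class HParams:
    def __init__(self):
        # the paper specifies a 0.9 keep probability, i.e. a 0.1 dropout probability
        self.dropout = 0.1          # assumption: field name used for illustration
        self.enc_hidden_size = 256  # assumption: illustrative value

hp = HParams()

class EncoderRNN(nn.Module):  # hypothetical module, for illustration only
    def __init__(self):
        super().__init__()
        # dropout= on a single-layer nn.LSTM has no effect (PyTorch warns about it),
        # so dropout is applied explicitly with nn.Dropout instead
        self.lstm = nn.LSTM(5, hp.enc_hidden_size, num_layers=1, bidirectional=True)
        self.dropout = nn.Dropout(hp.dropout)

    def forward(self, inputs, hidden_cell=None):
        _, (hidden, cell) = self.lstm(inputs, hidden_cell)
        hidden = self.dropout(hidden)  # dropout on the hidden state, as suggested above
        return hidden, cell
```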
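For the F.softmax change, the fix is simply naming the dimension explicitly; the variable names and shape below are illustrative:

```python
import torch
import torch.nn.functional as F

logits = torch.randn(1, 20)     # e.g. mixture weights for 20 Gaussians
# older code: F.softmax(logits) -> deprecation warning, ambiguous axis
pi = F.softmax(logits, dim=-1)  # dim must be given explicitly
```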
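The t()/squeeze() cleanup is of this flavor; the shape is an assumption, not the repo's actual tensor:

```python
import torch

y = torch.randn(1, 128)       # e.g. an output with a singleton leading dimension

# before: transpose, then squeeze away the singleton dimension
flat_old = y.t().squeeze()

# after: a single view() producing the same 1-D tensor
flat_new = y.view(-1)

assert torch.equal(flat_old, flat_new)
```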
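And the single-element indexing change, roughly:

```python
import torch

loss = torch.tensor(0.42)  # scalar (0-dim) tensor, as losses are in newer PyTorch

# before: loss[0] raises "invalid index of a 0-dim tensor" on newer versions
value = loss.item()        # extract the Python number instead
print(value)
```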