PyTorch-Encoding
About replace in ReLU and Dropout
I noticed that a lot of your code is written like this:
nn.ReLU(True),
nn.Dropout(0.1, False),
Why do you use the inplace option in the activation function but not in dropout? Does this have any special meaning?
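For context, here is a minimal sketch (assuming the standard `torch.nn` API) of what the two flags mean: `nn.ReLU(True)` sets `inplace=True`, so the activation overwrites its input tensor, while `nn.Dropout(0.1, False)` explicitly passes `inplace=False`, which is already the default, so dropout allocates a new output tensor:

```python
import torch
import torch.nn as nn

# Signatures: nn.ReLU(inplace=False), nn.Dropout(p=0.5, inplace=False)
relu = nn.ReLU(inplace=True)        # overwrites its input in place
drop = nn.Dropout(0.1, False)      # inplace=False: returns a new tensor

x = torch.randn(4, 8)
y = relu(x)
# In-place: output shares the same storage as the input
assert y.data_ptr() == x.data_ptr()

a = torch.randn(4, 8)
b = drop(a)
# Out-of-place: output lives in its own storage
assert b.data_ptr() != a.data_ptr()
```

One common reason for this pattern (not necessarily the author's) is that an in-place ReLU saves activation memory when the pre-activation tensor is not needed again, whereas an in-place op can raise an autograd error if the overwritten tensor is still required for the backward pass, so dropout is often left out-of-place to be safe.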