Jason Chiu
When `:backward()` is called on the network (e.g. on those LSTM modules), `grad_params` is updated automatically, because each module's gradient tensors are views into sub-arrays of `grad_params` — they share the same underlying storage, so no copying is needed. This was done...
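The mechanism can be illustrated with a small NumPy sketch (not actual Torch code — the names below are illustrative): a single flat gradient buffer, with each module's gradient tensor created as a view into a slice of it, the way Torch's `getParameters()` remaps module storages.

```python
import numpy as np

# Flat buffer, playing the role of grad_params from module:getParameters().
grad_params = np.zeros(6)

# Each "module" gradient tensor is a view into a sub-array of the buffer,
# analogous to Torch remapping each module's gradWeight/gradBias storage.
grad_w = grad_params[0:4].reshape(2, 2)  # view: shares memory with grad_params
grad_b = grad_params[4:6]                # view: shares memory with grad_params

# Simulate what :backward() does: accumulate gradients into module tensors.
grad_w += 1.0
grad_b += 2.0

# The flat buffer reflects the updates without any copy.
print(grad_params)  # -> [1. 1. 1. 1. 2. 2.]
```

Because the module tensors and `grad_params` alias the same memory, an optimizer can read (or zero) the whole flat buffer in one operation after a backward pass.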