Jingxiao Chen
I found 2 bugs in the Transformer-XL code in `layers.py`.

1. The `init_memory` function uses an incorrect shape.

https://github.com/dhruvramani/Transformers-RL/blob/337d84aebacc383cd2d1bbafdf05dce448ee9382/layers.py#L261-L268

```python
def init_memory(self, device=torch.device("cpu")):
    return [
        torch.empty(0, dtype=torch.float).to(device)
        for _ in range(self.n_layers +...
```
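For illustration, here is a minimal sketch of a shape-consistent memory initialization. This is not the repo's actual fix: it assumes the memories are later concatenated with hidden states of shape `(seq_len, batch, d_model)` along the sequence dimension, so each layer's memory should start as a 3-D tensor `(0, batch, d_model)` rather than a flat 1-D empty tensor. The class, its attribute names, and the `n_layers + 1` count (one extra slot for the input embedding) are assumptions for the sketch.

```python
import torch


class MemoryInitDemo:
    """Hypothetical sketch (not the repo's class) of a Transformer-XL-style
    memory init whose shape matches the hidden states it is merged with."""

    def __init__(self, n_layers: int, d_model: int):
        self.n_layers = n_layers
        self.d_model = d_model

    def init_memory(self, batch_size: int, device=torch.device("cpu")):
        # One memory tensor per layer plus one for the input embedding
        # (the "+ 1" is an assumption based on the truncated snippet).
        # Each starts empty along the sequence dimension but carries the
        # batch and model dimensions, so torch.cat along dim 0 works.
        return [
            torch.zeros(0, batch_size, self.d_model, device=device)
            for _ in range(self.n_layers + 1)
        ]


demo = MemoryInitDemo(n_layers=2, d_model=8)
mems = demo.init_memory(batch_size=4)
hidden = torch.randn(5, 4, 8)                 # (seq_len, batch, d_model)
merged = torch.cat([mems[0], hidden], dim=0)  # concatenates without error
```

With the original 1-D `torch.empty(0)` init, a `torch.cat` against a 3-D hidden state raises a dimension-mismatch error; starting the memory as `(0, batch, d_model)` avoids that.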