transformer-xl
Is this a bug in the PyTorch code's invocation of `_update_mems`?
At the call site of `_update_mems(self, hids, mems, qlen, mlen)`, the order of the `qlen` and `mlen` arguments appears to be swapped relative to the function definition. Is this a typo?

Call site: https://github.com/kimiyoung/transformer-xl/blob/44781ed21dbaec88b280f74d9ae2877f52b492a5/pytorch/mem_transformer.py#L733

Definition: https://github.com/kimiyoung/transformer-xl/blob/44781ed21dbaec88b280f74d9ae2877f52b492a5/pytorch/mem_transformer.py#L619
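For context, here is a minimal sketch of the slice-index arithmetic from the linked `_update_mems` body (the lengths `qlen`, `mlen`, `ext_len` below are illustrative values, not taken from the repo's configs). It suggests the swap is harmless in the default configuration but not in general: the memory window's `end_idx` is computed as `mlen + max(0, qlen - ext_len)`, which is symmetric in `qlen` and `mlen` whenever `ext_len == 0` (the default), but can differ once `ext_len > 0` and one of the lengths is smaller than `ext_len`.

```python
def end_idx(qlen: int, mlen: int, ext_len: int) -> int:
    # Mirrors the index computation in _update_mems:
    #   end_idx = mlen + max(0, qlen - self.ext_len)
    return mlen + max(0, qlen - ext_len)

# With the default ext_len == 0 the expression reduces to qlen + mlen,
# so swapping the two arguments at the call site changes nothing:
print(end_idx(36, 100, ext_len=0) == end_idx(100, 36, ext_len=0))  # True

# With ext_len > 0 and qlen < ext_len, the max() clamps asymmetrically,
# and the swapped call would produce a different slice boundary:
print(end_idx(5, 100, ext_len=10))   # 100 + max(0, -5)  = 100
print(end_idx(100, 5, ext_len=10))   # 5   + max(0, 90)  = 95
```

So even if the swap never bites with the released training configs, it seems worth fixing for anyone running with a nonzero `ext_len`.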
Same doubt here!