
🚧 2019: Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context

jojonki opened this issue • 0 comments

Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context
Zihang Dai, Zhilin Yang, Yiming Yang, Jaime Carbonell, Quoc V. Le, Ruslan Salakhutdinov
ACL 2019 long paper. Code and pretrained models are available at this https URL.
https://arxiv.org/abs/1901.02860

— jojonki, Apr 03 '20