x-transformers
Can you share the code implementation of the article "Augmenting Self-Attention with Persistent Memory"?
Looking forward to your help!
@1005183361 oh hey, it should be in the repository already https://github.com/lucidrains/x-transformers#augmenting-self-attention-with-persistent-memory
just follow this thread of logic https://github.com/lucidrains/x-transformers/blob/main/x_transformers/x_transformers.py#L488
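For anyone landing here later: the linked code realizes persistent memory as learned key/value vectors (the `num_mem_kv` option) that get concatenated onto the projected keys and values before attention. Below is a minimal standalone sketch of that idea, not the actual x-transformers code; the class name and defaults here are made up for illustration.

```python
import torch
import torch.nn.functional as F
from torch import nn

class PersistentMemoryAttention(nn.Module):
    """Self-attention with learned 'persistent' key/value slots
    prepended to the sequence, per Sukhbaatar et al. (minimal sketch)."""

    def __init__(self, dim, heads=8, dim_head=64, num_mem_kv=4):
        super().__init__()
        self.heads = heads
        self.scale = dim_head ** -0.5
        inner = heads * dim_head
        self.to_qkv = nn.Linear(dim, inner * 3, bias=False)
        self.to_out = nn.Linear(inner, dim, bias=False)
        # persistent memory: learned k/v vectors shared across all positions
        self.mem_k = nn.Parameter(torch.randn(heads, num_mem_kv, dim_head))
        self.mem_v = nn.Parameter(torch.randn(heads, num_mem_kv, dim_head))

    def forward(self, x):
        b, n, _ = x.shape
        h = self.heads
        q, k, v = self.to_qkv(x).chunk(3, dim=-1)
        # (b, n, h*d) -> (b, h, n, d)
        q, k, v = (t.reshape(b, n, h, -1).transpose(1, 2) for t in (q, k, v))
        # expand memory over the batch and prepend to keys/values
        mk = self.mem_k.unsqueeze(0).expand(b, -1, -1, -1)
        mv = self.mem_v.unsqueeze(0).expand(b, -1, -1, -1)
        k = torch.cat((mk, k), dim=2)
        v = torch.cat((mv, v), dim=2)
        # standard scaled dot-product attention over (memory + sequence)
        attn = (q @ k.transpose(-2, -1) * self.scale).softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(b, n, -1)
        return self.to_out(out)

x = torch.randn(2, 16, 512)
out = PersistentMemoryAttention(512)(x)  # shape (2, 16, 512)
```

The paper also drops the feedforward block and lets these memory slots absorb its role; the sketch above only shows the attention-side change.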
Thank you for your contributions and answers!