memorizing-transformers-pytorch

Support for Multi-GPU training?

Open Victorwz opened this issue 1 year ago • 0 comments

Thank you so much for the great implementation. I would like to ask whether your implementation of Memorizing Transformer supports multi-GPU distributed training, as in the original paper. If the MemorizingTransformer model is replicated across GPUs (e.g. with DDP), then every GPU holds its own memory backed by its own faiss retrieval index. As a result, each replica maintains a separate memory database and retrieval index, which differs from the original paper, where I believe all replicas should share the same retrieval context. This confuses me a lot.
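To make the concern concrete, here is a minimal sketch (not taken from the repo) of the setup I have in mind: the model is wrapped with vanilla PyTorch DistributedDataParallel, so each rank constructs its own model instance and therefore its own KNN memory. The constructor arguments follow the repo README and may differ slightly between versions.

```python
# Rough sketch, assuming the MemorizingTransformer constructor from the README.
# Each process builds its own model, so each rank also ends up with its own
# KNN memory / faiss index -- nothing about the memory is shared across ranks.
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from memorizing_transformers_pytorch import MemorizingTransformer

def setup(rank: int, world_size: int):
    dist.init_process_group("nccl", rank=rank, world_size=world_size)
    torch.cuda.set_device(rank)

def build_model(rank: int):
    model = MemorizingTransformer(
        num_tokens = 20000,
        dim = 512,
        depth = 8,
        memorizing_layers = (4, 5),   # layers that attend over the KNN memory
        max_knn_memories = 64000,     # capacity of the per-rank memory
        num_retrieved_memories = 32,
    ).cuda(rank)
    # DDP synchronizes gradients, but not the per-rank retrieval memories.
    return DDP(model, device_ids=[rank])
```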

Thank you so much for your time. Looking forward to your response!

Victorwz · Aug 16 '22 15:08