
Allow efficient=True when using CrossBatchMemory in DistributedLossWrapper

Open · KevinMusgrave opened this issue 2 years ago · 0 comments

When efficient=True, the desired behavior is as follows (a sketch appears after this list):

  • All embeddings should be added to each rank's CrossBatchMemory.embedding_memory
  • Only the current rank's embeddings should be passed as the first argument to CrossBatchMemory.loss.forward()
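Below is a minimal sketch of how this behavior could be composed from the public CrossBatchMemory API, assuming the enqueue_mask argument of CrossBatchMemory.forward (available in recent versions, where rows marked True are enqueued and rows marked False are used as queries). The all_gather_cat and efficient_xbm_step helpers, the ContrastiveLoss choice, and the sizes are illustrative assumptions, not the DistributedLossWrapper implementation:

```python
import torch
import torch.distributed as dist
from pytorch_metric_learning import losses

def all_gather_cat(x):
    # Hypothetical helper: gather x from every rank and concatenate.
    # Gradients do not flow through remote copies, so the local slot is
    # replaced with the original tensor to keep the local gradient path.
    # (Assumes equal batch sizes on all ranks.)
    world_size = dist.get_world_size()
    gathered = [torch.zeros_like(x) for _ in range(world_size)]
    dist.all_gather(gathered, x.contiguous())
    gathered[dist.get_rank()] = x
    return torch.cat(gathered, dim=0)

# Placeholder inner loss and sizes for illustration only.
loss_fn = losses.CrossBatchMemory(
    loss=losses.ContrastiveLoss(), embedding_size=128, memory_size=1024
)

def efficient_xbm_step(local_embeddings, local_labels):
    # Hypothetical per-step helper implementing the two bullets above.
    all_embeddings = all_gather_cat(local_embeddings)
    all_labels = all_gather_cat(local_labels)
    # Assumed enqueue_mask semantics: True rows are only added to
    # embedding_memory; False rows are used only as queries.
    # So: enqueue every rank's embeddings, query with the local ones.
    embeddings = torch.cat([all_embeddings, local_embeddings], dim=0)
    labels = torch.cat([all_labels, local_labels], dim=0)
    enqueue_mask = torch.zeros(
        len(embeddings), dtype=torch.bool, device=embeddings.device
    )
    enqueue_mask[: len(all_embeddings)] = True
    return loss_fn(embeddings, labels, enqueue_mask=enqueue_mask)
```

In this composition the local embeddings are enqueued once via the gathered block, and the trailing local copy acts only as the query set, matching the two bullets above.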

KevinMusgrave · Mar 17 '22