
CrossBatchMemory warmup & queue initialization

Open · jhgan00 opened this issue 2 years ago · 4 comments

Hello, I have a question about CrossBatchMemory.

According to the paper (Wang et al., 2020):

As the feature drift is relatively large at the early epochs, we warm up the neural networks with 1k iterations, allowing the model to reach a certain local optimal field where the embeddings become more stable. Then we initialize the memory module M by computing the features of a set of randomly sampled training images with the warm-up model.

Is this kind of warm-up and queue initialization implemented in CrossBatchMemory? If so, how can I use it? If you have an example, it would be very helpful if you could provide it.

Thank you.

jhgan00 avatar May 30 '22 00:05 jhgan00

That isn't implemented in CrossBatchMemory. I suppose I could add it for convenience. Maybe it could be used like this:

from pytorch_metric_learning.losses import ContrastiveLoss, CrossBatchMemory

# embedding_size, num_iters, embeddings, and labels come from your training setup
loss_fn_xbm = CrossBatchMemory(ContrastiveLoss(), embedding_size, warmup_iters=1000)

for i in range(num_iters):
    loss = loss_fn_xbm(embeddings, labels, iter=i)

Then the forward function would do something like:

    def forward(self, embeddings, labels, indices_tuple=None, enqueue_idx=None, iter=None):
        # during warm-up, skip the memory bank and fall back to the wrapped loss
        if self.warmup_iters is not None and iter is not None and iter < self.warmup_iters:
            return self.loss(embeddings, labels)
        ...

I wouldn't implement the feature extraction loop in CrossBatchMemory. But I could add a helper function that does something like this:

# fill the memory with features from the warmed-up model (no gradients needed here)
for imgs, labels in dataloader:
    if loss_fn_xbm.has_been_filled:
        break
    embeddings = model(imgs)
    loss_fn_xbm.add_to_memory(embeddings, labels, len(embeddings))
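
In the meantime, here's a rough end-to-end sketch of the paper's procedure using the current API. It assumes model, train_loader, optimizer, and embedding_size are defined elsewhere, and it relies on the add_to_memory / has_been_filled attributes mentioned above:

import torch
from pytorch_metric_learning.losses import ContrastiveLoss, CrossBatchMemory

warmup_iters = 1000
inner_loss = ContrastiveLoss()
loss_fn_xbm = CrossBatchMemory(inner_loss, embedding_size, memory_size=1024)

# 1. Warm up with the plain loss so the embeddings stabilize before the queue is filled.
iteration = 0
for imgs, labels in train_loader:
    if iteration >= warmup_iters:
        break
    loss = inner_loss(model(imgs), labels)
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    iteration += 1

# 2. Initialize the memory with features from the warmed-up model (no parameter updates).
with torch.no_grad():
    for imgs, labels in train_loader:
        if loss_fn_xbm.has_been_filled:
            break
        embeddings = model(imgs)
        loss_fn_xbm.add_to_memory(embeddings, labels, len(embeddings))

# 3. Continue training with the cross-batch memory loss.
for imgs, labels in train_loader:
    loss = loss_fn_xbm(model(imgs), labels)
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

With the warmup_iters argument and a fill helper added to the library, steps 1 and 2 would collapse into a couple of lines.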

KevinMusgrave avatar May 30 '22 02:05 KevinMusgrave

Thank you for your reply!! I think it would be great if those features were added :)

jhgan00 avatar May 30 '22 02:05 jhgan00