
Set `torch.utils.data.DataLoader.batch_sampler` epoch if defined

Ghelfi opened this issue 11 months ago

🚀 Feature Request

The current implementation of the Composer trainer calls the `DistributedSampler.set_epoch` method only on the `DataLoader.sampler` attribute, but not on `DataLoader.batch_sampler`, even when one is defined. One example here.
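
For illustration, the behavior described above looks roughly like this (a simplified sketch, not Composer's actual source; the `update_epoch` helper name is hypothetical):

```python
from torch.utils.data import DataLoader
from torch.utils.data.distributed import DistributedSampler

def update_epoch(dataloader: DataLoader, epoch: int) -> None:
    # Current behavior as described: only the regular sampler is updated,
    # so a custom batch_sampler never learns about the new epoch.
    if isinstance(dataloader.sampler, DistributedSampler):
        dataloader.sampler.set_epoch(epoch)
```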

Motivation

When doing distributed training with a batch sampler, one might want the epoch to be properly set on the `batch_sampler`, since it is usually used to derive a new seed over time. This would be useful for metric learning, where a `batch_sampler` can be a worthwhile feature. For now, the Composer trainer only handles the regular sampler.

Implementation

I'll propose a PR with a technical implementation. The idea is to check whether `batch_sampler` is defined, in which case we set the epoch on it; otherwise we set it on the regular sampler (which is always defined on `torch.utils.data.DataLoader`).
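
A minimal sketch of that idea, assuming the custom batch sampler exposes a `set_epoch` method (the `set_dataloader_epoch` helper name is hypothetical, not part of Composer's API):

```python
from torch.utils.data import DataLoader
from torch.utils.data.distributed import DistributedSampler

def set_dataloader_epoch(dataloader: DataLoader, epoch: int) -> None:
    # Prefer the batch_sampler when one is defined: it drives batching and
    # typically owns the epoch-dependent shuffling/seeding.
    batch_sampler = dataloader.batch_sampler
    if batch_sampler is not None and hasattr(batch_sampler, "set_epoch"):
        batch_sampler.set_epoch(epoch)
    # Otherwise fall back to the regular sampler, which is always defined
    # on a torch.utils.data.DataLoader.
    elif isinstance(dataloader.sampler, DistributedSampler):
        dataloader.sampler.set_epoch(epoch)
```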

Ghelfi · Mar 18 '24