Pre-training with LAION

Open aries-young opened this issue 1 year ago • 0 comments

Hello, I have a question about pre-training with LAION: in pretrain.py, I see that the LAION data is added to the dataloader as in the following snippet. Does this mean the LAION data is split into subsets, and for each epoch a different subset is added to COCO+VG+CC+SBU to form that epoch's training data?

def train(model, data_loader, optimizer, epoch, device, config):
    # ...

    if config['laion_path']:
        data_loader.dataset.reload_laion(epoch)

    # ...
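
For context, one plausible reading of the snippet is that the LAION annotations are stored as multiple shard files and that reload_laion swaps in one shard per epoch, while the COCO+VG+CC+SBU annotations stay fixed. The sketch below illustrates that idea only; the class name, file layout, and JSON annotation format are assumptions for illustration, not the actual BLIP dataset code.

    import glob
    import json
    import os

    from torch.utils.data import Dataset


    class PretrainDatasetSketch(Dataset):
        """Minimal sketch: the COCO+VG+CC+SBU annotations are loaded once,
        and one LAION shard (assumed to be a JSON file) is swapped in per epoch."""

        def __init__(self, ann_pretrain, laion_path):
            # ann_pretrain: list of annotation dicts for COCO+VG+CC+SBU (assumed format)
            self.ann_pretrain = ann_pretrain
            self.laion_files = sorted(glob.glob(os.path.join(laion_path, '*.json')))
            self.annotation = self.ann_pretrain  # extended with a LAION shard by reload_laion

        def reload_laion(self, epoch):
            # Cycle through the LAION shards: epoch 0 -> shard 0, epoch 1 -> shard 1, ...
            shard = self.laion_files[epoch % len(self.laion_files)]
            with open(shard, 'r') as f:
                ann_laion = json.load(f)
            # Training data for this epoch = fixed pretrain annotations + current LAION shard
            self.annotation = self.ann_pretrain + ann_laion

        def __len__(self):
            return len(self.annotation)

        def __getitem__(self, index):
            return self.annotation[index]

Under this reading, each epoch trains on the full COCO+VG+CC+SBU set plus one LAION shard, so successive epochs gradually cover the LAION data rather than loading it all at once.
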

aries-young · Jul 06 '23 08:07