vggt
How to limit dataset usage
Hello
I am trying to fine-tune the vggt model with multiple datasets and was wondering how I can limit how much of each dataset is used per epoch, for example 10% of each dataset per epoch. At each epoch the dataloader would randomly pick 10% of the whole dataset and train the model with that. This would help increase fine-tuning speed given the limited number of GPUs I have (4).
Thank you.
Best
Hi, for this purpose I would suggest setting len_train for each dataset manually.
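As a rough illustration of the same idea outside the vggt config (this is a minimal sketch, not vggt's actual training loop; the datasets, `subset_loader`, and the 10% fraction are placeholders), recent PyTorch versions let `RandomSampler` draw a fixed number of samples without replacement, so every epoch sees a fresh random slice of each dataset:

```python
# Minimal sketch: draw a fresh random ~10% subset of each dataset every epoch.
# Dataset and function names here are hypothetical, not vggt's config keys.
import torch
from torch.utils.data import DataLoader, RandomSampler, TensorDataset

# Stand-ins for the real fine-tuning datasets.
dataset_a = TensorDataset(torch.randn(1000, 3))
dataset_b = TensorDataset(torch.randn(500, 3))

def subset_loader(dataset, fraction=0.1, batch_size=8):
    # RandomSampler with replacement=False and num_samples set draws a new
    # random permutation each epoch, so each epoch trains on a different
    # fraction of the data -- the same effect as capping len_train.
    num_samples = max(1, int(len(dataset) * fraction))
    sampler = RandomSampler(dataset, replacement=False, num_samples=num_samples)
    return DataLoader(dataset, batch_size=batch_size, sampler=sampler)

loaders = [subset_loader(d) for d in (dataset_a, dataset_b)]

for epoch in range(2):
    for loader in loaders:
        for (batch,) in loader:
            pass  # forward/backward pass would go here
```

Setting len_train per dataset in the vggt config achieves the per-epoch cap directly; the sampler sketch above is just the generic PyTorch equivalent if you want to control it yourself.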