
Problem with the sampler for multi-GPU training

Open · EhabWilson opened this issue 1 year ago · 2 comments

It is common to use a DataLoader with a distributed sampler when training on multiple GPUs. Is there a reason examples/simple_trainer.py does not use one?

EhabWilson avatar Sep 09 '24 04:09 EhabWilson

I'm not quite familiar with DistributedSampler. What's the benefit of using that?

liruilong940607 avatar Sep 13 '24 18:09 liruilong940607

It allows each process to use only a subset of the original data, which should reduce the number of iterations needed to traverse the full dataset.

EhabWilson avatar Sep 23 '24 12:09 EhabWilson
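
For context, a minimal sketch of the standard PyTorch DistributedSampler pattern being suggested, assuming one process per GPU launched via torchrun. This is not gsplat's actual code; the dataset and hyperparameters below are placeholders standing in for the trainer's camera/image dataset.

```python
import torch
import torch.distributed as dist
from torch.utils.data import DataLoader, DistributedSampler, TensorDataset


def main():
    # torchrun sets the env vars (RANK, WORLD_SIZE, MASTER_ADDR, ...)
    # that init_process_group reads.
    dist.init_process_group(backend="nccl")
    rank = dist.get_rank()
    world_size = dist.get_world_size()

    # Placeholder dataset; in simple_trainer.py this would be the
    # per-scene camera/image dataset.
    dataset = TensorDataset(torch.arange(1000).float())

    # Each rank draws from a disjoint 1/world_size shard, so one pass
    # of the loader traverses the full dataset world_size times faster
    # than every rank iterating over all samples.
    sampler = DistributedSampler(
        dataset, num_replicas=world_size, rank=rank, shuffle=True
    )
    loader = DataLoader(dataset, batch_size=8, sampler=sampler)

    for epoch in range(2):
        # Reseed the sampler so each epoch uses a different shuffle
        # that is still consistent across ranks.
        sampler.set_epoch(epoch)
        for (batch,) in loader:
            pass  # training step would go here

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

Launched with, e.g., `torchrun --nproc_per_node=4 train.py`, each of the four processes iterates over roughly a quarter of the dataset per epoch, which is the iteration-count saving described above.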