Size mismatch error: assert len(modules) == len(inputs) in parallel.py line 168.
Do we need to scatter the input ourselves? It seems weird that len(modules) should equal len(inputs). Hoping for your help, thanks!
Hey @v-wewei, I was also stuck with the same error. Were you able to solve it? Thanks:)
Which file is it?
Looks like you mean parallel.py. That is needed for data parallel training. Otherwise, you may use PyTorch's built-in DistributedDataParallel, which may be simpler.
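A minimal sketch of that alternative, assuming a standard single-node setup launched with torchrun (MyModel and dataloader are placeholders, not from this repo):

    import torch
    import torch.distributed as dist
    from torch.nn.parallel import DistributedDataParallel as DDP

    dist.init_process_group(backend="nccl")  # torchrun sets the required env vars
    local_rank = dist.get_rank() % torch.cuda.device_count()
    torch.cuda.set_device(local_rank)

    # one full model replica per process/GPU
    model = DDP(MyModel().cuda(local_rank), device_ids=[local_rank])
    criterion = torch.nn.CrossEntropyLoss().cuda(local_rank)

    for images, targets in dataloader:
        images, targets = images.cuda(local_rank), targets.cuda(local_rank)
        loss = criterion(model(images), targets)
        loss.backward()

Each process holds a full replica and sees its own shard of the data, so there is no manual scatter/gather of inputs or losses to worry about.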
Hi, thank you for your excellent work. I am running into the same problem; are there any solutions for it? I extracted parallel.py and use DataParallelCriterion like this:
criterion = DataParallelCriterion(criterion)
for idx, (images, targets) in enumerate(dataloader):
    preds = model(images)
    loss = criterion(preds, targets)
However, I get the following error:
assert len(targets) == len(inputs)
AssertionError
It seems that the targets are scattered to the different GPUs but the preds are not... I made the output of the model a tuple as your code does, but it still doesn't work... I guess what they are asking about is the same as my problem...
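For reference, a minimal sketch of the usage that parallel.py appears to expect, assuming it is the version from PyTorch-Encoding: the model also has to be wrapped in DataParallelModel, so that preds stays a list of per-GPU outputs that lines up with the targets DataParallelCriterion scatters:

    from parallel import DataParallelModel, DataParallelCriterion

    model = DataParallelModel(model.cuda())        # outputs stay on their own GPUs
    criterion = DataParallelCriterion(criterion.cuda())

    for idx, (images, targets) in enumerate(dataloader):
        preds = model(images.cuda())               # list, one entry per GPU
        loss = criterion(preds, targets.cuda())    # targets are scattered to match
        loss.backward()

If the model is instead wrapped in plain nn.DataParallel (or not wrapped at all), its outputs are gathered back onto a single device, so len(preds) no longer matches the number of scattered targets and the assertion fires.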