PyTorch-Encoding

size mismatch error: assert len(modules) == len(inputs) in parallel #168

Open v-wewei opened this issue 6 years ago • 4 comments

Do we need to scatter the input ourselves? It seems odd that len(modules) has to equal len(inputs). Hoping for your help, thanks!

v-wewei avatar May 27 '19 13:05 v-wewei

Hey @v-wewei, I also ran into the same error. Were you able to solve it? Thanks :)

user432 avatar Jul 21 '20 04:07 user432

Which file is it?

zhanghang1989 avatar Jul 21 '20 05:07 zhanghang1989

Looks like you mean parallel.py. That file is needed for data parallel. Otherwise, you can use PyTorch DistributedDataParallel, which may be simpler.
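For reference, a minimal sketch of that route might look like the following (MyModel, dataloader, and the launch setup are placeholders, and the script is assumed to be started with torchrun so the process-group environment variables are already set):

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

# One process per GPU; torchrun sets the rank/world-size environment variables.
dist.init_process_group(backend='nccl')
local_rank = dist.get_rank() % torch.cuda.device_count()
torch.cuda.set_device(local_rank)

model = MyModel().cuda(local_rank)            # MyModel is a placeholder
model = DDP(model, device_ids=[local_rank])   # wrap the model for distributed training
criterion = torch.nn.CrossEntropyLoss().cuda(local_rank)

for images, targets in dataloader:            # dataloader is a placeholder
    images = images.cuda(local_rank, non_blocking=True)
    targets = targets.cuda(local_rank, non_blocking=True)
    loss = criterion(model(images), targets)
    loss.backward()

Each process only sees its own GPU, so no manual scattering is needed.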

zhanghang1989 avatar Jul 21 '20 05:07 zhanghang1989

Hi, thank you for your excellent work. I met the same problem. Are there any solutions for it? I extracted parallel.py and use DataParallelCriterion like this:

criterion = DataParallelCriterion(criterion)
for idx, (images, targets) in enumerate(dataloader):
    preds = model(images)
    loss = criterion(preds, targets)

However, I get the following error:

assert len(targets) == len(inputs)
AssertionError

It seems that the targets are scattered to different GPUs but the preds are not... I made the output of the model a tuple, as your code does, but it still doesn't work... I guess they are asking about the same thing as me...
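If the parallel.py here is the one from this repo, I think the model also has to be wrapped, not only the criterion: DataParallelModel keeps the outputs as a list with one entry per GPU, so DataParallelCriterion can pair them with the scattered targets. A rough sketch of that pairing (the rest of the training loop is assumed):

from encoding.parallel import DataParallelModel, DataParallelCriterion

model = DataParallelModel(model)             # outputs stay as a per-GPU list instead of being gathered
criterion = DataParallelCriterion(criterion)

for idx, (images, targets) in enumerate(dataloader):
    preds = model(images)                    # list of outputs, one per GPU
    loss = criterion(preds, targets)         # targets are scattered to match that list
    loss.backward()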

CharlesPikachu avatar Aug 23 '20 13:08 CharlesPikachu