
Error in seq2seq.py of Method collate

Open lypenghao opened this issue 6 years ago • 3 comments

/home/wangph/code/pytorch_egs/speech/speech/models/seq2seq.py:235: UserWarning: volatile was removed and now has no effect. Use with torch.no_grad(): instead.
  inputs.volatile = True
/home/wangph/code/pytorch_egs/speech/speech/models/seq2seq.py:236: UserWarning: volatile was removed and now has no effect. Use with torch.no_grad(): instead.
  labels.volatile = True

Traceback (most recent call last):
  File "train.py", line 146, in <module>
    run(config)
  File "train.py", line 109, in run
    dev_loss, dev_cer = eval_dev(model, dev_ldr, preproc)
  File "train.py", line 58, in eval_dev
    loss = model.loss(batch)
  File "/home/wangph/code/pytorch_egs/speech/speech/models/seq2seq.py", line 53, in loss
    x, y = self.collate(*batch)
TypeError: collate() missing 2 required positional arguments: 'inputs' and 'labels'

lypenghao avatar Feb 18 '19 09:02 lypenghao

Probably this is because batch is a zip object, which can be iterated over only once. Adding batch = list(batch) in the for loop of eval_dev() fixed it for me.
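A minimal standalone sketch of the behavior described above (not code from this repo): in Python 3, zip() returns a one-shot iterator, so a second pass over the same object yields nothing, while materializing it with list() makes the data reusable.

```python
# zip() returns a one-shot iterator in Python 3: once consumed, it is empty.
inputs = ["x1", "x2"]
labels = ["y1", "y2"]

batch = zip(inputs, labels)
first_pass = list(batch)    # consumes the iterator
second_pass = list(batch)   # iterator is exhausted, so this is empty

print(first_pass)   # [('x1', 'y1'), ('x2', 'y2')]
print(second_pass)  # []

# Materializing the zip once keeps the data reusable across passes:
batch = list(zip(inputs, labels))
assert list(batch) == list(batch)  # safe to iterate repeatedly
```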

dmitrii-obukhov avatar Apr 09 '19 03:04 dmitrii-obukhov

I'm currently facing the same issue. I added batch = list(batch) and manually replaced the collate call with collate(batch[0], batch[1]), but now I get an index 1 out of range error instead.
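Both errors in this thread are consistent with the zip object already being exhausted by the time collate is called. The sketch below uses a hypothetical collate() with the same signature as the one in seq2seq.py (not the repo's actual implementation) to reproduce the two failure modes: unpacking an exhausted iterator supplies zero arguments, and converting it to a list after exhaustion gives an empty list that cannot be indexed.

```python
# Hypothetical stand-in for the collate(inputs, labels) method in seq2seq.py.
def collate(inputs, labels):
    return inputs, labels

batch = zip(["x"], ["y"])
for _ in batch:          # an earlier loop over batch consumes the iterator
    pass

# 1) Unpacking the exhausted iterator passes zero arguments, matching the
#    original TypeError in this thread:
try:
    collate(*batch)
except TypeError as e:
    print(e)  # collate() missing 2 required positional arguments: 'inputs' and 'labels'

# 2) Calling list() only *after* exhaustion yields an empty list, so indexing
#    into it fails, matching the follow-up "index out of range" error:
batch = list(batch)
try:
    collate(batch[0], batch[1])
except IndexError as e:
    print(e)  # list index out of range
```

The fix is therefore to call list(batch) before the first iteration, not after.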

ghost avatar Jun 26 '19 06:06 ghost

+1

iamxiaoyubei avatar Jul 31 '19 03:07 iamxiaoyubei