AssertionError
Hi,
when using the train.py script on my corpora (including embeddings mapped with vecmap), the following error message appears (using the --cuda option):
Traceback (most recent call last):
  File "train.py", line 20, in <module>
    undreamt.train.main_train()
  File "/tmp/undreamt/undreamt/undreamt/train.py", line 189, in main_train
    bidirectional=not args.disable_bidirectional, layers=args.layers, dropout=args.dropout))
  File "/tmp/undreamt/undreamt/undreamt/devices.py", line 22, in gpu
    return x.cuda() if x is not None else None
  File "/tmp/anaconda3/envs/cupy/lib/python3.6/site-packages/torch/nn/modules/module.py", line 216, in cuda
    return self._apply(lambda t: t.cuda(device))
  File "/tmp/anaconda3/envs/cupy/lib/python3.6/site-packages/torch/nn/modules/module.py", line 146, in _apply
    module._apply(fn)
  File "/tmp/anaconda3/envs/cupy/lib/python3.6/site-packages/torch/nn/modules/rnn.py", line 123, in _apply
    self.flatten_parameters()
  File "/tmp/anaconda3/envs/cupy/lib/python3.6/site-packages/torch/nn/modules/rnn.py", line 111, in flatten_parameters
    params = rnn.get_parameters(fn, handle, fn.weight_buf)
  File "/tmp/anaconda3/envs/cupy/lib/python3.6/site-packages/torch/backends/cudnn/rnn.py", line 165, in get_parameters
    assert filter_dim_a.prod() == filter_dim_a[0]
AssertionError
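
For context, the failing call path boils down to moving an RNN module to the GPU: Module.cuda() calls _apply(), which calls flatten_parameters() on the RNN, and that is where the cuDNN assertion fires. The sketch below reproduces that path; the GRU and its hyperparameters are purely illustrative, not taken from my actual configuration.

# Minimal sketch of the call path above (illustrative GRU and sizes)
import torch
import torch.nn as nn

def gpu(x):
    # same behaviour as undreamt/devices.py: move to the GPU unless None
    return x.cuda() if x is not None else None

rnn = nn.GRU(input_size=300, hidden_size=600, num_layers=2,
             bidirectional=True, dropout=0.3)
rnn = gpu(rnn)  # Module.cuda() -> _apply() -> flatten_parameters() -> cuDNN assertion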
I'm using PyTorch version 0.3.1 via conda - could that be a problem? The README.md says that version 0.3 was tested.
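
A quick sanity check of the exact versions in play (purely illustrative):

import torch
print(torch.__version__)               # e.g. 0.3.1
print(torch.backends.cudnn.version())  # cuDNN version in use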
Many thanks in advance + cheers,
Stefan
It looks like the problem is on the PyTorch side. Have a look at pytorch/pytorch#5667.
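
One way to check whether the cuDNN RNN path is really the culprit, as the traceback suggests (a diagnostic sketch, not a confirmed fix from that issue), is to disable the cuDNN backend before the model is built; the RNN then falls back to the slower plain CUDA implementation and flatten_parameters() becomes a no-op:

# Diagnostic sketch: if training starts with cuDNN disabled,
# the assertion comes from the cuDNN RNN weight-buffer path.
import torch
torch.backends.cudnn.enabled = False  # set before the model is moved to the GPU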