
PyCudaHandler: Crash in LSTM backpass if sequence length is 1

Open · michaelwand opened this issue on Dec 30, 2015 · 0 comments

The LSTM backpass on CUDA does not work properly in the (rare, but possible) case that an input sequence has length 1. On the CPU, everything works fine.

The underlying reason is that an array of shape (0, whatever) gets allocated. This apparently works with NumPy, but with PyCUDA it yields an uninitialized array (gpudata == None), and subsequent operations on it ("fill" in the attached backtrace) fail. (To the best of my knowledge, this first occurs in lstm_layer.py, line 264, where flat_cell may have zero size, but I cannot guarantee that this is the only occurrence.)
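For illustration, here is a minimal sketch of the suspected failure mode. This is my own reconstruction, not code from brainstorm, and it assumes a working PyCUDA installation:

```python
import numpy as np
import pycuda.autoinit  # noqa: F401 -- initializes a CUDA context
import pycuda.gpuarray as gpuarray

# NumPy: allocating a zero-size array works, and fill() is a no-op.
host = np.empty((0, 7), dtype=np.float32)
host.fill(0.0)  # fine

# PyCUDA: no device memory is allocated for a zero-size array, so
# gpudata stays None and a subsequent fill() fails on it.
dev = gpuarray.empty((0, 7), dtype=np.float32)
print(dev.gpudata)         # None
dev.fill(np.float32(0.0))  # raises, as in the attached backtrace
```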

Further information:

```
$ git log --oneline -n 1
a68bf03 Release 0.5
$ uname -a
Linux nikola 3.16.0-4-amd64 #1 SMP Debian 3.16.7-ckt7-1 (2015-03-01) x86_64 GNU/Linux
```

... and a backtrace, as well as a script to reproduce the behavior (UseGPU and MakeItCrash must both be set to 1!): LSTMCrashWithGPUandLength1Seq.py.txt LSTMCrashBacktrace.txt
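A possible workaround, sketched under the assumption that PyCudaHandler has a fill method along the lines of the one named in the backtrace (I have not verified the exact signature), would be to make zero-size arrays a no-op:

```python
# Hypothetical guard inside PyCudaHandler; 'mem' is a PyCUDA GPUArray.
def fill(self, mem, val):
    if mem.size == 0:
        # A zero-size array has gpudata == None in PyCUDA; there is
        # nothing to fill, so skip the kernel launch entirely.
        return
    mem.fill(val)
```

The same guard would presumably be needed in any other handler operation that can receive a zero-size array.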

michaelwand · Dec 30 '15, 16:12