
pack_padded_sequence cuda fix

Open SolomidHero opened this issue 5 years ago • 0 comments

Hi! I ran into some trouble during inference, caused by the model's internal operations:

It seems that as of PyTorch 1.7.0, the lengths argument (src_length) passed to torch.nn.utils.rnn.pack_padded_sequence must be on the CPU, even when the input tensor is on CUDA: pytorch/pytorch#43227

I also tried the same call on PyTorch 1.6.0 to check backward compatibility, and there it works fine both with and without .cpu().
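A minimal sketch of the behavior described above (tensor names here are illustrative, not taken from the flowtron code): moving the lengths tensor to the CPU with .cpu() satisfies PyTorch >= 1.7.0 and is harmless on 1.6.0.

```python
import torch
from torch.nn.utils.rnn import pack_padded_sequence

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# (batch, seq, features) input, possibly on CUDA
x = torch.randn(3, 5, 8, device=device)
lengths = torch.tensor([5, 3, 2], device=device)

# On PyTorch >= 1.7.0 the lengths must live on the CPU even when x is
# on CUDA; calling .cpu() on them also works fine on 1.6.0.
packed = pack_padded_sequence(x, lengths.cpu(), batch_first=True)
```

Passing `lengths` still on the GPU raises a RuntimeError on 1.7.0, which is what the linked upstream issue tracks.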

SolomidHero — Nov 15 '20