
All values in memory_sequence_length must greater than zero

Open bota7070 opened this issue 7 years ago • 6 comments

What does it mean?

InvalidArgumentError (see above for traceback): assertion failed: [All values in memory_sequence_length must greater than zero.] [Condition x > 0 did not hold element-wise:] [x (IteratorGetNext:1) = ] [8 8 6...]
	 [[Node: dynamic_seq2seq/decoder/decoder/while/BasicDecoderStep/decoder/attention/assert_positive/assert_less/Assert/Assert = Assert[T=[DT_STRING, DT_STRING, DT_STRING, DT_INT32], summarize=3, _device="/job:localhost/replica:0/task:0/device:CPU:0"](dynamic_seq2seq/decoder/decoder/while/BasicDecoderStep/decoder/attention/assert_positive/assert_less/All, dynamic_seq2seq/decoder/decoder/while/BasicDecoderStep/decoder/attention/assert_positive/assert_less/Assert/Assert/data_0, dynamic_seq2seq/decoder/decoder/while/BasicDecoderStep/decoder/attention/assert_positive/assert_less/Assert/Assert/data_1, dynamic_seq2seq/decoder/decoder/while/BasicDecoderStep/decoder/attention/assert_positive/assert_less/Assert/Assert/data_2, dynamic_seq2seq/decoder/decoder/while/BasicDecoderStep/decoder/attention/assert_positive/assert_less/Less/Enter)]]

bota7070 avatar Dec 12 '17 09:12 bota7070

@bota7070 It means you have an input sequence with zero length, i.e., an empty line in your data.
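
A quick way to check for that (a sketch; the file names are placeholders for your own training data):

    # Report the positions of empty lines in the training files.
    for path in ("train.src", "train.tgt"):
        with open(path, encoding="utf-8") as f:
            for lineno, line in enumerate(f, start=1):
                if not line.strip():
                    print("%s: empty line %d" % (path, lineno))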

oahziur avatar Dec 12 '17 16:12 oahziur

What if the tensor has padding values that should be ignored? That is what's happening in my case. There is nothing wrong with a sequence made entirely of padding tokens; I think it should simply be ignored. For example, in my use case, this is the input:

[
    [
        [5, 4, 0, 0, 0],
        [5, 6, 2, 0, 0]
    ],
    [
        [5, 4, 0, 0, 0],
        [0, 0, 0, 0, 0]  # padding sequence
    ]
]
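
Deriving the sequence lengths from this batch in the usual way (a numpy sketch, assuming 0 is the padding id and lengths count the non-zero tokens) yields a 0 for the padding sequence, which is exactly the value the assert rejects:

    import numpy as np

    # Sketch: derive lengths by counting non-zero tokens (0 = padding).
    batch = np.array([
        [[5, 4, 0, 0, 0],
         [5, 6, 2, 0, 0]],
        [[5, 4, 0, 0, 0],
         [0, 0, 0, 0, 0]],  # padding sequence
    ])
    lengths = np.count_nonzero(batch, axis=-1)
    print(lengths)  # [[2 3]
                    #  [2 0]]  <- the 0 violates the "x > 0" assertion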

Is there a workaround to this problem?

aleSuglia avatar Jan 19 '18 19:01 aleSuglia

@aleSuglia To make attention work correctly, we don't want to allocate any weight to the paddings, and the sum of all weights must be 1. Therefore, a sequence made entirely of padding is invalid for the attention module.
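
Concretely (a numpy sketch of the masking arithmetic, not the actual TensorFlow code), a zero length masks every position, and the softmax that should sum to 1 degenerates to NaN:

    import numpy as np

    scores = np.array([1.2, 0.7, 0.3])        # raw attention scores
    length = 0                                # an all-padding sequence
    mask = np.arange(scores.size) < length    # [False False False]
    masked = np.where(mask, scores, -np.inf)  # every score masked out
    weights = np.exp(masked) / np.exp(masked).sum()  # 0 / 0
    print(weights)  # [nan nan nan] -- no valid way to distribute weight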

I suggest cleaning up the dataset so that it never contains empty lines. That should be the easiest fix for this problem.
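
A minimal cleanup sketch (file names are placeholders for your parallel corpus) that drops every line pair where either side is empty:

    # Drop line pairs where either side is empty (placeholder file names).
    with open("train.src") as fs, open("train.tgt") as ft, \
         open("train.src.clean", "w") as out_s, \
         open("train.tgt.clean", "w") as out_t:
        for src, tgt in zip(fs, ft):
            if src.strip() and tgt.strip():
                out_s.write(src)
                out_t.write(tgt)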

oahziur avatar Jan 23 '18 18:01 oahziur

As I've said before, the padding is something I intentionally added, because my model requires it to process the data in batches. My dataset doesn't have any empty lines or any other problem...
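
The only workaround I can think of (an untested sketch against the TF 1.x tf.contrib.seq2seq API; the shapes and units here are placeholders) is to clamp the lengths so the assert never fires, and then mask the attention outputs for all-padding rows downstream:

    import tensorflow as tf

    # Placeholders standing in for the real encoder outputs and lengths.
    encoder_outputs = tf.placeholder(tf.float32, [None, None, 128])
    memory_sequence_length = tf.placeholder(tf.int32, [None])

    # Clamp to at least 1 so the "x > 0" assert never fires. All-padding
    # rows then attend only to their first (padding) position, so their
    # attention outputs must be masked out downstream.
    clamped = tf.maximum(memory_sequence_length, 1)
    attention_mechanism = tf.contrib.seq2seq.LuongAttention(
        num_units=128,
        memory=encoder_outputs,
        memory_sequence_length=clamped)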

aleSuglia avatar Jan 23 '18 22:01 aleSuglia

I got this in the tutorial; it was triggered by blank lines in /tmp/nmt_attention_model/output_infer.

EddieOne avatar Feb 24 '18 03:02 EddieOne

I ran into this problem too, and I'm stuck.

wwx13 avatar Mar 07 '19 13:03 wwx13