nmt
All values in memory_sequence_length must greater than zero
What does it mean?
InvalidArgumentError (see above for traceback): assertion failed: [All values in memory_sequence_length must greater than zero.] [Condition x > 0 did not hold element-wise:] [x (IteratorGetNext:1) = ] [8 8 6...]
[[Node: dynamic_seq2seq/decoder/decoder/while/BasicDecoderStep/decoder/attention/assert_positive/assert_less/Assert/Assert = Assert[T=[DT_STRING, DT_STRING, DT_STRING, DT_INT32], summarize=3, _device="/job:localhost/replica:0/task:0/device:CPU:0"](dynamic_seq2seq/decoder/decoder/while/BasicDecoderStep/decoder/attention/assert_positive/assert_less/All, dynamic_seq2seq/decoder/decoder/while/BasicDecoderStep/decoder/attention/assert_positive/assert_less/Assert/Assert/data_0, dynamic_seq2seq/decoder/decoder/while/BasicDecoderStep/decoder/attention/assert_positive/assert_less/Assert/Assert/data_1, dynamic_seq2seq/decoder/decoder/while/BasicDecoderStep/decoder/attention/assert_positive/assert_less/Assert/Assert/data_2, dynamic_seq2seq/decoder/decoder/while/BasicDecoderStep/decoder/attention/assert_positive/assert_less/Less/Enter)]]
@bota7070 It means you have an input sequence with zero length, i.e., an empty line.
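If you are not sure whether your data contains any, a quick check like the following will report the empty lines (a minimal sketch; train.en and train.vi are hypothetical filenames, substitute your own source/target files):

import sys

# Report the line numbers of empty (or whitespace-only) lines in a text file.
def find_empty_lines(path):
    with open(path, encoding="utf-8") as f:
        return [i for i, line in enumerate(f, 1) if not line.strip()]

for path in ["train.en", "train.vi"]:  # hypothetical filenames
    empty = find_empty_lines(path)
    if empty:
        print(f"{path}: empty lines at {empty}", file=sys.stderr)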
What if the tensor has padding values that should be ignored? This is what is happening to me. There is nothing wrong with a sequence of padding tokens; I think they should just be ignored. For example, in my use case the input is:
[
[
[5, 4, 0, 0, 0],
[5, 6, 2, 0, 0]
],
[
[5, 4, 0, 0, 0],
[0, 0, 0, 0, 0] # padding sequence
]
]
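The lengths fed to the attention mechanism are typically derived by counting non-padding tokens per sequence, so the all-padding row gets length 0, which is exactly what trips the x > 0 assertion (a minimal NumPy sketch, assuming pad id 0):

import numpy as np

# Second batch element from the example above: one real sequence and one
# all-padding sequence.
batch = np.array([[5, 4, 0, 0, 0],
                  [0, 0, 0, 0, 0]])

# Length = number of non-pad tokens in each sequence.
memory_sequence_length = (batch != 0).sum(axis=1)
print(memory_sequence_length)  # [2 0] -- the 0 violates the x > 0 assertion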
Is there a workaround to this problem?
@aleSuglia To make attention work correctly, we don't want to allocate any weight to the padding positions, and the sum of all attention weights must be 1. Therefore, a sequence consisting entirely of padding is invalid for the attention module.
I suggest cleaning up the dataset so it never contains empty lines. That should be the easiest fix for this problem.
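For example, something like this drops line pairs where either side is empty while keeping a parallel corpus aligned (a minimal sketch; the filenames are hypothetical):

# Keep only line pairs where both source and target are non-empty.
with open("train.en", encoding="utf-8") as src, \
     open("train.vi", encoding="utf-8") as tgt:
    pairs = [(s, t) for s, t in zip(src, tgt) if s.strip() and t.strip()]

with open("train.clean.en", "w", encoding="utf-8") as src, \
     open("train.clean.vi", "w", encoding="utf-8") as tgt:
    for s, t in pairs:
        src.write(s)
        tgt.write(t)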
As I've said before, the padding is something I've intentionally added, because my model requires it to process the data in batches. My dataset doesn't have any empty lines or any other problem...
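One workaround sometimes used in this situation (not confirmed by the maintainers here) is to clamp the lengths so attention always sees at least one position, and then mask out the outputs for the padded rows downstream:

import tensorflow as tf

# Lengths computed from a padded batch; the 0 is the all-padding row
# that triggers the assertion.
memory_sequence_length = tf.constant([2, 0])

# Clamp to at least 1 so the attention assertion passes. The attention
# output for all-padding rows is meaningless and must be masked out later.
safe_lengths = tf.maximum(memory_sequence_length, 1)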
I got this error in the tutorial as well, caused by blank lines in /tmp/nmt_attention_model/output_infer.
I ran into this problem too and I'm stuck.