
position encoding

Open · yufengm opened this issue 7 years ago · 1 comment

Are we assuming that every sentence has the same length in the position encoding?

https://github.com/dandelin/Dynamic-memory-networks-plus-Pytorch/blob/ad49955f907c03aade2f6c8ed13370ce7288d5a7/babi_main.py#L18

As shown above, each sentence encoding is divided by the same number, elen - 1, regardless of the actual sentence length.

yufengm · Apr 24 '18
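
For context, this looks like the positional encoding scheme from the DMN+ paper (Xiong et al., 2016), where each sentence vector is a position-weighted sum of its word embeddings. Below is a minimal sketch of that scheme, not the repository's exact code; the (batch, n_sentences, slen, elen) input layout and the names slen/elen are assumptions taken from the linked file. The fixed divisors slen - 1 and elen - 1 make the question concrete: the weight matrix is built once for the padded token length, not per actual sentence length.

```python
import torch

def position_encoding(embedded_sentence):
    """Positional encoding in the style of DMN+ (a sketch, not the repo's code).

    embedded_sentence: (batch, n_sentences, slen, elen), with every
    sentence padded to the same token length slen (slen > 1 assumed).
    Returns one vector per sentence: (batch, n_sentences, elen).
    """
    _, _, slen, elen = embedded_sentence.size()
    # l[s][e] = (1 - s/(slen-1)) - (e/(elen-1)) * (1 - 2*s/(slen-1)).
    # Both divisors are constants of the padded batch, which is the
    # "same length for each sentence" assumption being asked about.
    l = [[(1 - s / (slen - 1)) - (e / (elen - 1)) * (1 - 2 * s / (slen - 1))
          for e in range(elen)]
         for s in range(slen)]
    l = torch.tensor(l, dtype=embedded_sentence.dtype,
                     device=embedded_sentence.device)   # (slen, elen)
    # Broadcast over (batch, n_sentences), then sum out the token axis.
    return (embedded_sentence * l).sum(dim=2)
```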

I suppose you could pad the variable-length sentences to a fixed length in a preprocessing step before feeding them into the model.

AveryLiu · May 14 '18
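
A minimal sketch of that preprocessing step, assuming sentences arrive as Python lists of token ids and that 0 is the padding id (both hypothetical choices, not fixed by this repo):

```python
def pad_sentences(sentences, pad_id=0):
    """Pad variable-length token-id lists to one shared length.

    pad_id=0 is a hypothetical padding id; use whatever id the
    vocabulary actually reserves for padding.
    """
    max_len = max(len(s) for s in sentences)
    return [s + [pad_id] * (max_len - len(s)) for s in sentences]

# Example: pad_sentences([[4, 8], [15, 16, 23]])
# -> [[4, 8, 0], [15, 16, 23]]
```

Note that even with padding (and zero vectors for the pad embedding), the position weights for the real tokens are still computed as if every sentence were max_len tokens long, which is exactly the approximation the question points at.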