BiDAF
Character Level Embedding
https://github.com/jojonki/BiDAF/blob/3e5ac9c76d02de2d8f75b1eda6632f8a9432eba6/layers/char_embedding.py#L28
This code looks strange to me. Why do you sum over the word_len dimension? Why don't you instead apply a 1D convolution filter along the word_len dimension?
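For context, what I had in mind is the char-CNN described in the original BiDAF paper: convolve along the character (word_len) axis, then max-pool over time to get a fixed-size word vector. A minimal numpy sketch of that idea (the function name and shapes are my own illustration, not taken from this repo):

```python
import numpy as np

def char_cnn_embedding(char_embs, filters):
    """1D convolution over the word_len (character) axis,
    followed by max-over-time pooling.

    char_embs: (word_len, char_dim)             one embedding per character
    filters:   (kernel_width, char_dim, n_filt) convolution kernels
    returns:   (n_filt,)                        fixed-size word vector
    """
    word_len, _ = char_embs.shape
    k, _, _ = filters.shape
    # Slide the kernel along the character axis; each window of k
    # characters is contracted against all filters at once.
    conv = np.stack([
        np.tensordot(char_embs[i:i + k], filters, axes=([0, 1], [0, 1]))
        for i in range(word_len - k + 1)
    ])  # shape: (word_len - k + 1, n_filt)
    # Max-over-time pooling collapses the character axis.
    return conv.max(axis=0)

rng = np.random.default_rng(0)
embs = rng.standard_normal((10, 8))     # a 10-character word, char_dim=8
fils = rng.standard_normal((5, 8, 16))  # kernel width 5, 16 filters
print(char_cnn_embedding(embs, fils).shape)  # (16,)
```

The point of convolving (rather than summing) over word_len is that each filter can detect local character n-gram patterns, and the max-pool keeps the strongest activation per filter regardless of word length.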
Thank you.