Abstractive-Summarization-With-Transfer-Learning
Position embedding not added to BERT encoder input
```python
# Creates segment embeddings for each type of tokens.
segment_embedder = tx.modules.WordEmbedder(
    vocab_size=bert_config.type_vocab_size,
    hparams=bert_config.segment_embed)
segment_embeds = segment_embedder(src_segment_ids)

input_embeds = word_embeds + segment_embeds
```
As per the BERT paper, the input embedding is the sum of the token (WordPiece) embedding, the segment embedding, and the position embedding. As we can see in `input_embeds = word_embeds + segment_embeds`, the position embedding is missing.
The position embedding is already part of Texar's internal code, so it does not need to be added to the encoder input explicitly.
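For reference, if one did want to add the position embedding explicitly at the input stage instead of relying on the encoder's internal handling, a minimal sketch might look like the following. This is not the repository's code; the `bert_config.position_size` and `bert_config.position_embed` attribute names are assumptions, chosen by analogy with `segment_embed` above.

```python
import tensorflow as tf
import texar as tx

# Hypothetical: a learned position embedder, mirroring how the segment
# embedder is built from bert_config in the snippet above.
position_embedder = tx.modules.PositionEmbedder(
    position_size=bert_config.position_size,
    hparams=bert_config.position_embed)

# One sequence-length entry per example, each equal to the max time steps.
batch_size = tf.shape(src_input_ids)[0]
max_time = tf.shape(src_input_ids)[1]
seq_len = tf.ones([batch_size], tf.int32) * max_time
position_embeds = position_embedder(sequence_length=seq_len)

# Sum of token, segment, and position embeddings, as in the BERT paper.
input_embeds = word_embeds + segment_embeds + position_embeds
```

With Texar's encoder already adding position embeddings internally, doing this at the input as well would add them twice, so the explicit sum is only needed if that internal step were disabled.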