
Using `.add_tokens`

Open MichaelHirn opened this issue 5 years ago • 3 comments

If I understand correctly, you are using the description of an entity (or relationship) and tokenizing that description. The entities and relationships do not get their own tokens, right?

Did you try to learn an embedding for an entity/relationship, or does that not really make any sense here?
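
To make sure I'm reading this right, I mean something like the following sketch (made-up descriptions, not your exact code):

```python
from transformers import BertTokenizer

# Hypothetical sketch: a triple is scored by packing the textual
# descriptions of head, relation, and tail into one BERT sequence,
# so only ordinary wordpiece tokens are involved.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

head_desc = "Steve Jobs was the co-founder of Apple."   # made-up example
relation_desc = "founder of"
tail_desc = "Apple is an American technology company."

tokens = (["[CLS]"] + tokenizer.tokenize(head_desc) + ["[SEP]"]
          + tokenizer.tokenize(relation_desc) + ["[SEP]"]
          + tokenizer.tokenize(tail_desc) + ["[SEP]"])
input_ids = tokenizer.convert_tokens_to_ids(tokens)
```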

MichaelHirn avatar Feb 12 '20 18:02 MichaelHirn

@MichaelHirn

Yes, I am only using the tokens from the description of an entity or a relation.

The embedding for an entity/relationship can be taken as the average of its description token embeddings (BERT's hidden states), but for the knowledge graph completion tasks, such explicit embeddings are not necessary.
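
If you do want such an embedding, something like this sketch works (assuming a recent transformers API; this is not the exact code in this repo):

```python
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

description = "Steve Jobs was the co-founder of Apple."  # made-up example

inputs = tokenizer(description, return_tensors="pt", truncation=True)
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state        # (1, seq_len, 768)

# Average over real tokens only, using the attention mask.
mask = inputs["attention_mask"].unsqueeze(-1).float() # (1, seq_len, 1)
entity_embedding = (hidden * mask).sum(dim=1) / mask.sum(dim=1)  # (1, 768)
```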

yao8839836 avatar Feb 15 '20 19:02 yao8839836

Ahh I see, thanks for your comment.

I'm wondering if you have tried creating a new token (optionally initialising it with a related embedding, or even with the average of the description's token embeddings) and learning an embedding for that new token. I'm trying to adapt your approach to our use case, where using the description of an entity does not work, and I'm curious whether you have tried that and, if so, how it worked out.
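
Concretely, I mean something like this (hypothetical entity token and description; the initialisation step is the part I'm unsure about):

```python
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

# Register a dedicated token for the entity (hypothetical name).
entity_token = "[ENT_Q19837]"
tokenizer.add_tokens([entity_token])
model.resize_token_embeddings(len(tokenizer))

# Optionally initialise the new row of the input embedding matrix with
# the average of the entity description's existing token embeddings.
desc_ids = tokenizer("co-founder of Apple", add_special_tokens=False)["input_ids"]
emb = model.get_input_embeddings().weight
with torch.no_grad():
    emb[tokenizer.convert_tokens_to_ids(entity_token)] = emb[desc_ids].mean(dim=0)
# The new embedding would then be learned during fine-tuning like any other.
```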

MichaelHirn avatar Feb 16 '20 23:02 MichaelHirn

@MichaelHirn

I didn't try this.

yao8839836 avatar Feb 17 '20 06:02 yao8839836