
out-of-vocabulary imputation?

Open KnutJaegersberg opened this issue 2 years ago • 2 comments

Have you considered working on meta-embeddings and embedding imputation? I think fse could then practically compete with some deep learning architectures, especially when knowledge graph embeddings are taken into account.

KnutJaegersberg · Mar 29 '22 08:03

Hi @KnutJaegersberg! Can you share some papers with me so I can assess whether this is feasible? Many thanks!

oborchers · Apr 09 '22 15:04

I read a few papers two weeks ago, but I can't recall them. Getting back into the topic now, I found some interesting ones:

https://github.com/ikergarcia1996/MetaVec, which claims state-of-the-art results for meta-embeddings.

Also highly inspiring is the idea from radix.ai, which reprojects fastText embeddings into ConceptNet Numberbatch knowledge-graph space (solving the OOV problem). My intuition is that with graph embeddings we can tackle bias in NLP better. https://radix.ai/blog/2021/3/a-guide-to-building-document-embeddings-part-1/ But they did not publish the resulting embeddings.
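For illustration, here is a minimal sketch of that kind of reprojection, not part of fse and not the radix.ai implementation: fit a least-squares linear map from fastText space to Numberbatch space on the shared vocabulary, then use fastText's character n-gram vectors to impute Numberbatch-style vectors for OOV words. The model file names and the vocabulary cap are illustrative assumptions.

```python
# Sketch: project fastText vectors into Numberbatch space to impute OOV words.
# File names below are placeholders; adjust to the models you actually have.
import numpy as np
from gensim.models import KeyedVectors
from gensim.models.fasttext import load_facebook_vectors

ft = load_facebook_vectors("cc.en.300.bin")                    # fastText with subword info
nb = KeyedVectors.load_word2vec_format("numberbatch-en.txt")   # Numberbatch vectors

# Training pairs: words present in both vocabularies (capped to keep this light).
shared = [w for w in nb.index_to_key if w in ft.key_to_index][:50000]
X = np.vstack([ft[w] for w in shared])   # fastText vectors, shape (n, 300)
Y = np.vstack([nb[w] for w in shared])   # Numberbatch vectors, shape (n, 300)

# Least-squares projection W such that X @ W approximates Y.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

def impute(word):
    """Return a Numberbatch-style vector; project via fastText if the word is OOV."""
    if word in nb.key_to_index:
        return nb[word]
    # fastText builds a vector for unseen words from character n-grams,
    # which we then map into the knowledge-graph embedding space.
    return ft[word] @ W
```

The projected vectors could then be fed into fse as an ordinary KeyedVectors-style lookup, so averaging or SIF pooling works on an effectively open vocabulary.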

KnutJaegersberg · Apr 12 '22 12:04