Cambridge Language Technology Lab
mirror-bert
[EMNLP'21] Mirror-BERT: Converting Pretrained Language Models to Universal Text Encoders Without Labels.
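The core Mirror-BERT recipe is self-supervised contrastive tuning: the same sentences are encoded twice (dropout and input perturbation make the two passes differ) and the two views of each sentence are pulled together with an InfoNCE loss. Below is a minimal sketch of the dropout-only variant; the model name, mean pooling, and temperature are illustrative assumptions, not the repo's exact configuration.

```python
# Sketch: Mirror-BERT-style contrastive tuning with dropout-based positives.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.train()  # keep dropout active so the two passes give different views

def encode(sentences):
    batch = tok(sentences, padding=True, truncation=True, return_tensors="pt")
    out = model(**batch).last_hidden_state            # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1)      # mean-pool over tokens
    return (out * mask).sum(1) / mask.sum(1)

sentences = ["aspirin relieves headaches", "the cat sat on the mat"]
z1, z2 = encode(sentences), encode(sentences)         # two dropout views

z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
logits = z1 @ z2.T / 0.04                             # assumed temperature
labels = torch.arange(len(sentences))                 # i-th view matches i-th
loss = F.cross_entropy(logits, labels)                # InfoNCE
loss.backward()
```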
sapbert
[NAACL'21 & ACL'21] SapBERT: Self-Alignment Pretraining for BERT, and XL-BEL: Cross-Lingual Biomedical Entity Linking.
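Self-alignment pretraining treats surface forms that share a concept ID (e.g., a UMLS CUI) as positives and pulls their embeddings together, with all other names in the batch as negatives. The sketch below uses a plain supervised InfoNCE loss and toy data as stand-ins; the repo itself trains on UMLS at scale with a multi-similarity loss.

```python
# Sketch: self-alignment of biomedical name embeddings by shared concept ID.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

names = ["heart attack", "myocardial infarction", "headache", "cephalalgia"]
cuis = torch.tensor([0, 0, 1, 1])                     # concept IDs as labels

batch = tok(names, padding=True, return_tensors="pt")
emb = F.normalize(model(**batch).last_hidden_state[:, 0], dim=-1)  # [CLS]

sim = emb @ emb.T / 0.07                              # cosine / temperature
self_mask = torch.eye(len(names), dtype=torch.bool)
pos = (cuis[:, None] == cuis[None, :]) & ~self_mask   # same-CUI pairs

# supervised InfoNCE: -log p(positive | anchor), averaged over positive pairs
log_prob = sim.masked_fill(self_mask, float("-inf")).log_softmax(dim=-1)
loss = -log_prob[pos].mean()
loss.backward()
```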
cometa
Corpus of Online Medical EnTities: the COMETA corpus.
ContrastiveBLI
[ACL'22] Improving Word Translation via Two-Stage Contrastive Learning. Keywords: Bilingual Lexicon Induction, Word Translation, Cross-Lingual Word Embeddings.
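A standard first stage in bilingual lexicon induction pipelines like this one is a linear map between the two embedding spaces, classically solved as an orthogonal Procrustes problem on a seed dictionary, with translation by nearest neighbour. The sketch below shows only that baseline stage under assumed random embeddings (the paper's contribution adds contrastive fine-tuning on top).

```python
# Sketch: orthogonal Procrustes mapping for BLI, then nearest-neighbour lookup.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 300))   # source-language word embeddings (placeholder)
Y = rng.normal(size=(1000, 300))   # target-language word embeddings (placeholder)
seed = rng.choice(1000, size=200, replace=False)   # seed translation pairs

# W = argmin ||X_s W - Y_s||_F  subject to  W^T W = I  (closed form via SVD)
U, _, Vt = np.linalg.svd(X[seed].T @ Y[seed])
W = U @ Vt

# Translate: map source vectors and retrieve the closest target word by cosine.
mapped = X @ W
mapped /= np.linalg.norm(mapped, axis=1, keepdims=True)
Yn = Y / np.linalg.norm(Y, axis=1, keepdims=True)
pred = (mapped @ Yn.T).argmax(axis=1)              # predicted translations
```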
eva
[AAAI'21] Code release for "Visual Pivoting for (Unsupervised) Entity Alignment".
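Visual pivoting uses the images attached to entities as an extra signal for aligning two knowledge graphs: each entity gets a structural and a visual embedding, the modalities are fused, and cross-graph pairs are matched by similarity. The sketch below is purely conceptual; the dimensions, fusion weights, and random features are assumptions, not the released model.

```python
# Sketch: fusing structural and visual entity embeddings for cross-KG matching.
import torch
import torch.nn.functional as F

n1, n2, d = 500, 600, 128
struct1, img1 = torch.randn(n1, d), torch.randn(n1, d)   # KG1 entities
struct2, img2 = torch.randn(n2, d), torch.randn(n2, d)   # KG2 entities

def fuse(struct, img, w_struct=0.6, w_img=0.4):
    # weighted concatenation of L2-normalised modality embeddings (assumed)
    return torch.cat([w_struct * F.normalize(struct, dim=-1),
                      w_img * F.normalize(img, dim=-1)], dim=-1)

e1, e2 = fuse(struct1, img1), fuse(struct2, img2)
sim = F.normalize(e1, dim=-1) @ F.normalize(e2, dim=-1).T  # (n1, n2)
aligned = sim.argmax(dim=1)   # for each KG1 entity, its best KG2 candidate
```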
mop
[EMNLP'21] Code for the paper "Mixture-of-Partitions: Infusing Large Biomedical Knowledge Graphs into BERT".
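The mixture-of-partitions idea is to split a large knowledge graph into partitions, train one lightweight adapter per partition, and let a learned gate mix the adapters' outputs back into the transformer's hidden states. The module below is a generic PyTorch sketch of that pattern; the adapter shape and gating are assumptions, not the repo's exact implementation.

```python
# Sketch: one adapter per KG partition, combined by a learned softmax gate.
import torch
import torch.nn as nn

class Adapter(nn.Module):
    def __init__(self, hidden=768, bottleneck=64):
        super().__init__()
        self.down = nn.Linear(hidden, bottleneck)   # down-project
        self.up = nn.Linear(bottleneck, hidden)     # up-project

    def forward(self, h):
        return self.up(torch.relu(self.down(h)))

class MixtureOfPartitions(nn.Module):
    def __init__(self, n_partitions=4, hidden=768):
        super().__init__()
        self.adapters = nn.ModuleList(Adapter(hidden) for _ in range(n_partitions))
        self.gate = nn.Linear(hidden, n_partitions)

    def forward(self, h):                                   # h: (B, T, H)
        weights = self.gate(h).softmax(dim=-1)              # (B, T, K)
        outs = torch.stack([a(h) for a in self.adapters], -1)  # (B, T, H, K)
        mixed = (outs * weights.unsqueeze(2)).sum(-1)       # gated mixture
        return h + mixed                                    # residual add

h = torch.randn(2, 16, 768)
print(MixtureOfPartitions()(h).shape)   # torch.Size([2, 16, 768])
```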