pretrained-language-model topic
BERT4ETH
BERT4ETH: A Pre-trained Transformer for Ethereum Fraud Detection (WWW 2023)
COCO-DR
[EMNLP 2022] Code repo for the EMNLP 2022 paper "COCO-DR: Combating Distribution Shifts in Zero-Shot Dense Retrieval with Contrastive and Distributionally Robust Learning".
CGExpan
Source code for the paper "Empower Entity Set Expansion via Language Model Probing", published at ACL 2020.
AiSpace
AiSpace: better practices for deep learning model development and deployment with TensorFlow 2.0
negation-learning
Code for the paper "Understanding by Understanding Not: Modeling Negation in Language Models"
cocosum
:coconut: Code and data for "Comparative Opinion Summarization via Collaborative Decoding" (Iso et al., Findings of ACL 2022)
awesome-instruction-learning
Papers and datasets on instruction tuning and instruction following. ✨✨✨
Prompt-Transferability
On Transferability of Prompt Tuning for Natural Language Processing
AMOS
[ICLR 2022] Pretraining Text Encoders with Adversarial Mixture of Training Signal Generators
unify-learning-paradigms
Data collator for UL2 and U-PaLM