roberta topic
CLUEPretrainedModels
A collection of high-quality Chinese pretrained models: state-of-the-art large models, the fastest small models, and models specialized for similarity tasks
CLUECorpus2020
Large-scale pre-training corpus for Chinese: 100 GB of Chinese pre-training text
japanese-pretrained-models
Code for producing Japanese pretrained models provided by rinna Co., Ltd.
awesome-pretrained-chinese-nlp-models
Awesome Pretrained Chinese NLP Models: a curated collection of high-quality Chinese pretrained models, large models, multimodal models, and large language models
happy-transformer
Happy Transformer makes it easy to fine-tune and perform inference with NLP Transformer models.
BOND
BOND: BERT-Assisted Open-Domain Named Entity Recognition with Distant Supervision
xlnet_zh
Chinese pretrained XLNet model: Pre-Trained Chinese XLNet_Large
COSINE
[NAACL 2021] Code for the paper "Fine-Tuning Pre-trained Language Model with Weak Supervision: A Contrastive-Regularized Self-Training Approach"
Getting-Started-with-Google-BERT
Build and train state-of-the-art natural language processing models using BERT
RECCON
This repository contains the dataset and the PyTorch implementations of the models from the paper "Recognizing Emotion Cause in Conversations"