roberta topic
bertviz
BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
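A minimal sketch of the usual BertViz workflow inside a Jupyter notebook, assuming a Hugging Face `transformers` model loaded with `output_attentions=True`; the model name and input sentence are placeholders.

```python
# Sketch: visualize attention heads with BertViz in a Jupyter notebook
# (assumes the `transformers` and `bertviz` packages are installed).
from transformers import AutoTokenizer, AutoModel
from bertviz import head_view

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_attentions=True)

inputs = tokenizer.encode("The cat sat on the mat", return_tensors="pt")
outputs = model(inputs)
attention = outputs[-1]  # tuple of per-layer attention tensors
tokens = tokenizer.convert_ids_to_tokens(inputs[0])

head_view(attention, tokens)  # renders the interactive head view inline
```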
albert_zh
A Lite BERT for Self-Supervised Learning of Language Representations; large-scale Chinese pre-trained ALBERT models
UER-py
Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
roberta_zh
Chinese pre-trained RoBERTa models: RoBERTa for Chinese
TransformerSum
Models to perform neural summarization (extractive and abstractive) using machine learning transformers and a tool to convert abstractive summarization datasets to the extractive task.
TurboTransformers
A fast and user-friendly runtime for transformer inference (BERT, ALBERT, GPT2, decoders, etc.) on CPU and GPU.
LoRA
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
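A minimal sketch of how loralib is typically used, following the pattern in its README: swap `nn.Linear` for `lora.Linear`, then freeze everything except the LoRA parameters. The layer sizes and rank below are illustrative.

```python
# Sketch: adapting a model with loralib (illustrative layer sizes and rank).
import torch
import torch.nn as nn
import loralib as lora

class Classifier(nn.Module):
    def __init__(self):
        super().__init__()
        # lora.Linear is a drop-in replacement for nn.Linear with rank-r update matrices
        self.proj = lora.Linear(768, 768, r=16)
        self.head = nn.Linear(768, 2)

    def forward(self, x):
        return self.head(self.proj(x))

model = Classifier()
lora.mark_only_lora_as_trainable(model)  # freeze all weights except the LoRA A/B matrices

# ... training loop goes here ...

# save only the (small) set of LoRA parameters
torch.save(lora.lora_state_dict(model), "lora_checkpoint.pt")
```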
PhoBERT
PhoBERT: Pre-trained language models for Vietnamese (EMNLP-2020 Findings)
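A minimal sketch of feature extraction with PhoBERT through Hugging Face `transformers`, after the pattern in the PhoBERT README; the input is expected to be word-segmented Vietnamese, and the example sentence is illustrative.

```python
# Sketch: extract contextual features with PhoBERT via Hugging Face transformers.
import torch
from transformers import AutoModel, AutoTokenizer

phobert = AutoModel.from_pretrained("vinai/phobert-base")
tokenizer = AutoTokenizer.from_pretrained("vinai/phobert-base")

# PhoBERT expects word-segmented Vietnamese input (e.g. produced by a word segmenter).
sentence = "Chúng_tôi là những nghiên_cứu_viên ."

input_ids = torch.tensor([tokenizer.encode(sentence)])
with torch.no_grad():
    features = phobert(input_ids)  # last hidden states in features[0]
```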
DeBERTa
The implementation of DeBERTa (Decoding-enhanced BERT with Disentangled Attention)
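DeBERTa checkpoints can also be loaded through Hugging Face `transformers`; a minimal sketch, assuming the `microsoft/deberta-base` checkpoint and a placeholder sentence.

```python
# Sketch: load a DeBERTa checkpoint through Hugging Face transformers.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-base")
model = AutoModel.from_pretrained("microsoft/deberta-base")

inputs = tokenizer("Disentangled attention in action.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```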
news-please
An integrated web crawler and information extractor for news that just works
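A minimal sketch of the news-please library API as documented in its README; the URL is a placeholder, and extraction fields may be empty for pages it cannot parse.

```python
# Sketch: crawl and extract a single news article with news-please (placeholder URL).
from newsplease import NewsPlease

article = NewsPlease.from_url("https://example.com/some-news-article")
print(article.title)
print(article.maintext[:200])  # first 200 characters of the extracted body text
```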