t5 topic
t2t-tuner
Convenient Text-to-Text Training for Transformers
RATransformers
RATransformers 🐭- Make your transformer (like BERT, RoBERTa, GPT-2 and T5) Relation Aware!
T5-Text-to-Text-Transfer-Transformer
Demo of the T5 model on various pre-trained tasks.
JointGT
Code for our paper "JointGT: Graph-Text Joint Representation Learning for Text Generation from Knowledge Graphs" (ACL 2021 Findings)
turkish-question-generation
Automated question generation and question answering from Turkish texts using text-to-text transformers
few-shot-lm
The source code of "Language Models are Few-shot Multilingual Learners" (MRL @ EMNLP 2021)
bert_seq2seq_DDP
A DDP version of bert_seq2seq. Supports models such as BERT, RoBERTa, NeZha, T5, and GPT-2, and tasks such as seq2seq, NER, and relation extraction. Launch DDP multi-GPU training easily, with no extra code required.
Leaf-Question-Generation
An easy-to-use and easy-to-understand multiple-choice question generation algorithm using the T5 Transformer.
ttt
A package for fine-tuning Transformers on TPUs, written in TensorFlow 2.0+.