t5-model topic
bert_seq2seq
A PyTorch implementation of BERT for seq2seq tasks using the UniLM scheme; it can also handle automatic summarization, text classification, sentiment analysis, NER, and part-of-speech tagging, supports the T5 model, and supports GPT-2 for article continuation.
simpleT5
simpleT5 is built on top of PyTorch Lightning ⚡️ and Transformers 🤗 and lets you quickly train T5 models.
Finetune-Transformers
Abstractive text summarization by fine-tuning seq2seq models.
nlp-models-examples
Examples of inference with, and fine-tuning of, T5, GPT-2, and ruGPT-3 models.
it5
Materials for "IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation" 🇮🇹
Transformer-QG-on-SQuAD
Implement Question Generator with SOTA pre-trained Language Models (RoBERTa, BERT, GPT, BART, T5, etc.)
OpenLLM
Run any open-source LLM, such as Llama 2 or Mistral, as an OpenAI-compatible API endpoint in the cloud.
PromptCLUE
PromptCLUE, a model supporting zero-shot learning across Chinese-language tasks.
t5-encoder
An extension of the Transformers library that adds a T5ForSequenceClassification class.
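Most of the repositories above build on T5's text-to-text interface, in which every task is phrased as generating an output string from a prompted input string. A minimal sketch of that interface using the standard Hugging Face Transformers API (the `t5-small` checkpoint here is just an example choice; any T5-family checkpoint works the same way):

```python
# Minimal T5 text-to-text inference with Hugging Face Transformers.
# t5-small is an assumed example checkpoint, not one prescribed by the repos above.
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# The task is encoded entirely in the input text via a prefix.
inputs = tokenizer("translate English to German: Hello, world!",
                   return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Fine-tuning wrappers such as simpleT5, and extensions such as t5-encoder, layer task-specific conveniences over this same text-to-text core.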