DistilBERT topic
bert-in-production
A collection of resources on using BERT (https://arxiv.org/abs/1810.04805) and related language models in production environments.
SentimentAnalysis
Sentiment analysis neural network trained by fine-tuning BERT, ALBERT, or DistilBERT on the Stanford Sentiment Treebank.
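A minimal fine-tuning sketch, assuming the Hugging Face Transformers `Trainer` API and the GLUE SST-2 split from `datasets`; the repo's own training script and hyperparameters may differ.

```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Binary sentiment labels; swap the checkpoint for "bert-base-uncased" or "albert-base-v2".
checkpoint = "distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

# Sentence-level Stanford Sentiment Treebank as distributed via GLUE (SST-2).
dataset = load_dataset("glue", "sst2")
encoded = dataset.map(lambda ex: tokenizer(ex["sentence"], truncation=True), batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="sst2-distilbert", num_train_epochs=1,
                           per_device_train_batch_size=16),
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
    tokenizer=tokenizer,  # enables dynamic padding via the default data collator
)
trainer.train()
```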
TransformerSum
Models for neural summarization (extractive and abstractive) using transformer models, plus a tool to convert abstractive summarization datasets to the extractive task.
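For a quick feel of the abstractive side, a hedged sketch using the `transformers` summarization pipeline; the model id is an assumed public checkpoint, not one produced by TransformerSum, which trains and loads its own extractive and abstractive models.

```python
from transformers import pipeline

# Off-the-shelf abstractive summarizer; the checkpoint is an assumption.
summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

article = ("BERT and its descendants have made transfer learning the default approach "
           "for most NLP tasks, and summarization is no exception: extractive models "
           "score and select source sentences, while abstractive models generate new text.")
print(summarizer(article, max_length=40, min_length=10, do_sample=False)[0]["summary_text"])
```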
transformers-ner
PyTorch named-entity recognition with transformers.
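A minimal inference sketch with the `transformers` token-classification pipeline; the checkpoint name is an assumption, not necessarily one trained by this repo.

```python
from transformers import pipeline

# "dslim/bert-base-NER" is an assumed public CoNLL-2003-style checkpoint.
ner = pipeline("token-classification",
               model="dslim/bert-base-NER",
               aggregation_strategy="simple")  # merge word pieces into whole entities

for entity in ner("Hugging Face is based in New York City."):
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 3))
```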
transformers-tutorials
GitHub repo with tutorials on fine-tuning transformers for different NLP tasks
turkish-bert
Turkish BERT/DistilBERT, ELECTRA and ConvBERT models
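A loading sketch, assuming the checkpoints are published on the Hugging Face Hub under the dbmdz namespace (e.g. `dbmdz/bert-base-turkish-cased`); substitute the id for the DistilBERT, ELECTRA, or ConvBERT variants.

```python
from transformers import pipeline

# Assumed Hub id; the ELECTRA discriminator is not a masked LM, so this
# fill-mask demo targets the plain BERT checkpoint.
fill_mask = pipeline("fill-mask", model="dbmdz/bert-base-turkish-cased")

for prediction in fill_mask("İstanbul, Türkiye'nin en kalabalık [MASK].")[:3]:
    print(prediction["token_str"], round(prediction["score"], 3))
```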
Getting-Started-with-Google-BERT
Build and train state-of-the-art natural language processing models using BERT
dialog-nlu
TensorFlow and Keras implementation of state-of-the-art research in dialog system NLU
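Dialog NLU typically pairs utterance-level intent classification with token-level slot filling; below is a hedged architectural sketch in PyTorch (the repo itself is TensorFlow/Keras), with hypothetical label counts.

```python
import torch.nn as nn
from transformers import AutoModel

class JointNLU(nn.Module):
    """Sketch of a joint intent + slot model; num_intents/num_slots are placeholders."""
    def __init__(self, encoder_name="bert-base-uncased", num_intents=7, num_slots=20):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        self.intent_head = nn.Linear(hidden, num_intents)  # one prediction per utterance
        self.slot_head = nn.Linear(hidden, num_slots)      # one prediction per token

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        intent_logits = self.intent_head(out.last_hidden_state[:, 0])  # [CLS] position
        slot_logits = self.slot_head(out.last_hidden_state)            # every token
        return intent_logits, slot_logits
```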
bert-distillation
Distillation of the BERT model with the Catalyst framework
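The core idea, sketched here as a plain PyTorch loss rather than the repo's Catalyst pipeline: blend a temperature-softened KL term against the teacher's logits with the usual hard-label cross-entropy. The temperature and weighting below are placeholders.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Soft-target KL (temperature T) blended with hard-label cross-entropy.
    alpha and T are hypothetical defaults, not the repo's settings."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradients stay comparable across temperatures
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```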
ernie
Simple state-of-the-art BERT-based sentence classification with Keras/TensorFlow 2, built with Hugging Face's Transformers.
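A minimal Keras/TF2 sketch of the same idea using `transformers` directly (ernie wraps these steps behind its own API); assumes a recent transformers release where the TF models pick a sensible default loss when compiled without one.

```python
import numpy as np
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = TFAutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

# Toy two-class data; ernie's own helpers replace this bookkeeping.
texts = ["a gripping, beautifully shot film", "flat characters and a predictable plot"]
labels = np.array([1, 0])
encodings = dict(tokenizer(texts, padding=True, truncation=True, return_tensors="np"))

model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=3e-5))  # default task loss
model.fit(encodings, labels, epochs=1, batch_size=2)
```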