roberta-model topic

List roberta-model repositories

COSINE

199 stars · 25 forks

[NAACL 2021] This is the code for our paper "Fine-Tuning Pre-trained Language Model with Weak Supervision: A Contrastive-Regularized Self-Training Approach".

model-zoo

44 stars · 1 fork

NLP model zoo for Russian

CICERO

61 stars · 6 forks

The purpose of this repository is to introduce new dialogue-level commonsense inference datasets and tasks. We chose dialogues as the data source because dialogues are known to be complex and rich in...

ClusterTransformer

39 stars · 13 forks

Topic clustering library built on Transformer embeddings and cosine similarity metrics. Compatible with all BERT-based transformers from Hugging Face.
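The core idea behind this kind of library can be sketched without the library itself: embed each text, compute pairwise cosine similarities, and group items whose similarity exceeds a threshold. The sketch below is not ClusterTransformer's actual API — the function names and the greedy seed-based grouping are illustrative assumptions, and plain NumPy vectors stand in for real Transformer embeddings.

```python
import numpy as np

def cosine_similarity_matrix(embeddings):
    """Pairwise cosine similarity for an (n, d) embedding matrix."""
    unit = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    return unit @ unit.T

def greedy_threshold_clusters(embeddings, threshold=0.8):
    """Assign each item to the first cluster whose seed embedding is within
    the cosine threshold; otherwise start a new cluster with this item."""
    sims = cosine_similarity_matrix(np.asarray(embeddings, dtype=float))
    labels = [-1] * len(embeddings)
    seeds = []  # index of the first member of each cluster
    for i in range(len(embeddings)):
        for label, seed in enumerate(seeds):
            if sims[i, seed] >= threshold:
                labels[i] = label
                break
        else:
            labels[i] = len(seeds)  # no match: open a new cluster
            seeds.append(i)
    return labels

# Two near-parallel vectors fall in one cluster; the orthogonal one does not.
print(greedy_threshold_clusters([[1.0, 0.0], [0.99, 0.1], [0.0, 1.0]]))
```

In practice the embeddings would come from a sentence-level encoder (e.g. mean-pooled hidden states of a BERT-family model), and the threshold would be tuned on held-out data.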

BabyBERTa

15 stars · 7 forks

Source code for the CoNLL 2021 paper by Huebner et al.

Transformers-for-NLP-2nd-Edition

655 stars · 264 forks

Transformer models from BERT to GPT-4, environments from Hugging Face to OpenAI. Fine-tuning, training, and prompt engineering examples. A bonus section with ChatGPT, GPT-3.5-turbo, GPT-4, and DALL-E...

AI_Generated_Text_Checker_App

22 stars · 12 forks

This app classifies text generated by AI tools such as ChatGPT. The Roberta-base-openai-detector model from Hugging Face is used to detect AI-generated text.

Long-texts-Sentiment-Analysis-RoBERTa

24 stars · 7 forks

PyTorch implementation of sentiment analysis of long texts written in Serbian (an underused language), using the pretrained multilingual RoBERTa-based model XLM-R on a small dataset.

Twitter-Sentiment-Analysis-RoBERTa

20 stars · 1 fork

Sentiment analysis of tweets written in underused Slavic languages (Serbian, Bosnian, and Croatian) using the pretrained multilingual RoBERTa-based model XLM-R on two different datasets.

A quantitative study of over 1.25 million tweets about ChatGPT, employing data scraping, data cleaning, EDA, topic modeling, and sentiment analysis.