bilstm-attention topic

List bilstm-attention repositories

SST-2-sentiment-analysis

82 Stars · 16 Forks

Uses BiLSTM-Attention, BERT, ALBERT, RoBERTa, and XLNet models to classify the SST-2 dataset, implemented in PyTorch.
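For readers unfamiliar with the pattern these repositories share, the following is a minimal sketch of a BiLSTM-with-attention classifier in PyTorch. It is illustrative only — class names, dimensions, and the additive attention variant are assumptions, not code from any of the listed repos.

```python
# Minimal BiLSTM + attention text classifier (illustrative sketch, not the repo's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class BiLSTMAttention(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        # A single learned scorer assigns one attention score per time step.
        self.attn = nn.Linear(2 * hidden_dim, 1)
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids, mask=None):
        # token_ids: (batch, seq_len); mask: 1 for real tokens, 0 for padding
        emb = self.embedding(token_ids)                        # (B, T, E)
        outputs, _ = self.lstm(emb)                            # (B, T, 2H)
        scores = self.attn(torch.tanh(outputs)).squeeze(-1)    # (B, T)
        if mask is not None:
            scores = scores.masked_fill(mask == 0, float("-inf"))
        weights = F.softmax(scores, dim=-1)                    # attention over time steps
        context = torch.bmm(weights.unsqueeze(1), outputs).squeeze(1)  # (B, 2H)
        return self.fc(context)                                # (B, num_classes)

# Toy usage: SST-2-style binary classification
model = BiLSTMAttention(vocab_size=10000)
ids = torch.randint(1, 10000, (4, 20))
logits = model(ids)   # (4, 2) class logits
```

The attention step replaces the usual "take the last hidden state" pooling: the classifier sees a weighted sum of all BiLSTM outputs, so informative tokens anywhere in the sentence can dominate the sentence representation.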

ChineseNRE

737 Stars · 177 Forks

Chinese entity relation extraction, PyTorch, BiLSTM+attention.

nlp-notebook

485 Stars · 106 Forks

Implementations of common NLP tasks, including new-word discovery, plus PyTorch-based word vectors, Chinese text classification, named entity recognition, abstractive summarization, sentence similarity, triple extraction, pretrained models, and more.

NLPGNN

331 Stars · 64 Forks

1. Use BERT, ALBERT, and GPT-2 as TensorFlow 2.0 layers. 2. Implement GCN, GAN, GIN, and GraphSAGE based on message passing.

Text-Classification-PyTorch

61 Stars · 11 Forks

Implementations of papers on the text classification task for SST-1/SST-2.

Text-Classification

135 Stars · 29 Forks

PyTorch implementation of several text classification models (HAN, fastText, BiLSTM-Attention, TextCNN, Transformer) | text classification.

multi-head-attention-labeller

16 Stars · 3 Forks

Joint text classification on multiple levels with multiple labels, using a multi-head attention mechanism to wire two prediction tasks together.
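One way to "wire" token-level and sentence-level prediction together is to give the model one attention head per label and let each head's per-token scores serve both as token-level logits and as the weights that pool a label-specific sentence representation. The sketch below illustrates that general idea only; the class name, layer sizes, and the exact coupling are assumptions, not the repository's implementation.

```python
# Rough sketch of joint token/sentence labelling via label-specific attention heads
# (illustrative only; not the repo's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class JointAttentionLabeller(nn.Module):
    def __init__(self, vocab_size, num_labels, embed_dim=100, hidden_dim=100):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        # One attention head per label: its per-token scores double as token logits.
        self.head_scores = nn.Linear(2 * hidden_dim, num_labels)
        self.sentence_out = nn.Linear(2 * hidden_dim, 1)

    def forward(self, token_ids):
        h, _ = self.lstm(self.embedding(token_ids))          # (B, T, 2H)
        token_logits = self.head_scores(h)                   # (B, T, L) token-level labels
        attn = F.softmax(token_logits, dim=1)                 # per-label attention over tokens
        pooled = torch.einsum("btl,bth->blh", attn, h)        # (B, L, 2H) label-specific vectors
        sentence_logits = self.sentence_out(pooled).squeeze(-1)  # (B, L) sentence-level labels
        return token_logits, sentence_logits
```

Sharing the attention scores between the two tasks is the coupling: a token scored highly for label l also contributes more to the sentence-level prediction for l, so supervision at either level constrains the other.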