bilstm-attention topic
SST-2-sentiment-analysis
Use BiLSTM-Attention, BERT, ALBERT, RoBERTa, and XLNet models to classify the SST-2 dataset, implemented in PyTorch
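Several of the PyTorch repos in this list use roughly the same BiLSTM-with-attention classifier shape. Purely as orientation (not the code of any specific repo), here is a minimal sketch; the additive-style attention, layer sizes, and masking scheme are assumptions chosen for illustration.

```python
import torch
import torch.nn as nn

class BiLSTMAttention(nn.Module):
    """Minimal BiLSTM + attention sentence classifier (illustrative sketch only)."""
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)   # one score per timestep
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids, mask=None):
        # token_ids: (batch, seq_len); mask: (batch, seq_len), 1 for real tokens
        h, _ = self.lstm(self.embedding(token_ids))       # (batch, seq_len, 2*hidden)
        scores = self.attn(h).squeeze(-1)                 # (batch, seq_len)
        if mask is not None:
            scores = scores.masked_fill(mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1)           # attention over timesteps
        context = (weights.unsqueeze(-1) * h).sum(dim=1)  # weighted sum of LSTM states
        return self.fc(context)                           # (batch, num_classes)

# Toy usage: batch of 2 padded sequences, binary labels as in SST-2
model = BiLSTMAttention(vocab_size=10000)
x = torch.randint(1, 10000, (2, 12))
logits = model(x, mask=torch.ones(2, 12, dtype=torch.long))
print(logits.shape)  # torch.Size([2, 2])
```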
ChineseNRE
Chinese entity relation extraction in PyTorch, using BiLSTM + attention
nlp-notebook
Implementations of common NLP tasks, including new-word discovery, plus PyTorch-based word vectors, Chinese text classification, entity recognition, abstractive text summarization, sentence-similarity judgment, triple extraction, pre-trained models, and more.
NLPGNN
1. Use BERT, ALBERT, and GPT2 as TensorFlow 2.0 layers. 2. Implement GCN, GAN, GIN, and GraphSAGE based on message passing.
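NLPGNN itself builds on TensorFlow 2.0; purely to illustrate the message-passing idea its GNN layers refer to (aggregate each node's neighborhood, then transform), here is a minimal PyTorch sketch kept in the same language as the example above. The mean aggregation and layer sizes are assumptions for illustration, not the repo's implementation.

```python
import torch
import torch.nn as nn

class SimpleGCNLayer(nn.Module):
    """One GCN-style message-passing layer: aggregate neighbor features, then transform."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # x: (num_nodes, in_dim) node features
        # adj: (num_nodes, num_nodes) 0/1 adjacency matrix; self-loops added here
        adj = adj + torch.eye(adj.size(0))
        deg = adj.sum(dim=1, keepdim=True)        # node degrees for normalization
        messages = adj @ x / deg                  # mean over each node's neighborhood
        return torch.relu(self.linear(messages))  # transform aggregated messages

# Toy graph: 4 nodes in a ring, 8-dim features mapped to 16 dims
x = torch.randn(4, 8)
adj = torch.tensor([[0, 1, 0, 1],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [1, 0, 1, 0]], dtype=torch.float)
layer = SimpleGCNLayer(8, 16)
print(layer(x, adj).shape)  # torch.Size([4, 16])
```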
Text-Classification-PyTorch
Implementations of papers for the text classification task on SST-1/SST-2
Text-Classification
PyTorch implementations of several text classification models (HAN, fastText, BiLSTM-Attention, TextCNN, Transformer) | Text classification
multi-head-attention-labeller
Joint text classification on multiple levels with multiple labels, using a multi-head attention mechanism to wire two prediction tasks together.