self-attentive-rnn repositories
neat-vision — 249 stars, 24 forks
Neat (Neural Attention) Vision is a framework-agnostic visualization tool for the attention mechanisms of deep-learning models for Natural Language Processing (NLP) tasks.
Structured-Self-Attention — 494 stars, 110 forks
A Structured Self-attentive Sentence Embedding
Self-Attentive-tensorflow — 193 stars, 39 forks
Tensorflow implementation of "A Structured Self-Attentive Sentence Embedding"
Structured-Self-Attentive-Sentence-Embedding — 24 stars, 1 fork
Re-implementation of "A Structured Self-Attentive Sentence Embedding" (Lin et al., 2017)
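All four repositories implement or visualize the structured self-attention mechanism from "A Structured Self-Attentive Sentence Embedding" (Lin et al., 2017). As a minimal NumPy sketch of that mechanism (the dimensions and random weights here are illustrative placeholders, not values from any of the listed repositories):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)

# Illustrative sizes: sequence length n, LSTM hidden size u,
# attention size d_a, number of attention hops r
n, u, d_a, r = 6, 8, 10, 3

H = rng.standard_normal((n, 2 * u))       # BiLSTM hidden states, shape (n, 2u)
W_s1 = rng.standard_normal((d_a, 2 * u))  # first attention weight matrix
W_s2 = rng.standard_normal((r, d_a))      # second attention weight matrix (r hops)

# A = softmax(W_s2 @ tanh(W_s1 @ H^T)): r attention distributions over positions
A = softmax(W_s2 @ np.tanh(W_s1 @ H.T), axis=-1)  # shape (r, n)

# M = A @ H: the structured sentence embedding, one weighted sum per hop
M = A @ H  # shape (r, 2u)

# Penalization term ||A A^T - I||_F^2 from the paper, which encourages
# the r hops to attend to different parts of the sentence
penalty = np.sum((A @ A.T - np.eye(r)) ** 2)
```

In the paper, `penalty` is added to the downstream task loss during training; here it is only computed to show the term's form.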