Various-Attention-mechanisms
This repository contains various types of attention mechanisms, such as Bahdanau attention, soft attention, additive attention, and hierarchical attention, implemented in PyTorch, TensorFlow, and Keras.
Papers, research and study
Research Paper | Python Code
---|---
Paper | Code
Luong attention and Bahdanau attention
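As a quick illustration of the Bahdanau (additive) scoring mechanism listed above, here is a minimal standalone sketch in PyTorch. This is not code from this repository; the class name, dimensions, and layer layout are hypothetical, chosen only to show the score `v^T tanh(W_q q + W_k k)`:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BahdanauAttention(nn.Module):
    """Additive (Bahdanau) attention: score(q, k) = v^T tanh(W_q q + W_k k).

    Hypothetical minimal sketch for illustration, not the repository's code.
    """
    def __init__(self, query_dim, key_dim, attn_dim):
        super().__init__()
        self.W_q = nn.Linear(query_dim, attn_dim, bias=False)
        self.W_k = nn.Linear(key_dim, attn_dim, bias=False)
        self.v = nn.Linear(attn_dim, 1, bias=False)

    def forward(self, query, keys):
        # query: (batch, query_dim)        -- e.g. a decoder hidden state
        # keys:  (batch, seq_len, key_dim) -- e.g. encoder outputs
        scores = self.v(torch.tanh(self.W_q(query).unsqueeze(1) + self.W_k(keys)))
        weights = F.softmax(scores.squeeze(-1), dim=-1)   # (batch, seq_len)
        context = torch.bmm(weights.unsqueeze(1), keys).squeeze(1)  # (batch, key_dim)
        return context, weights

attn = BahdanauAttention(query_dim=8, key_dim=8, attn_dim=16)
q = torch.randn(2, 8)
k = torch.randn(2, 5, 8)
context, weights = attn(q, k)
print(context.shape, weights.shape)
```

The attention weights for each batch element sum to 1, and the context vector is a weighted average of the encoder outputs.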
I am also working on an attention module for TensorFlow where you can simply import the attention mechanism. Check it out and contribute:
https://github.com/monk1337/Tensorflow-Attention-mechanisms
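For contrast with the additive variant, the Luong (multiplicative) family scores a query against each key with a product; a minimal NumPy sketch using the simplest dot score `q · k_i` is below. The function name and shapes are hypothetical, for illustration only:

```python
import numpy as np

def luong_dot_attention(query, keys):
    """Luong-style multiplicative attention with the plain dot score.

    Hypothetical standalone sketch, not this repository's code.
    query: (d,) vector; keys: (T, d) matrix of T encoder states.
    Returns (context, weights).
    """
    scores = keys @ query                 # (T,) dot score per key
    scores = scores - scores.max()        # subtract max for numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()  # softmax over time steps
    context = weights @ keys              # (d,) weighted sum of keys
    return context, weights

q = np.random.randn(8)
K = np.random.randn(4, 8)
context, weights = luong_dot_attention(q, K)
print(context.shape, weights.shape)  # weights sum to 1
```

Luong's other scoring variants ("general" and "concat") insert a learned weight matrix into the score; the softmax-and-weighted-sum steps stay the same.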
Images source: http://cnyah.com/2017/08/01/attention-variants/