Sequence-to-Sequence-and-Attention-from-scratch-using-Tensorflow
Sequence to Sequence Model and Attention from Scratch
The goal is to build a sequence-to-sequence LSTM-based model from scratch in TensorFlow, without using any of TensorFlow's existing contrib library implementations. The only part not written from scratch in this project is backpropagation, which is handled by TensorFlow's automatic differentiation.
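To illustrate what "from scratch" means here, the sketch below implements a single LSTM step directly from the gate equations, using NumPy for brevity. This is an illustrative assumption, not the repository's code: the actual project expresses the same math with TensorFlow ops so that backpropagation comes for free, and the variable names (`W`, `b`, gate ordering) are my own choices.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step from the raw gate equations.

    x:      (input_dim,)  current input
    h_prev: (H,)          previous hidden state
    c_prev: (H,)          previous cell state
    W:      (input_dim + H, 4H) combined weight matrix (i, f, o, g blocks)
    b:      (4H,)         combined bias
    """
    H = h_prev.shape[0]
    z = np.concatenate([x, h_prev]) @ W + b
    i = sigmoid(z[:H])        # input gate
    f = sigmoid(z[H:2*H])     # forget gate
    o = sigmoid(z[2*H:3*H])   # output gate
    g = np.tanh(z[3*H:])      # candidate cell update
    c = f * c_prev + i * g    # new cell state
    h = o * np.tanh(c)        # new hidden state
    return h, c
```

In the TensorFlow version, the same computation is written with `tf.matmul`, `tf.sigmoid`, and `tf.tanh` on trainable variables, and unrolled over the sequence for the encoder and decoder.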
A neural attention mechanism was also implemented, and the results with and without attention were compared. This project was done for learning purposes as part of the Deep Learning course by Google (Udacity).
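As a sketch of the attention idea: at each decoder step, the decoder state is scored against every encoder hidden state, the scores are normalized with a softmax, and the encoder states are averaged with those weights to form a context vector. The dot-product scoring below is one common choice and an assumption on my part; the repository may use a different scoring function (e.g. additive/Bahdanau attention).

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(query, encoder_states):
    """Dot-product attention over one sequence.

    query:          (H,)   current decoder hidden state
    encoder_states: (T, H) one hidden state per source time step
    Returns the context vector (H,) and the attention weights (T,).
    """
    scores = encoder_states @ query      # (T,) similarity per source step
    weights = softmax(scores)            # normalize to a distribution
    context = weights @ encoder_states   # weighted sum of encoder states
    return context, weights
```

The context vector is then concatenated with the decoder state (or input) before predicting the next token, letting the decoder focus on different source positions at each step.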