Sequence-to-Sequence-and-Attention-from-scratch-using-Tensorflow
Sequence-to-sequence and attention from scratch using TensorFlow
Issues (2)
My sequences look like this: `input_seq = [3, 5, 4, 1, 6, 3]`, `output_seq = [4, 2, 5, 3, 6]`, and `input_seq` is of variable length. I don't see how to edit your code for this case.
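A common way to handle this (not from this repo's code) is to pad every sequence to a shared maximum length and mask the padded positions out of the loss and the attention weights. A minimal sketch, assuming 0 is reserved as the padding id:

```python
import tensorflow as tf

# Variable-length sequences, padded to a common max length.
input_seqs = [[3, 5, 4, 1, 6, 3], [4, 2, 5]]
lengths = [len(s) for s in input_seqs]                  # [6, 3]
max_len = max(lengths)

padded = tf.constant([s + [0] * (max_len - len(s)) for s in input_seqs])
# mask is 1.0 at real tokens and 0.0 at padding; multiply the
# per-timestep loss by `mask` so pad positions contribute nothing.
mask = tf.sequence_mask(lengths, maxlen=max_len, dtype=tf.float32)
# padded: [batch, max_len] int32, mask: [batch, max_len] float32
```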
Thanks for the clean and easy-to-read code! I think there might be a bug in the soft attention module:

```python
eij = tf.tanh(unrol_states)
# Softmax across the unrolling dimension
softmax = tf.nn.softmax(eij, dim=1)
context = tf.reduce_sum(tf.multiply(softmax, unrol_states), axis=1)
...
```
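For comparison, the standard soft-attention formulation computes a single scalar alignment score per timestep (via a learned parameter) before the softmax, and then takes the weighted sum over the original encoder states rather than the tanh-squashed ones. A minimal sketch, assuming `unrol_states` has shape `[batch, time, hidden]`; `attn_vec` is a hypothetical learned weight vector, not a name from this repo:

```python
import tensorflow as tf

def soft_attention(unrol_states, attn_vec):
    """unrol_states: [batch, time, hidden]; attn_vec: [hidden], learned."""
    # Squash each state, then project it to one scalar score per timestep.
    eij = tf.tanh(unrol_states)                        # [batch, time, hidden]
    scores = tf.reduce_sum(eij * attn_vec, axis=2)     # [batch, time]
    # Softmax across the unrolling (time) dimension so weights sum to 1.
    alphas = tf.nn.softmax(scores, axis=1)             # [batch, time]
    # Weighted sum of the ORIGINAL states, not the tanh-squashed ones.
    context = tf.reduce_sum(unrol_states * alphas[:, :, None], axis=1)
    return context, alphas                             # context: [batch, hidden]
```

Note also that the `dim=` argument of `tf.nn.softmax` was renamed to `axis=` in later TensorFlow releases, so the snippet above would need that change to run on current versions.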