
Hierarchical Attention Networks for Document Classification in Keras

3 issues

Trying to run the network in Google Colab throws an error due to the mask dimensions. Running it on my local machine returns ValueError: Dimensions must be equal, but are 15...

I tried to return the alphas (the attention weights) from the class AttLayer(Layer), but Keras raises the error "you must feed value to sentence_input". Can you help me fix this problem?
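One common way to expose the attention weights is to have the layer's `call` return both the weighted sum and the alphas, instead of fetching the alphas through a separate feed. Below is a minimal, hedged sketch of such a layer (not this repository's exact `AttLayer`; the names `attention_dim`, `W`, `b`, and `u` follow the HAN paper's notation but are assumptions here):

```python
import numpy as np
import tensorflow as tf


class AttLayerWithAlphas(tf.keras.layers.Layer):
    """Additive attention that returns (context_vector, alphas)."""

    def __init__(self, attention_dim=100, **kwargs):
        super().__init__(**kwargs)
        self.attention_dim = attention_dim

    def build(self, input_shape):
        d = input_shape[-1]
        self.W = self.add_weight(name="W", shape=(d, self.attention_dim),
                                 initializer="glorot_uniform")
        self.b = self.add_weight(name="b", shape=(self.attention_dim,),
                                 initializer="zeros")
        self.u = self.add_weight(name="u", shape=(self.attention_dim, 1),
                                 initializer="glorot_uniform")

    def call(self, x):
        # x: (batch, timesteps, features)
        uit = tf.tanh(tf.matmul(x, self.W) + self.b)      # (batch, steps, att_dim)
        ait = tf.squeeze(tf.matmul(uit, self.u), axis=-1)  # (batch, steps)
        alphas = tf.nn.softmax(ait, axis=-1)               # attention weights
        context = tf.reduce_sum(x * tf.expand_dims(alphas, -1), axis=1)
        # Returning alphas as a second output avoids re-feeding inputs later.
        return context, alphas


# usage sketch
layer = AttLayerWithAlphas(attention_dim=8)
x = tf.random.uniform((2, 5, 16))
context, alphas = layer(x)  # context: (2, 16), alphas: (2, 5)
```

Because the alphas are a regular model output, you can include them in a `Model(inputs, [pred, alphas])` and read them from `predict`, rather than evaluating an internal tensor that requires feeding `sentence_input` separately.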

Looking at the TensorFlow code, it seems to me that the TF version has only one BiLSTM encoder (only for words). Is that a correct implementation?
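For reference, the HAN architecture described in the paper uses two recurrent encoders: a word-level encoder applied to each sentence, and a sentence-level encoder applied to the resulting sentence vectors. A minimal sketch of that two-level structure is below (GRU instead of LSTM, attention layers omitted for brevity, and all sizes are hypothetical placeholders):

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

# Hypothetical sizes for illustration only.
MAX_SENTS, MAX_WORDS, VOCAB, EMB_DIM, N_CLASSES = 10, 20, 1000, 32, 5

# Word-level encoder: encodes one sentence (a sequence of word ids).
word_in = layers.Input(shape=(MAX_WORDS,), dtype="int32")
emb = layers.Embedding(VOCAB, EMB_DIM)(word_in)
word_vec = layers.Bidirectional(layers.GRU(16))(emb)
word_encoder = Model(word_in, word_vec)

# Sentence-level encoder: applies the word encoder to every sentence,
# then runs a second BiGRU over the sentence vectors.
doc_in = layers.Input(shape=(MAX_SENTS, MAX_WORDS), dtype="int32")
sent_vecs = layers.TimeDistributed(word_encoder)(doc_in)
doc_vec = layers.Bidirectional(layers.GRU(16))(sent_vecs)
out = layers.Dense(N_CLASSES, activation="softmax")(doc_vec)
model = Model(doc_in, out)
```

If the TF implementation in question really applies only the word-level BiLSTM, it is missing the second, sentence-level encoder that makes the network hierarchical.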