
Relation Classification via Multi-Level Attention CNNs

8 acnn issues, sorted by recently updated

Please briefly introduce this dataset, including the meaning of the number in front of each sentence.
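
For anyone with the same question: the paper evaluates on SemEval-2010 Task 8, where each sentence in the training file is preceded by its example ID. A minimal parsing sketch is below; the file name is the dataset's usual one, but treating it as this repo's layout is an assumption.

```python
# Hedged sketch: parse one SemEval-2010 Task 8 training line, assuming
# the standard "ID<TAB>"sentence"" format. TRAIN_FILE.TXT is the
# dataset's usual file name, not necessarily this repo's.
def parse_line(line):
    # e.g.: 1\t"The <e1>system</e1> ... <e2>configuration</e2>."
    sent_id, sentence = line.rstrip("\n").split("\t", 1)
    return int(sent_id), sentence.strip('"')

with open("TRAIN_FILE.TXT") as f:
    sent_id, sentence = parse_line(f.readline())
    print(sent_id, sentence)
```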

Hello, how was the senna embedding in this reimplementation obtained?
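
For reference, the SENNA toolkit ships its 50-dimensional word vectors as two parallel files, `hash/words.lst` and `embeddings/embeddings.txt`. A minimal loading sketch, assuming the standard SENNA distribution layout (whether this repo built its embedding that way is an assumption):

```python
import numpy as np

# Hedged sketch: load SENNA embeddings, assuming the standard SENNA
# distribution (one word per line in words.lst, one 50-dim vector per
# line in embeddings.txt, in the same order).
def load_senna(words_path="hash/words.lst",
               emb_path="embeddings/embeddings.txt"):
    with open(words_path) as f:
        words = [w.strip() for w in f]
    vectors = np.loadtxt(emb_path)          # shape: (vocab_size, 50)
    assert len(words) == vectors.shape[0]
    return dict(zip(words, vectors))

embeddings = load_senna()
```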

Is this a working implementation? Does it use TensorFlow?

I think the attention in the pooling layer tries to capture the relation between the i-th word and the j-th class, so U should have shape (d^context (after conv), d^class_embedding), but the...
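
To make the shape argument concrete: in the paper's attention-based pooling, the correlation matrix is G = R*^T U W^L, where R* is the convolution output and W^L holds the class embeddings, so U has to map the post-convolution dimension to the class-embedding dimension. A shape-only sketch (all dimension sizes below are illustrative, not the repo's settings):

```python
import numpy as np

# Shape-only sketch of the paper's pooling attention G = R*^T U W^L;
# sizes are hypothetical, not the repo's configuration.
n, d_c, d_class, n_rel = 80, 1000, 80, 19
R_star = np.random.randn(d_c, n)        # conv output, one column per position
U = np.random.randn(d_c, d_class)       # (d^context, d^class_embedding)
W_L = np.random.randn(d_class, n_rel)   # class (relation) embeddings
G = R_star.T @ U @ W_L                  # (n, n_rel): word-vs-class scores
A = np.exp(G) / np.exp(G).sum(axis=0)   # column-wise softmax over words
```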

Hello, I chose the attention_pooling model when running main.py, but my results differ greatly from yours: the train accuracy is very high while the test accuracy is very low. My...

Hi, in the function `pos` in `utils.py`, the relative distance of words is mapped to [0, 123). Why was 123 chosen? Is it related to the maximum sentence length of the data?...
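
For illustration, a common way to build such a position feature is to clip the relative distance to a fixed window and shift it to be non-negative: distances in [-61, 61] give exactly 123 distinct indices in [0, 123). The sketch below shows that pattern; it is a guess at the idea, not the repo's actual `pos` code.

```python
# Hedged sketch of a relative-position feature, NOT the repo's `pos`:
# clip the distance to [-61, 61], then shift by 61 so the result lies
# in [0, 122], i.e. 123 possible position-embedding indices.
def pos(distance, max_dist=61):
    clipped = max(-max_dist, min(max_dist, distance))
    return clipped + max_dist

print(pos(-100), pos(0), pos(100))   # -> 0 61 122
```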

I noticed that in the log file, the "baseline + attentive pooling" version achieves: 05-10 21:12 Epoch: 21 Train: 94.81% Test: 75.19%. What are the model configurations in detail...

Hi, my implementation is similar to yours. In the input attention layer, I apply a convolution with kernel size 3 first and then multiply by the attention. I didn't see a mathematical...
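
To see why the ordering matters, here is a small illustrative comparison of "attention then convolution" (the paper's input attention) versus "convolution then attention" (the variant described above). All names and shapes are made up for the example; none of this is the repo's code.

```python
import numpy as np

# Illustrative 1-D comparison of the two orderings; purely a sketch.
def conv1d(x, k=3):
    # valid convolution with an all-ones kernel of width k
    return np.array([x[i:i + k].sum() for i in range(len(x) - k + 1)])

x = np.arange(6, dtype=float)        # token features (1-D for simplicity)
alpha = np.linspace(0.1, 1.0, 6)     # per-token input attention weights

attn_then_conv = conv1d(x * alpha)               # paper-style: weight inputs first
conv_then_attn = conv1d(x) * alpha[:len(x) - 2]  # convolve first, weight after

print(attn_then_conv)
print(conv_then_attn)
# The outputs differ: element-wise attention scaling does not commute
# with convolution, so the two orderings are not equivalent (and which
# attention weights to apply post-convolution is itself ambiguous).
```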