HeterSumGraph
Code for ACL2020 paper "Heterogeneous Graph Neural Networks for Extractive Document Summarization"
Rouge
@dqwang122 Thanks for the great repo! I tested with the Multi-News dataset and got scores from evaluate.py, but when I run the code, the scores are very different from the ones published in your paper. ||R1|R2|RL|...
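Score mismatches like this often come down to the ROUGE implementation itself (tokenization, stemming, sentence splitting). As a point of reference, here is a toy ROUGE-N F1 over whitespace tokens — illustrative only; the repo evaluates with pyrouge, whose preprocessing can yield different numbers:

```python
from collections import Counter

def rouge_n(candidate, reference, n=1):
    """Toy ROUGE-N F1 on whitespace tokens (no stemming, no sentence
    splitting) -- NOT equivalent to the official pyrouge scores."""
    def ngrams(tokens, n):
        return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))
    c = ngrams(candidate.split(), n)
    r = ngrams(reference.split(), n)
    if not c or not r:
        return 0.0
    overlap = sum((c & r).values())          # clipped n-gram matches
    p = overlap / sum(c.values())            # precision
    rec = overlap / sum(r.values())          # recall
    return 0.0 if p + rec == 0 else 2 * p * rec / (p + rec)

score = rouge_n("the cat sat on the mat", "the cat is on the mat")
# 5 of 6 unigrams overlap on both sides, so F1 = 5/6
```

Comparing numbers across papers is only meaningful when the same ROUGE toolkit and preprocessing are used on both sides.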
Hello, and thank you for releasing your code. The implementation seems to differ from my understanding of GAT, so I'd like to ask about it. When computing the edge attention weight between a word and a sentence with GAT, the usual approach is to concatenate the embeddings of the edge's head and tail nodes and then apply a transformation. In your implementation, however, when the node is a word, the vector passed in is [0, 0, 0, 0, 0, 0, 0, 0]. Is this intentional, or an oversight in the implementation? Thanks for your reply!
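For context, the concatenation the question describes is the standard GAT edge score from Veličković et al. (2018). A minimal NumPy sketch (names and dimensions are illustrative, not taken from the HeterSumGraph code) shows what happens when the head-node feature is all zeros:

```python
import numpy as np

rng = np.random.default_rng(0)
d, h = 8, 4
W = rng.normal(size=(h, d))   # shared linear transform
a = rng.normal(size=2 * h)    # attention vector

def edge_score(h_head, h_tail):
    # e_ij = LeakyReLU(a^T [W h_i || W h_j]), standard GAT attention
    z = np.concatenate([W @ h_head, W @ h_tail])
    s = float(a @ z)
    return s if s > 0 else 0.2 * s   # LeakyReLU with slope 0.2

word = np.zeros(d)            # a zero word-node feature, as in the question
sent = rng.normal(size=d)
score = edge_score(word, sent)
# With h_head all zeros, the first half of z is zero, so the score
# depends only on the sentence node -- the word side contributes nothing.
```

So if the word vector really is all zeros, the word half of the concatenation is a no-op, which is exactly why it is worth asking whether this is deliberate.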
Hi Danqing, thank you for sharing the clean and nice code. I would like to know why the Rouge-L scores are much higher than the results from other papers. Is it because...
@brxx122 Hello: I downloaded the CNN dataset you provided and ran the command python train.py --cuda --gpu 0 --data_dir ./data/middledata_2/ --cache_dir ./cache/cnn --embedding_path ./embedding_dir/glove.42B.300d.txt --model HSG --save_root ./data/model_path --log_root ./log --lr_descent --grad_clip -m 3, without changing anything else, but it raises a KeyError on 'sh'. I have searched for a long time without finding the cause. Could you help? The detailed error message is: result = self.forward(*input, **kwargs) File "/data/cxx/program/extractivemethod/heterogeneousgraph/module/GATLayer.py",...
It seems that in the original CNN/DM dataset, there is no field named **label**. How can I get the corresponding label for each document? Thanks :)
I have a problem with the pretrained model: Using backend: pytorch 2021-05-17 04:28:43,255 INFO : Pytorch 1.8.1+cu101 2021-05-17 04:28:43,256 INFO : [INFO] Create Vocab, vocab path is /content/drive/MyDrive/HeterSumGraph/cache/MultiNews/vocab 2021-05-17 04:28:43,310...
Hi, thanks for your contribution. I want to do multi-document summarization with a dataset I prepared myself. Can you provide more details on how to create the input dataset? Thank...
Hi, since I am constructing my own dataset, I want to confirm: the numbers here indicate which sentences of the text are used as the summary, right?
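For anyone building their own data, my understanding is that the repo expects JSON-lines input where each instance carries the sentence-split text, the reference summary, and a label field with the indices of the extractive-oracle sentences. A sketch of one such line (field names per the repo's format; the values are made up):

```python
import json

# Hypothetical training instance: "label" lists the 0-based indices of
# the sentences selected as the extractive summary.
example = {
    "text": ["first sentence .", "second sentence .", "third sentence ."],
    "summary": ["second sentence ."],
    "label": [1],
}

line = json.dumps(example)        # one instance per line in the .jsonl file
oracle = [example["text"][i] for i in json.loads(line)["label"]]
```

Here `oracle` recovers `["second sentence ."]`, i.e. the sentences the label indices point at.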
I tried the links you provide, which say that "NYT (The New York Times Annotated Corpus) is only available from [LDC](https://catalog.ldc.upenn.edu/LDC2008T19). And we follow the [preprocessing code](https://github.com/gregdurrett/berkeley-doc-summarizer) of Durrett et al. (2016)...