TAKG
The official implementation of ACL 2019 paper "Topic-Aware Neural Keyphrase Generation for Social Media Language"
Hi, thanks for sharing. The code is high-quality and clear, but when I try to use your model with my own dataset, I find that the ntm_loss is very large....
How can I get the Twitter dataset?
Question about sparsity
Hello, I have a question: the x_bow input to the model has a length equal to the whole vocabulary, so it is very sparse. Is it fed in directly as-is, or are there any tricks for handling this part? How much does the sparsity matter?
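For context, the bag-of-words input of a neural topic model is typically just a dense count vector over the vocabulary, with most entries zero. A minimal sketch of what such an x_bow vector might look like, assuming PyTorch; the helper build_bow and the example numbers are illustrative, not taken from this repository:

```python
import torch

def build_bow(token_ids, vocab_size):
    """Turn a list of token ids into a dense bag-of-words count vector.
    Most entries are zero, which is the sparsity the question refers to."""
    bow = torch.zeros(vocab_size)
    for idx in token_ids:
        bow[idx] += 1.0
    return bow

# Example: a short post over a 50,000-word vocabulary.
x_bow = build_bow([12, 7, 7, 301], vocab_size=50000)
print(x_bow.shape)             # torch.Size([50000])
print(int((x_bow > 0).sum()))  # 3 -- only a handful of non-zero entries
```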
Thank you very much for providing such a great model! Sorry to bother you: we want to extract keyphrases from web pages returned by Google search, and we would like to ask about the Weibo training set. How were its keyphrases annotated? Could you share your annotation procedure or the method used to obtain the keyphrases? Thanks!
Hi, I see we can train in one2one mode and evaluate in one2many mode. How can I train in one2many mode? It seems a lot of code needs to be changed to...
When I use pred_evaluate.py to evaluate a prediction, something goes wrong.
The command: python pred_evaluate.py -pred pred\predict__Weibo_s100_t10.copy.seed9527.emb150.vs50000.dec300.20200517-170932__e4.val_loss=5.464.model-0h-03m/predictions.txt -src data/Weibo/test_src.txt -trg data/Weibo/test_trg.txt
The error: UnicodeDecodeError: 'gbk' codec can't decode byte 0xb4 in...
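The 'gbk' error usually means the files are being read with the Windows default locale encoding instead of UTF-8. A minimal sketch of the usual workaround, assuming the prediction and reference files are UTF-8 text; the helper read_utf8_lines is illustrative and not part of pred_evaluate.py:

```python
def read_utf8_lines(path):
    # Pass the encoding explicitly so Python does not fall back to the
    # system locale (gbk on Chinese-language Windows).
    with open(path, "r", encoding="utf-8") as f:
        return [line.strip() for line in f]

preds = read_utf8_lines("predictions.txt")  # path to the prediction file
srcs = read_utf8_lines("data/Weibo/test_src.txt")
trgs = read_utf8_lines("data/Weibo/test_trg.txt")
```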