CommonsenseStoryGen
Implementation for paper "A Knowledge-Enhanced Pretraining Model for Commonsense Story Generation"
Hi, thank you for sharing this interesting work. I am trying to use your code to fine-tune the GPT-2 model with the knowledge sentences you provided, but meet a...
Or the code to "transform the commonsense triples in ConceptNet and ATOMIC into readable natural language sentences using a template-based method"? I find your work really interesting. Thanks.
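As a rough illustration of the template-based verbalization asked about above, here is a minimal sketch. The relation names and template strings below are assumptions for illustration only, not the authors' actual mapping:

```python
# Hypothetical templates mapping a triple's relation to a sentence pattern.
# These example relations/templates are illustrative, not the paper's.
TEMPLATES = {
    "IsA": "{head} is a {tail}",
    "UsedFor": "{head} is used for {tail}",
    "CapableOf": "{head} can {tail}",
}

def verbalize(head, relation, tail):
    """Turn a (head, relation, tail) triple into a natural-language sentence."""
    template = TEMPLATES.get(relation)
    if template is None:
        return None  # unknown relation: skip, or add a template for it
    return template.format(head=head, tail=tail) + "."

print(verbalize("knife", "UsedFor", "cutting"))  # knife is used for cutting.
```

The resulting sentences could then be concatenated into the knowledge-augmented training corpus.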
Hi, could you please share the code for preprocessing the dataset, i.e. the part that performs the delexicalization? This is to apply your approach to another dataset. Thank you!
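For reference, delexicalization in this kind of preprocessing typically replaces surface names with indexed placeholders so the model generalizes across stories. A minimal sketch, where the placeholder format (`[NAME0]`, `[NAME1]`, ...) and the toy name list are assumptions, not the authors' actual scheme:

```python
# Hypothetical delexicalization: replace character names with indexed
# placeholders in first-occurrence order. Placeholder format is assumed.
def delexicalize(text, names):
    mapping = {}
    for name in names:
        if name in text and name not in mapping:
            mapping[name] = f"[NAME{len(mapping)}]"
    for name, placeholder in mapping.items():
        text = text.replace(name, placeholder)
    return text, mapping

story = "Anna met Ben at the park. Anna smiled."
delexed, mapping = delexicalize(story, ["Anna", "Ben"])
print(delexed)  # [NAME0] met [NAME1] at the park. [NAME0] smiled.
```

The `mapping` is kept so placeholders can be re-lexicalized after generation.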
In your paper, you mentioned that you "introduce the knowledge to the pretrained language model by **post-training** on knowledge-augmented data." In my opinion, **post-training** is different from fine-tuning. According to paper...
Would you be willing to share the evaluation code for calculating perplexity, BLEU scores, coverage, and BR/LR? Thanks!
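For the perplexity part of the evaluation asked about above, a minimal sketch of corpus-level perplexity from per-token log probabilities (natural log). The toy log-prob values are made up for illustration, and this does not claim to match the paper's exact evaluation script:

```python
import math

def perplexity(token_logprobs):
    """Corpus perplexity: exp of the negative mean token log-likelihood.

    token_logprobs is a list of sequences, each a list of natural-log
    probabilities assigned by the model to the reference tokens.
    """
    n = sum(len(seq) for seq in token_logprobs)
    total = sum(lp for seq in token_logprobs for lp in seq)
    return math.exp(-total / n)

# Toy example: two 2-token sequences with mean log-prob -1.5.
print(round(perplexity([[-1.0, -2.0], [-1.5, -1.5]]), 3))  # 4.482
```

BLEU and the coverage/BR/LR metrics would need the references and the knowledge triples, so those are best left to the authors' own scripts.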
Any chance you could share the pretrained model and ablation weights from the paper? Training to replicate the paper is quite slow on my machine.