xlnet
Fine-tuning on text generation tasks?
Has anyone tried to fine-tune XLNet on a text generation task?
Any example script? How are the results?
Yes https://medium.com/@amanrusia/xlnet-speaks-comparison-to-gpt-2-ea1a4e9ba39e
Thanks for the answer @edanweis.
However, this is not a fine-tuned example: the author just took a pretrained checkpoint and generated text from it, without any fine-tuning. That's one of the reasons they had to generate tokens one by one, which is very slow.
If we could fine-tune it on a downstream task, maybe the model could learn to generate more tokens at a time.
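To illustrate the slowness being discussed: a plain pretrained LM generates autoregressively, so each new token costs a full forward pass over the growing prefix. This is a minimal sketch of that loop, not the Medium post's actual code; `next_token` is a hypothetical stand-in for a real model call (e.g. an XLNet forward pass plus an argmax over the vocabulary):

```python
def next_token(prefix):
    # Hypothetical stand-in for a real "predict next token" step;
    # in practice this would be one full model forward pass.
    return prefix[-1] + 1

def generate(prompt, n_new_tokens):
    tokens = list(prompt)
    for _ in range(n_new_tokens):
        # One model call per generated token: this loop is the bottleneck
        # when sampling from a checkpoint that wasn't fine-tuned to emit
        # longer continuations in fewer steps.
        tokens.append(next_token(tokens))
    return tokens

print(generate([1, 2, 3], 4))  # [1, 2, 3, 4, 5, 6, 7]
```

Generating N tokens therefore takes N model calls over ever-longer inputs, which is why the linked experiment is slow.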
@kimiyoung, any update on this?