self-instruct
Aligning pretrained language models with instruction data generated by themselves.
Sorry, I may not have looked through your code carefully, but could you please show me where you put the ROUGE-L implementation? Thanks.
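For context, the Self-Instruct pipeline uses ROUGE-L similarity to filter out newly generated instructions that are too close to existing ones. A minimal pure-Python sketch of the LCS-based ROUGE-L F-measure follows; the function names and the threshold usage note are illustrative, not the repository's actual API:

```python
def lcs_length(a, b):
    """Length of the longest common subsequence of two token lists."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            dp[i + 1][j + 1] = dp[i][j] + 1 if x == y else max(dp[i][j + 1], dp[i + 1][j])
    return dp[len(a)][len(b)]

def rouge_l_f1(candidate: str, reference: str) -> float:
    """ROUGE-L F-measure over whitespace tokens (illustrative helper)."""
    cand, ref = candidate.split(), reference.split()
    lcs = lcs_length(cand, ref)
    if lcs == 0:
        return 0.0
    precision = lcs / len(cand)
    recall = lcs / len(ref)
    return 2 * precision * recall / (precision + recall)

# In the paper, a new instruction is kept only if its ROUGE-L score
# against every existing instruction stays below a threshold.
score = rouge_l_f1("write a story about cats", "write a story about a dog")
```

In production one would likely use the `rouge_score` package instead of hand-rolling the metric; the sketch above is just to show what is being computed.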
Is there any reason for using 6 human-written and 2 model-generated instructions as context? Do these two hyperparameters, and also the diversity of the sampled instruction types (e.g., NER, classification, generation), make any difference?
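For readers unfamiliar with the setup being asked about: each generation query prompts the model with 8 in-context examples, 6 sampled from the human-written seed tasks and 2 from previously model-generated ones. A hedged sketch of how such a prompt might be assembled (variable and function names are assumptions, not the repository's code):

```python
import random

def build_generation_prompt(seed_tasks, generated_tasks,
                            num_seed=6, num_generated=2):
    """Assemble an in-context prompt: sample human-written and
    model-generated instructions, shuffle them, and leave a blank
    task slot for the model to continue from."""
    examples = random.sample(seed_tasks, num_seed)
    examples += random.sample(generated_tasks, num_generated)
    random.shuffle(examples)
    lines = ["Come up with a series of tasks:"]
    lines += [f"Task {i + 1}: {inst}" for i, inst in enumerate(examples)]
    lines.append(f"Task {len(examples) + 1}:")  # model fills in this slot
    return "\n".join(lines)
```

The 6/2 split is itself one of the hyperparameters the question is about; mixing in a couple of model-generated tasks is presumably meant to promote diversity beyond the seed pool.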
As I understand Figure 5 in your paper, you further fine-tuned GPT-SelfInstruct on the SuperNatural Instructions data, and surprisingly the results got worse compared to the "vanilla" GPT-SelfInstruct. Is my...
At the end of 2023, the GPT completions endpoint was deprecated, along with the engine functionality and the models that supported it. Are you planning to update the repository? Because at the moment...