creative-writing-with-gpt2
Single model
https://www.reddit.com/r/MachineLearning/comments/ebf7ek/p_i_gave_a_workshop_on_writing_creatively_with/
One suggestion: instead of training separate finetuned models (one for Alan Watts, one for the Bible, one for Tolkien, etc.), train a single model on a dataset with inline metadata so you can control the author from the prompt. This works well with char-RNNs, GPT-2, CTRL, and similar models. It reduces bandwidth/storage, makes life easier for the user (want to work with a different author? Just change your prompt instead of downloading a whole new model and starting a new session), saves compute during finetuning, can give better-quality results if the authors are similar, and scales to as many authors as you want to support.
My understanding is that this would mean creating a dataset like:

AUTH=alan-watts,TEXT=text to train on
AUTH=bible,TEXT=more text to train on
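To make the idea concrete, here is a minimal sketch of how such a combined dataset could be assembled. The `AUTH=.../TEXT=...` tag format follows the example above; the corpus contents, function names, and the choice of `<|endoftext|>` as a record separator (GPT-2's usual end-of-text token) are illustrative assumptions, not a prescribed pipeline.

```python
# Sketch: merge several per-author corpora into one finetuning file,
# prefixing each example with inline author metadata so a single model
# learns to condition on the author named in the prompt.

# Hypothetical example corpora; in practice these would be loaded from files.
corpora = {
    "alan-watts": ["The menu is not the meal."],
    "bible": ["In the beginning was the Word."],
    "tolkien": ["Not all those who wander are lost."],
}

def make_record(author: str, text: str) -> str:
    # One training example: metadata prefix, then the text,
    # terminated with GPT-2's end-of-text token.
    return f"AUTH={author},TEXT={text}<|endoftext|>"

def build_dataset(corpora: dict[str, list[str]]) -> str:
    # Concatenate every (author, text) pair into one training corpus.
    lines = []
    for author, texts in corpora.items():
        for text in texts:
            lines.append(make_record(author, text))
    return "\n".join(lines)

dataset = build_dataset(corpora)
print(dataset.splitlines()[0])
```

At generation time you would condition on the author with the same prefix, e.g. prompting the finetuned model with `AUTH=tolkien,TEXT=` and letting it continue in that style.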