gpt-2-Pytorch
Help Increasing the amount of training/fine-tuning text to about 10k words
Hello, I am trying to fine-tune the GPT-2 model using your wrapper. I have successfully trained it from a text file, but I would like to fine-tune on a larger corpus of about 10,000 words on a specific topic/domain and then have the model generate 500–1000 words. When I try this, I get a strange error. How do I increase the amount of training/fine-tuning text from the current limit to about 10,000 words?
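One possible cause (an assumption, since the error message isn't shown): GPT-2's context window is limited to 1024 tokens, so feeding a 10,000-word file as a single training example will overflow it. A common workaround is to split the tokenized corpus into fixed-length blocks and fine-tune on those blocks. The sketch below illustrates the chunking step only; `chunk_tokens` and the whitespace tokenizer are illustrative stand-ins for the wrapper's real BPE tokenizer and data pipeline.

```python
# Sketch: split a long token sequence into blocks that fit GPT-2's
# 1024-token context window, so no single training example overflows it.

def chunk_tokens(tokens, block_size=1024):
    """Split a token list into non-overlapping blocks of at most block_size."""
    return [tokens[i:i + block_size] for i in range(0, len(tokens), block_size)]

# Stand-in for a ~10k-word corpus; a real pipeline would BPE-tokenize the file.
corpus = "word " * 10_000
tokens = corpus.split()

blocks = chunk_tokens(tokens, block_size=1024)
print(len(blocks))                  # -> 10 blocks
print(max(len(b) for b in blocks))  # -> 1024 (no block exceeds the context window)
```

Each block can then be fed to the trainer as an independent example; the total corpus size is no longer bounded by the model's context length, only each individual example is.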