gpt-2-Pytorch
Simple text generator with an OpenAI GPT-2 PyTorch implementation
Train
Is there any way to train GPT-2 using my own text corpus?
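A hedged sketch of how such fine-tuning typically looks in PyTorch follows. `ToyLM` is only a stand-in module, not this repository's GPT-2 class, and names such as `finetune` and the batch format are illustrative assumptions; the shifted next-token cross-entropy loop is the standard pattern.

```python
# Sketch: fine-tuning a causal language model on a custom corpus.
# ToyLM is a hypothetical stand-in for a GPT-2 style model.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ToyLM(nn.Module):
    """Stand-in causal LM: embedding -> linear head over the vocabulary."""

    def __init__(self, vocab_size=50257, n_embd=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, n_embd)
        self.head = nn.Linear(n_embd, vocab_size)

    def forward(self, input_ids):
        return self.head(self.embed(input_ids))  # (batch, seq, vocab)


def finetune(model, batches, epochs=1, lr=1e-4):
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for input_ids in batches:  # (batch, seq) token ids from your corpus
            logits = model(input_ids)
            # Next-token prediction: shift logits and targets by one position.
            loss = F.cross_entropy(
                logits[:, :-1].reshape(-1, logits.size(-1)),
                input_ids[:, 1:].reshape(-1),
            )
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()


if __name__ == "__main__":
    model = ToyLM()
    fake_batch = torch.randint(0, 50257, (2, 16))  # replace with encoded corpus
    finetune(model, [fake_batch])
```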
These packages are also needed, so I think they should go into requirements.txt: torch and tqdm.
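For reference, a minimal `requirements.txt` covering the two packages mentioned here might look like this (versions left unpinned as an assumption):

```
torch
tqdm
```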
Very interesting
I have been using the GPT-2 implementation from your repository and noticed that the size of the smallest GPT-2 model it provides differs from the smallest model...
Thank you for this project! It has been very helpful for understanding how GPT-2 synthesizes text. I also noticed that [`GPT2/encoder.py`](https://github.com/graykode/gpt-2-Pytorch/blob/master/GPT2/encoder.py) does not implement the capability of recognizing...
"Hi, I am reading the GPT-2 paper and encountering a problem with the following phrase related to implementation: 'A modified initialization method is used to account for the accumulation on...
# Add Python Testing Infrastructure

## Summary

This PR establishes a comprehensive testing infrastructure for the GPT-2 PyTorch implementation project, using Poetry as the package manager and pytest as the...
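Below is a hedged sketch of the kind of pytest module such an infrastructure might start with. The `GPT2.model` / `GPT2.config` import paths and the `GPT2LMHeadModel` / `GPT2Config` names are assumptions about this repository's layout, so `pytest.importorskip` skips the test cleanly if they do not resolve.

```python
import pytest

# Assumed module paths; the test is skipped if the repo layout differs.
gpt2_model = pytest.importorskip("GPT2.model")
gpt2_config = pytest.importorskip("GPT2.config")


def test_model_builds_from_default_config():
    # Assumes GPT2Config() has usable defaults and GPT2LMHeadModel(config)
    # is the constructor, mirroring the original OpenAI/Hugging Face port.
    config = gpt2_config.GPT2Config()
    model = gpt2_model.GPT2LMHeadModel(config)
    assert sum(p.numel() for p in model.parameters()) > 0
```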