Investigate using Hugging Face Trainer as primary model trainer
When I first built aitextgen a year ago, training with base Transformers was a pain, which is why I used pytorch-lightning instead. However, Hugging Face has since put a lot more work into improving Trainer, and the most recent iteration appears to have rough feature parity with the subset of pytorch-lightning that aitextgen uses.
Since the native integration has a number of helpful features (including better TPU and DeepSpeed ZeRO support), I will look into using the native Trainer. This will not remove pytorch-lightning support for the moment.
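
For reference, a minimal sketch of what a native-Trainer causal LM training loop could look like. The toy corpus, model name, and hyperparameters here are placeholders for illustration, not aitextgen's actual integration:

```python
import torch
from transformers import (
    DataCollatorForLanguageModeling,
    GPT2LMHeadModel,
    GPT2TokenizerFast,
    Trainer,
    TrainingArguments,
)

# Placeholder corpus; in practice this would be aitextgen's tokenized dataset.
texts = ["Hello world.", "Training GPT-2 with the native Trainer."]

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default


class TextDataset(torch.utils.data.Dataset):
    """Map-style dataset of pre-tokenized examples for Trainer."""

    def __init__(self, encodings):
        self.encodings = encodings

    def __len__(self):
        return len(self.encodings)

    def __getitem__(self, idx):
        return self.encodings[idx]


train_dataset = TextDataset([tokenizer(t, truncation=True, max_length=64) for t in texts])

model = GPT2LMHeadModel.from_pretrained("gpt2")

# mlm=False => causal language modeling; the collator pads batches and builds labels.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="trained_model",
    num_train_epochs=1,
    per_device_train_batch_size=1,
    logging_steps=1,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    data_collator=collator,
)

trainer.train()
```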