nanoGPT
Finetuning - Out of Memory Error
I tried to run finetuning with:
$ python train.py config/finetune_shakespeare.py
It seems like finetuning requires a lot of memory. Is there any way to lower the memory requirement? The batch size is already 1.
torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 100.00 MiB (GPU 0; 23.68 GiB total capacity; 21.67 GiB already allocated; 85.06 MiB free; 21.92 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
Your GPU memory is too small for this model; one option is to switch to CPU, though it will be slow.
For testing, you can use a smaller model (e.g. gpt2-large) instead of the default.
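As a rough sketch of how to do that, assuming a standard nanoGPT checkout (config names like init_from and block_size come from config/finetune_shakespeare.py and train.py; double-check yours), you can override values from the command line:

$ python train.py config/finetune_shakespeare.py --init_from=gpt2-large --block_size=256

or edit the config file directly:

# config/finetune_shakespeare.py -- illustrative memory-saving overrides
init_from = 'gpt2-large'  # smaller pretrained checkpoint than the default gpt2-xl
block_size = 256          # shorter context window -> much smaller activations
batch_size = 1            # already at the minimum in your run

You can also try the allocator hint from the error message itself:

$ PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:128 python train.py config/finetune_shakespeare.py

but that only mitigates fragmentation, not the total footprint, so the smaller model or shorter context is the more reliable fix.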