Restormer
Typical GPU memory requirements for training?
I was trying to train Restormer and succeeded with 128x128 patches.
However, my GPU runs out of memory when training with 256x256 patches and a batch size larger than 2. My GPU is an RTX 3080 with 10GB of memory.
Do you know how much memory is needed to train on 256x256 patches with a batch size >= 8?
Hi @wonwoolee
We train our Restormer on 8 Tesla V100 GPUs, each with 32GB of memory. For a patch size of 256, we use a batch size of 2 per GPU.
mini_batch_sizes: [8,5,4,2,1,1] # Batch size per gpu
gt_sizes: [128,160,192,256,320,384] # Patch sizes for progressive training.
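The two config lists above are paired: progressive training starts with small patches and large batches, then moves to larger patches with smaller batches as training advances. A minimal sketch of how a training loop might look up the current stage is below; the iteration milestones are hypothetical placeholders, not values from the Restormer config.

```python
# Stage lists taken from the config above (batch size is per GPU).
mini_batch_sizes = [8, 5, 4, 2, 1, 1]
gt_sizes = [128, 160, 192, 256, 320, 384]
# Hypothetical cumulative iteration boundaries ending each stage
# (assumption: stages partition the 300000 total iterations).
milestones = [92000, 156000, 204000, 240000, 264000, 300000]

def stage_for_iter(it):
    """Return (patch_size, batch_size) for the given training iteration."""
    for end, gt, bs in zip(milestones, gt_sizes, mini_batch_sizes):
        if it < end:
            return gt, bs
    # Past the last milestone: stay at the final stage.
    return gt_sizes[-1], mini_batch_sizes[-1]
```

For example, `stage_for_iter(0)` gives the first stage (128px patches, batch 8), while late iterations train on 384px patches with batch 1.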
Thanks
May I know how long you trained Restormer with the 8x32GB V100 GPUs?
Hi
We train our model for 300,000 iterations. The details can be found here.
# training settings
train:
  total_iter: 300000
Thanks