
Typical GPU memory requirements for training?

Open wonwoolee opened this issue 3 years ago • 2 comments

I was trying to train Restormer, and I succeeded in running it with 128x128 patches.

However, my GPU runs out of memory when I try to train the network with 256x256 patches and a batch size larger than 2. My GPU is an RTX 3080 with 10 GB of memory.

Do you know how much memory is needed to train it with 256x256 patches and a batch size >= 8?

wonwoolee avatar Aug 26 '22 05:08 wonwoolee

Hi @wonwoolee

We train our Restormer with 8 Tesla V100 GPUs, each with 32 GB of memory. For a patch size of 256, we use a batch size of 2.

mini_batch_sizes: [8,5,4,2,1,1]             # Batch size per gpu   
gt_sizes: [128,160,192,256,320,384]  # Patch sizes for progressive training.
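The two config lines above define Restormer's progressive training: as training advances, the patch size grows while the per-GPU batch size shrinks to keep memory roughly constant. A minimal sketch of how such a schedule maps an iteration number to a (batch size, patch size) pair; the per-stage iteration counts in `iters` are an assumption for illustration, not quoted from this thread:

```python
# Sketch of progressive-training stage selection.
# `mini_batch_sizes` and `gt_sizes` come from the config above;
# the `iters` milestones are ASSUMED values for illustration only.
import bisect

mini_batch_sizes = [8, 5, 4, 2, 1, 1]       # batch size per GPU at each stage
gt_sizes = [128, 160, 192, 256, 320, 384]   # patch size at each stage
iters = [92000, 64000, 48000, 36000, 36000, 24000]  # iterations per stage (assumed)

# Cumulative boundaries: stage j is active while current_iter <= sum(iters[:j+1]).
boundaries = []
total = 0
for n in iters:
    total += n
    boundaries.append(total)

def stage_for(current_iter):
    """Return (batch_size, patch_size) for a given training iteration."""
    j = bisect.bisect_left(boundaries, current_iter)
    j = min(j, len(boundaries) - 1)
    return mini_batch_sizes[j], gt_sizes[j]
```

For example, early iterations would train with batch 8 at 128x128 patches, while the final iterations train with batch 1 at 384x384.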

Thanks

adityac8 avatar Sep 12 '22 19:09 adityac8

May I know how long you trained Restormer with the 8x 32 GB V100 GPUs?

Amazingren avatar Sep 17 '22 08:09 Amazingren

Hi

We train our model for 300,000 iterations. The details can be found here.

# training settings
train:
  total_iter: 300000
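To see why the batch size >= 8 at 256x256 asked about earlier is out of reach on a 10 GB card, a rough back-of-envelope helps: for a fully convolutional model like Restormer, activation memory grows roughly in proportion to batch_size x patch_h x patch_w. This is a coarse sketch, not a measurement:

```python
# Rough relative-activation-memory estimate (a sketch, not measured):
# activation memory scales roughly with batch_size * patch_h * patch_w.
def relative_activation_cost(batch, patch):
    return batch * patch * patch

# The authors fit batch 2 at 256x256 on a 32 GB V100; batch 8 at the
# same patch size costs roughly 4x the activations of that setting.
scale = relative_activation_cost(8, 256) / relative_activation_cost(2, 256)
print(scale)  # 4.0
```

So a setting that already needs a large slice of a 32 GB V100 at batch 2 would naively need about four times that for batch 8, well beyond 10 GB. Gradient checkpointing or mixed precision can reduce this, at some speed cost.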

Thanks

adityac8 avatar Sep 26 '22 15:09 adityac8