CSA-inpainting
Why do you say the batch size needs to be set to 1 in the training process?
When the batch size is 1, training only uses about 30% of the GPU memory on a 2080 Ti (similar on a 1080 Ti), and even when I set the batch size to 6 it only uses 5385 MB, which I don't understand.
Is there a reason for this?
Due to the limitations of my experimental environment, I set this parameter to 1 so that multiple sets of comparative experiments could run at the same time. If you want to set it larger, you can change the code details.
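For readers wondering what "change the code details" involves: in a typical PyTorch training setup the batch size is just an argument to the `DataLoader`, so raising it is usually a one-line change (plus whatever the repo's option parser exposes, e.g. a `--batchSize` flag). A minimal sketch with a stand-in dataset (the dataset and shapes here are hypothetical, not the repo's actual data pipeline):

```python
# Hypothetical sketch: batch size as a DataLoader argument.
import torch
from torch.utils.data import DataLoader, TensorDataset

# Stand-in for an image dataset: 24 fake 3x32x32 "images".
images = torch.randn(24, 3, 32, 32)
dataset = TensorDataset(images)

# batch_size=1 mirrors the author's setting; raise it here to use more memory.
loader = DataLoader(dataset, batch_size=6, shuffle=True)

(batch,) = next(iter(loader))
print(batch.shape)  # torch.Size([6, 3, 32, 32])
```

Note that if any layer in the model hard-codes a batch dimension of 1 (e.g. in a reshape), that code also has to be generalized before a larger batch will run.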
@wode-czw Did you change the batch size? How did you change the other code details?
@KumapowerLIU In my experiments, setting the batch size to 16 raises GPU memory use to 84% (9203 MiB) on an RTX 2080, but GPU utilization stays very low (under 30%, the same as with batch size 1). Could the searching and generating process on the 32x32 feature maps be the cause?
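That explanation is plausible: a patch-by-patch search over a 32x32 feature map issues ~1024 tiny operations per image, and the Python/kernel-launch overhead of those small ops dominates, so utilization stays low no matter the batch size. A hypothetical illustration (not the repo's actual code) comparing a looped similarity search against one batched matmul doing the same work:

```python
# Hypothetical illustration: per-patch search vs. one batched matmul.
import time
import torch

feat = torch.randn(1024, 64)                      # 32*32 patches, 64-dim features
feat = feat / feat.norm(dim=1, keepdim=True)      # normalize for cosine similarity

# Serial search: one small op per patch (the shape of CSA-style matching loops).
t0 = time.perf_counter()
best_loop = [int(torch.argmax(feat @ feat[i])) for i in range(feat.shape[0])]
t_loop = time.perf_counter() - t0

# Batched search: one large matmul over all patches at once.
t0 = time.perf_counter()
best_mat = torch.argmax(feat @ feat.t(), dim=1).tolist()
t_mat = time.perf_counter() - t0

assert best_loop == best_mat                      # identical results
print(f"loop: {t_loop:.4f}s  batched: {t_mat:.4f}s")
```

The looped version is typically much slower per unit of work, which on a GPU shows up exactly as low utilization; a profiler (e.g. `torch.profiler`) would confirm where the time actually goes in the real model.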
@wode-czw How did you solve the problem you described?
I think so too. A 1080 Ti has 11 GB of memory, which should be enough to train with a batch size of 4 or more. Why didn't you try that?