
About GPU memory

mc171819 opened this issue 2 years ago • 2 comments

Hi, I also wonder about the memory numbers in your Table 6. I set batch_size=1 during inference on a V100, but I measured 1800+ MB. How can I reproduce the 102 MB (or roughly 120 MB) result?

mc171819 avatar Apr 14 '22 03:04 mc171819

The memory footprint here is an average: our model can process 100 frames in parallel on an RTX 2080 Ti (11 GB total, excluding the ~800 MB initial allocation for PyTorch 1.1). Thus it can be roughly calculated as (11000 - 800) / 100 = 102 MB per frame.
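As a minimal sketch, the averaging described above is just this arithmetic (the function name is hypothetical; the figures are the ones quoted in the comment, not a fresh measurement):

```python
def per_frame_memory_mb(total_mb: float, baseline_mb: float, n_frames: int) -> float:
    """Average GPU memory attributed to each frame when n_frames run in parallel,
    after subtracting the framework's fixed baseline allocation."""
    return (total_mb - baseline_mb) / n_frames

# 11 GB card, ~800 MB PyTorch baseline, 100 frames in parallel
print(per_frame_memory_mb(11000, 800, 100))  # 102.0
```

This is why a single-frame (batch_size=1) run reports far more memory: the fixed framework overhead is not amortized across frames.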

yifanzhang713 avatar Apr 16 '22 17:04 yifanzhang713

Hello, thanks for your work. I would like to ask how you process 100 frames in parallel: does that mean batch_size can be set to 100 during inference?

Yzichen avatar Apr 26 '22 08:04 Yzichen