
memory info doesn't match

Open njuhang opened this issue 3 years ago • 3 comments

The previously displayed total used memory plus the intermediate tensor memory does not equal the subsequently displayed total used memory.

njuhang avatar Apr 16 '21 08:04 njuhang

Same here.

gongjingcs avatar Apr 26 '21 11:04 gongjingcs

(screenshots of the tracker output) I define a tensor with size [6, 12, 2048, 2048]; at fp32 it consumes 1207.9 M of memory, however line 13 shows Total Used Memory: 2511.9 Mb.
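The 1207.9 M figure for the tensor itself can be checked with plain arithmetic, no GPU required (a float32 element occupies 4 bytes; the tracker appears to report decimal megabytes):

```python
# Verify the reported tensor memory by hand: fp32 = 4 bytes per element.
shape = (6, 12, 2048, 2048)

num_elements = 1
for dim in shape:
    num_elements *= dim

size_bytes = num_elements * 4     # 1,207,959,552 bytes
size_mb = size_bytes / 1e6        # decimal megabytes

print(size_mb)                    # -> 1207.959552, i.e. the ~1207.9 M reported
```

So the tensor accounts for only about 1208 MB of the 2511.9 MB total; the gap is explained below.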

gongjingcs avatar Apr 26 '21 11:04 gongjingcs

Hello. I just answered this question in my PR: the CUDA kernels take up some space.

If you are interested, you can see the revised code here:

https://github.com/hzhwcmhf/Pytorch-Memory-Utils/blob/master/README.md#faqs

Why Total Tensor Used Memory is much smaller than Total Allocated Memory?

* Total Allocated Memory is the peak of memory usage. When you delete tensors, PyTorch does not release the space back to the device until you call gpu_tracker.clear_cache(), as our scripts do.

* The CUDA kernels take up some space. See https://github.com/pytorch/pytorch/issues/12873
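The first point can be illustrated with a toy model of a caching allocator. This is not PyTorch's real allocator, just a hypothetical sketch of the behavior: freeing a tensor returns its memory to a cache rather than to the device, so the reserved total stays at its peak until the cache is cleared (analogous to torch.cuda.empty_cache(), which gpu_tracker.clear_cache() calls):

```python
# Hypothetical toy model of a caching allocator (NOT PyTorch's actual code),
# showing why total allocated/reserved memory can exceed live tensor memory.
class CachingAllocator:
    def __init__(self):
        self.live = 0       # bytes held by live tensors
        self.reserved = 0   # bytes held from the device; never shrinks on free

    def alloc(self, nbytes):
        self.live += nbytes
        # Grow the device reservation only if the cache cannot satisfy the request.
        self.reserved = max(self.reserved, self.live)
        return nbytes

    def free(self, nbytes):
        # Freeing returns memory to the cache, not to the device.
        self.live -= nbytes

    def empty_cache(self):
        # Release cached (unused) blocks back to the device.
        self.reserved = self.live

allocator = CachingAllocator()
a = allocator.alloc(1000)
b = allocator.alloc(500)
allocator.free(a)          # live drops to 500 ...
print(allocator.reserved)  # -> 1500: reserved stays at the peak
allocator.empty_cache()
print(allocator.reserved)  # -> 500: shrinks only after clearing the cache
```

This mirrors why the tracker's totals look larger than the sum of live tensors: peak reservation plus the per-process CUDA context overhead from the second point.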

hzhwcmhf avatar Apr 30 '21 13:04 hzhwcmhf