Pytorch-Memory-Utils
The Total Used Memory stays unchanged among .py files
When I track the GPU usage, the used memory stays unchanged no matter where I place the call gpu_tracker.track(). Here is the output for one .py file:
GPU Memory Track | 03-Nov-20-21:34:31 | Total Used Memory:972.7 Mb
At flcore.servers.serveravg __init__: line 23 Total Used Memory:972.7 Mb
At flcore.servers.serveravg __init__: line 30 Total Used Memory:972.7 Mb
In other files, the output is still "972.7 Mb".
Total Used Memory reports the peak memory usage. When you delete tensors, PyTorch does not release that space back to the device; it keeps it cached until you call torch.cuda.empty_cache(), as in the example script.
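A minimal sketch of the caching behaviour described above (assumes PyTorch is installed and a CUDA device is available; the tensor size is arbitrary):

```python
import torch

if torch.cuda.is_available():
    x = torch.randn(1024, 1024, device="cuda")  # allocates ~4 MB on the GPU
    del x                                        # tensor freed inside PyTorch...
    reserved = torch.cuda.memory_reserved()      # ...but the allocator still caches
                                                 # the block, so nvidia-smi (and this
                                                 # tracker's Total Used Memory) is unchanged
    torch.cuda.empty_cache()                     # hand cached blocks back to the driver
    assert torch.cuda.memory_reserved() <= reserved
else:
    print("No CUDA device available: nothing to demonstrate.")
```

This is why deleting tensors alone never lowers the number the tracker reports: the cached (reserved) memory, not the currently allocated memory, is what the device sees as used.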