
Bug: Memory Leak with from torchdiffeq import odeint

Open wangmiaowei opened this issue 1 year ago • 2 comments

When I use autograd, I have found a memory leak. Even after calling

import gc
import torch
gc.collect()
torch.cuda.empty_cache()

the allocated memory still grows with every iteration, as reported by:

logging.info("memory_allocated(MB) {}".format(torch.cuda.memory_allocated() / 1048576))
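For reference, here is a minimal sketch of the kind of training loop where this pattern shows up (the model and loop are hypothetical, not the original reporter's code; it assumes a CUDA device and torchdiffeq's documented odeint(func, y0, t) interface):

import gc
import logging

import torch
from torchdiffeq import odeint

logging.basicConfig(level=logging.INFO)
device = torch.device("cuda")

class ODEFunc(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.net = torch.nn.Linear(2, 2)

    def forward(self, t, y):
        return self.net(y)

func = ODEFunc().to(device)
optimizer = torch.optim.Adam(func.parameters(), lr=1e-3)
y0 = torch.randn(16, 2, device=device)
t = torch.linspace(0.0, 1.0, 10, device=device)

for it in range(100):
    optimizer.zero_grad()
    ys = odeint(func, y0, t)  # autograd graph spans the whole solve
    loss = ys[-1].pow(2).mean()
    loss.backward()
    optimizer.step()

    gc.collect()
    torch.cuda.empty_cache()
    logging.info("iter {} memory_allocated(MB) {:.2f}".format(
        it, torch.cuda.memory_allocated() / 1048576))

If the reported number climbs steadily in a loop like this, a common culprit is holding references to graph-connected tensors across iterations (for example, accumulating loss rather than loss.item() in a list); gc.collect() and torch.cuda.empty_cache() cannot free tensors that are still reachable from a retained graph.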

I believe you may have seen a similar bug; could you suggest a solution?

wangmiaowei · Feb 26 '24 17:02

Do you have a minimal working example, and your PyTorch version?

rtqichen · Mar 12 '24 02:03
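As a general note (not from this thread): when the memory cost of backpropagating through odeint itself is the issue, torchdiffeq also provides an adjoint-based solver, odeint_adjoint, which computes gradients by solving an adjoint ODE backwards in time instead of storing the solver's intermediate states. A minimal sketch, assuming the same toy dynamics as above:

import torch
from torchdiffeq import odeint_adjoint as odeint  # same call signature as odeint

class ODEFunc(torch.nn.Module):
    # odeint_adjoint requires the dynamics to be an nn.Module so it can
    # collect the parameters that need gradients.
    def __init__(self):
        super().__init__()
        self.net = torch.nn.Linear(2, 2)

    def forward(self, t, y):
        return self.net(y)

func = ODEFunc()
y0 = torch.randn(16, 2)
t = torch.linspace(0.0, 1.0, 10)

ys = odeint(func, y0, t)          # forward solve; intermediate states not retained
ys[-1].pow(2).mean().backward()   # backward pass via a second (adjoint) ODE solve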