
Multi-GPU Memory Leak

Open ray8828 opened this issue 4 years ago • 2 comments

Hi, thanks for sharing your great work!

I ran into a memory leak when using multiple GPUs: the GPU memory on the first card stays constant, but the memory consumption on all of the other GPUs gradually increases until it eventually runs out of memory. I'm using odeint_adjoint with rk4. Is there any guidance on this problem? Thanks!
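As a reference for the minimal working example requested below, here is a rough sketch of the kind of setup being described, under stated assumptions: the dynamics function, the DataParallel wrapper, the step size, and the batch size are all hypothetical, while the rk4 solver, odeint_adjoint, and the large (~786,432-dimensional) state come from this thread.

```python
import torch
import torch.nn as nn
from torchdiffeq import odeint_adjoint


class ODEFunc(nn.Module):
    """Toy dynamics with a single learnable parameter (placeholder for the real model)."""
    def __init__(self):
        super().__init__()
        self.scale = nn.Parameter(torch.tensor(0.1))

    def forward(self, t, y):
        return self.scale * torch.tanh(y)


class ODEBlock(nn.Module):
    """Wraps odeint_adjoint so the whole solve is replicated by DataParallel."""
    def __init__(self, func):
        super().__init__()
        self.func = func

    def forward(self, y0):
        t = torch.tensor([0.0, 1.0], device=y0.device)
        # Fixed-step RK4 with adjoint backprop, as described in the issue.
        yT = odeint_adjoint(self.func, y0, t, method='rk4',
                            options={'step_size': 0.1})
        return yT[-1]


dim = 786_432                                  # state size mentioned in this thread
model = nn.DataParallel(ODEBlock(ODEFunc())).cuda()

for step in range(1000):
    y0 = torch.randn(8, dim, device='cuda')    # hypothetical batch size
    loss = model(y0).pow(2).mean()
    model.zero_grad(set_to_none=True)
    loss.backward()
    # Watch nvidia-smi here: per the report, memory on GPUs other than the
    # first one grows every iteration until an out-of-memory error.
```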

ray8828 · Jan 03 '21 15:01

For some reason, the state I send to the ODE is about 786,432-dimensional.

ray8828 · Jan 06 '21 12:01

If you have a minimal working example that can reproduce the memory leak, I can take a look at it. I'm not aware of any memory leaks like the one you're describing.

rtqichen · Jan 06 '21 18:01