torchdiffeq
Multi-GPU Memory Leak
Hi, thanks for sharing your great work!
I'm hitting a memory leak when running on multiple GPUs: the memory on the first card stays constant, but memory consumption on all the other GPUs gradually increases until it finally runs out of memory. I'm using odeint_adjoint with rk4. Is there any guidance for this problem? Thanks!
For context, the state I pass to the ODE is about 786,432-dimensional.
If you have a minimal working example that reproduces the memory leak, I can take a look. I'm not aware of any memory leaks like the one you're describing.