KPConv-PyTorch
Why does the CUDA memory always change?
Why does the CUDA memory usage always change? Is there a way to pin the memory size? The server I use is shared, and when other people run their programs, mine runs out of memory.
Hi @luzonghao1,
The memory changes because a lot of dimensions are variable, mainly the number of points in each batch. You can change a single line so that the GPU keeps its memory allocated even when the next batch has fewer points: just comment out this line 👍 https://github.com/HuguesTHOMAS/KPConv-PyTorch/blob/73e444d486cd6cb56122c3dd410e51c734064cfe/utils/trainer.py#L203

And I think that should do it.
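
For reference, here is a minimal sketch of the effect, assuming the line in question is a `torch.cuda.empty_cache()` call (an assumption; check the linked line in trainer.py for the actual statement). PyTorch's caching allocator keeps freed blocks reserved for reuse, so skipping `empty_cache()` means the process holds on to its peak reservation instead of returning memory to the driver between batches:

```python
import torch

device = torch.device('cuda')

# Variable-size batches, mimicking a variable number of points per batch.
for num_points in [10000, 2000, 8000]:
    points = torch.randn(num_points, 3, device=device)
    features = points.norm(dim=1, keepdim=True)  # stand-in for a forward pass
    del points, features

    # torch.cuda.empty_cache()  # commented out: keep cached blocks reserved

    print(f'batch of {num_points:>5} points: '
          f'allocated {torch.cuda.memory_allocated() / 1024**2:.1f} MiB, '
          f'reserved {torch.cuda.memory_reserved() / 1024**2:.1f} MiB')
```

With the `empty_cache()` call left in, the reserved figure would drop after the smaller batch; with it commented out, it stays at the peak, which is what keeps the reported CUDA memory stable from the outside.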