
How to release GPU memory after computing?

Open · sunshichen opened this issue 4 years ago · 3 comments

I'm using gputools to do ndarray smoothing. After processing finished, I found that the GPU was still holding hundreds of MB of memory. How can I release that memory appropriately?

Sample code:

from gputools.convolve import median_filter
import numpy as np

array = np.random.randint(0, 2, (128, 128, 128))
smoothed_array = median_filter(array, size=5)

sunshichen · Jan 12 '21 07:01

Hi, I'm having the same issue. Can you please comment on how to release memory so that I can process batches serially?

parthnatekar · Feb 18 '22 20:02

Hi,

Sorry for not having replied to this issue sooner.

from gputools.convolve import median_filter
import numpy as np

array = np.random.randint(0, 2, (128, 128, 128))
smoothed_array = median_filter(array, size=5)

I can't reproduce this. After median_filter runs, all extra memory on the GPU is properly released, and overall GPU memory consumption is back to where it was before.

> Hi, I'm having the same issue. Can you please comment on how to release memory so that I can process batches serially?

pyopencl will normally release the GPU memory associated with an intermediate array when the array goes out of scope.

E.g. when allocating an array:

from gputools import OCLArray
x = OCLArray.zeros((1024,)*3)

you should see GPU memory usage go up; then, when you delete the array

del x

it should go back down.
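
So for processing batches serially, something like the following should keep memory flat (a minimal sketch based on the snippet above; the explicit del and gc.collect() just nudge Python to reclaim the buffers promptly instead of waiting for the garbage collector):

import gc
import numpy as np
from gputools.convolve import median_filter

for i in range(10):
    # the intermediate GPU buffers created inside median_filter
    # go out of scope at the end of each iteration and are released
    batch = np.random.randint(0, 2, (128, 128, 128))
    result = median_filter(batch, size=5)
    # ... use result on the host ...
    del batch, result
    gc.collect()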

Hope that helps,

Martin

maweigert · Feb 19 '22 23:02

Hi,

Thanks for your response.

I'm doing alternating deskew and deconvolution operations; the deskew uses gputools and the deconvolution uses a TensorFlow backend.

It seems that the issue arises when gputools runs after a TensorFlow iteration, even when I manually clear the TensorFlow graph.
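
For what it's worth, TensorFlow by default pre-allocates most of the GPU's memory, which might be why gputools can't allocate afterwards. A possible mitigation (using the standard TF 2.x API) would be enabling memory growth before the first TensorFlow call:

import tensorflow as tf

# make TensorFlow allocate GPU memory on demand instead of
# reserving almost the whole device up front, leaving room
# for pyopencl/gputools allocations
for gpu in tf.config.list_physical_devices('GPU'):
    tf.config.experimental.set_memory_growth(gpu, True)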

Any thoughts?

parthnatekar · Feb 21 '22 21:02