
Empty all CLIP-related data from VRAM

Open · crasse2 opened this issue on Nov 30 '21 · 2 comments

Hi there!

Let's say I load your model onto the GPU with this code:

import clip

def clip_loader(clip_model):
    # clip.load puts the model on the default CUDA device when one is available
    perceptor, preprocess = clip.load(clip_model, jit=False)
    return perceptor.eval()

perceptor = clip_loader(clip_model)

It loads around 1 GB of data onto the CUDA device. How would I go about totally freeing that 1 GB?

I tried

import gc
import torch

del perceptor
torch.cuda.empty_cache()
gc.collect()

but it only removes around 300 MB (so roughly 700 MB still occupies GPU RAM). Is it possible to remove everything with some Python command?
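
For what it's worth, here is a minimal sketch of how one might check where the remaining memory sits, using torch.cuda's standard introspection calls (the interpretation in the comments is a general PyTorch fact, not a measurement from this report):

import torch

# Tensors PyTorch still tracks vs. blocks its caching allocator holds.
print(torch.cuda.memory_allocated())  # bytes in live tensors
print(torch.cuda.memory_reserved())   # bytes held by the caching allocator
# Neither figure includes the CUDA context itself (kernel images, cuDNN
# handles), which typically takes a few hundred MB and cannot be freed
# from Python without ending the process.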

— crasse2, Nov 30 '21

There could be other things remaining in VRAM that we can't fully control, such as optimized CUDA kernels. If you absolutely need to release the VRAM, I'd recommend running your job in a subprocess, so that it releases all VRAM allocations upon exit.
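
A minimal sketch of that subprocess approach, assuming the standard library's multiprocessing; the model name "ViT-B/32", the clip_job helper, and the text-encoding workload are placeholders for the actual job:

import multiprocessing as mp

def clip_job(clip_model, queue):
    # Everything CUDA-related happens inside this process; when it exits,
    # the driver reclaims all of its VRAM, including the CUDA context.
    import clip
    import torch

    perceptor, preprocess = clip.load(clip_model, jit=False)
    perceptor.eval()
    with torch.no_grad():
        tokens = clip.tokenize(["a diagram"]).cuda()
        features = perceptor.encode_text(tokens)
    queue.put(features.cpu())  # ship results off the GPU before exiting

if __name__ == "__main__":
    ctx = mp.get_context("spawn")  # "spawn" avoids forking an initialized CUDA context
    queue = ctx.Queue()
    proc = ctx.Process(target=clip_job, args=("ViT-B/32", queue))
    proc.start()
    features = queue.get()  # read results before joining to avoid a queue deadlock
    proc.join()  # after this, the child's VRAM is fully released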

— jongwook, Dec 03 '21

Hi! Thanks for your answer!

Yeah, I was considering a subprocess/external call. Right now I'm trying to flush everything by resetting the device with numba/cuda (though I haven't found out how to restart and repopulate the CUDA device after resetting it within a notebook).
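
For reference, a sketch of the numba-based reset being described; the caveat is exactly the one above, since the reset tears down the CUDA context PyTorch was using:

from numba import cuda

# Frees every allocation on the device, but also destroys the CUDA
# context that PyTorch (and the notebook kernel) was using; subsequent
# torch.cuda calls will fail until the process is restarted.
cuda.get_current_device().reset()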

Anyway, thanks for sharing the CLIP model, it's a truly amazing one!

— crasse2, Dec 03 '21