CLIP
Free all CLIP-related data from VRAM
Hi There !
Let's say I load your model onto the GPU with these commands:
import clip

def clip_loader(clip_model):
    # clip.load puts the model on the default CUDA device
    perceptor, preprocess = clip.load(clip_model, jit=False)
    return perceptor.eval()

perceptor = clip_loader(clip_model)
This loads around 1 GB of data onto the CUDA device. How would I go about totally freeing that 1 GB?
I tried
del perceptor
torch.cuda.empty_cache()
gc.collect()
but it only frees around 300 MB (so 700 MB are still occupying the GPU RAM). Is it possible to remove everything with some Python command?
There could be other things remaining in VRAM that we can't fully control, such as the CUDA context and cached compiled kernels. If you absolutely need to release the VRAM, I'd recommend running your job in a subprocess, so that all VRAM allocations are released when it exits.
Hi! Thanks for your answer!
Yeah, I was considering a subprocess/external call. Right now I'm trying to flush everything by resetting the device with numba/cuda (I haven't found out how to restart and repopulate the CUDA device after resetting it within a notebook).
Anyway, thanks for sharing the CLIP model, it's a truly amazing one!