
Model is not being offloaded from VRAM

Open nishithshowri006 opened this issue 1 year ago • 3 comments

I am trying to run the model in a Jupyter notebook.

[screenshot]

  1. In the iteration above, I haven't initialized the model yet.

[screenshot]

  2. Now I run the cell: the model is loaded and about 6 GB of VRAM is shown as occupied.

[screenshot]

  3. When I run the cell again, VRAM usage doubles.

  4. On subsequent runs the model does not occupy more than 12 GB, but here is the interesting thing I have observed: when I run this inside a loop (say I want to create an index for each file I have, and I don't see any other way to do it), the model gives me VRAM issues. How do I remove the model from VRAM? I tried `torch.cuda.empty_cache()` and tried deleting the variable, but neither works for me. Can you please help, or is there something I am doing wrong?
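As a side note on the pattern being tried above: in Python, `torch.cuda.empty_cache()` only releases cached blocks whose tensors have already been garbage-collected, so VRAM stays occupied while *any* reference to the model survives (a loop variable, an alias, or Jupyter's `Out[]` output cache are common culprits). A minimal sketch of this reference-counting behavior, using a dummy object in place of the real model (`DummyModel` is hypothetical; with a real CUDA model you would follow the final `gc.collect()` with `torch.cuda.empty_cache()`):

```python
import gc
import weakref

class DummyModel:
    """Stand-in for a large model object held in (V)RAM."""
    pass

model = DummyModel()
probe = weakref.ref(model)  # lets us observe whether the object is still alive

# A second reference (like Jupyter's Out[] cache or a loop variable)
# keeps the object alive even after `del model`.
alias = model
del model
gc.collect()
assert probe() is not None  # still alive via `alias`

# Only after *every* reference is dropped can the memory be reclaimed.
del alias
gc.collect()
assert probe() is None  # object is gone
# For a CUDA model, call torch.cuda.empty_cache() here so PyTorch's
# caching allocator returns the now-free blocks to the driver.
```

This suggests that deleting the variable "not working" usually means some other reference is still pinning the model, rather than the deletion itself failing.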

nishithshowri006 avatar Oct 08 '24 12:10 nishithshowri006

Could you provide your notebook as a Colab notebook so I can more easily reproduce the exact issue? Thank you!

bclavie avatar Nov 11 '24 07:11 bclavie

Hey, here is the colab notebook. This is just a basic observation I had; you might have more understanding of it than me. I added comments in the notebook on what I observed.

nishithshowri006 avatar Nov 11 '24 08:11 nishithshowri006

I believe I can help with this issue. Could you assign it to me?

DebopamParam avatar Nov 14 '24 05:11 DebopamParam