bert-sklearn
GPU memory leak?
I use bert-sklearn in a benchmark scenario, so I repeatedly construct and use BertClassifiers, like this:
```python
m1 = BertClassifier(bert_model="biobert-base-cased")
m1.fit(..)
m1.predict(..)
m1.save(..)
....
m2 = BertClassifier()
m2.fit(..)
m2.predict(..)
m2.save(..)
```
Doing so fails when using the second classifier with an "out of GPU memory" error. Executing the code with only one model at a time works.
So I suppose there is a GPU memory leak somewhere. Or do I need to do something special to free memory?
Hi there,
As it stands in your snippet, `m1` is still holding onto GPU memory. So what I would try is either:
- do a `del m1` after the `m1.save(..)` to release the GPU memory, or
- if for some reason you absolutely need to keep `m1` around, push the BERT model back onto the CPU with `m1.model.to("cpu")` after the `m1.save(..)`
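To illustrate, here is a minimal sketch of a cleanup helper combining both suggestions; `release` is a hypothetical name (not part of bert-sklearn), and the assumption that the underlying torch module lives at `model.model` matches the `m1.model.to("cpu")` call above:

```python
import gc

def release(model):
    """Free GPU memory held by a fitted model between benchmark runs (sketch)."""
    try:
        import torch
    except ImportError:              # CPU-only environment: just drop references
        del model
        gc.collect()
        return
    if hasattr(model, "model"):      # underlying torch module (assumed attribute)
        model.model.to("cpu")        # move the weights off the GPU first
    del model
    gc.collect()                     # drop lingering Python references
    if torch.cuda.is_available():
        torch.cuda.empty_cache()     # hand cached blocks back to the driver
```

In the benchmark loop the pattern would be `m1.save(..)` followed by `release(m1)` before constructing `m2`. Note that `torch.cuda.empty_cache()` only returns PyTorch's cached blocks to the driver; memory is not reclaimable while a live Python object still references the model.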
Also, I don't know if you are running in a Jupyter notebook or a script, but I have noticed memory utilization to be better and more predictable in a script. Not sure if that applies to you or not.