
GPU memory leak ?

Open behrica opened this issue 5 years ago • 2 comments

I use bert-sklearn in a benchmark scenario, so I repeatedly construct and use BertClassifiers, like this:

```python
m1 = BertClassifier(bert_model="biobert-base-cased")
m1.fit(..)
m1.predict(..)
m1.save(..)

....

m2 = BertClassifier()
m2.fit(..)
m2.predict(..)
m2.save(..)
```

Doing so fails when using the second classifier, with an "out of GPU memory" error. Executing the code with only one model at a time works.

So I suppose there is a GPU memory leak somewhere. Or do I need to do something special to free memory?

behrica avatar Nov 07 '19 08:11 behrica

Hi there,

As it stands in your snippet, `m1` is still holding on to GPU memory when you construct `m2`. So I would try one of the following:

  1. Do a `del m1` after the `m1.save(..)` to release its GPU memory, or
  2. If for some reason you absolutely need `m1` around, push the BERT model back onto the CPU with `m1.model.to("cpu")` after the `m1.save(..)`.
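The cleanup described in option 1 might look like this in a benchmarking loop. This is only a sketch: the `BertClassifier` below is a minimal stand-in so the example runs without bert-sklearn or a GPU, but the `del` + `gc.collect()` pattern applies unchanged to the real class (with a real model, `torch.cuda.empty_cache()` can additionally release blocks held by PyTorch's caching allocator):

```python
import gc

class BertClassifier:
    """Stand-in mimicking the parts of bert-sklearn used in the snippet above.
    The real class lives in the bert_sklearn package."""
    def __init__(self, bert_model="bert-base-uncased"):
        self.bert_model = bert_model
        self.model = self                 # real class keeps a torch module here

    def fit(self, X, y):
        return self

    def predict(self, X):
        return [0] * len(X)

    def save(self, path):
        pass

    def to(self, device):                 # mimics torch.nn.Module.to()
        return self

predictions = []
for name in ["biobert-base-cased", "bert-base-uncased"]:
    clf = BertClassifier(bert_model=name)
    clf.fit(["some text"], [0])
    predictions.append(clf.predict(["some text"]))
    clf.save("model.bin")
    del clf        # drop the last reference so the model can be collected
    gc.collect()   # once the tensors are gone, their GPU memory is returned
                   # to PyTorch's caching allocator; torch.cuda.empty_cache()
                   # would then release the cached blocks back to the driver
```

This way only one model is resident at a time, which matches the observation that running a single model works fine.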

charles9n avatar Nov 09 '19 03:11 charles9n

Also, I don't know if you are running in a Jupyter notebook or a script, but I have noticed GPU memory utilization to be better and more predictable in a script (notebooks tend to keep old objects alive in their output history). Not sure if that applies to you or not.

charles9n avatar Nov 09 '19 03:11 charles9n