gpt-2-simple

Bug when loading 1558M for finetuning

Open · Rowlando13 opened this issue 4 years ago · 1 comment

This is my first issue, so sorry for any faux pas. 1558M fails to load for finetuning. I have access to a P100 and 30 GB of RAM. I don't get any OOM warnings or see my RAM usage jump. If I wait, the code eventually becomes unresponsive. It hangs before line 256, `print('Loading checkpoint', ckpt)`, because that line is never printed.
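For reference, a minimal sketch of my setup (assuming the standard gpt_2_simple API; the dataset path and step count are placeholders):

```python
# Minimal sketch of the setup that hangs; "my_corpus.txt" is a placeholder.
import gpt_2_simple as gpt2

model_name = "1558M"
gpt2.download_gpt2(model_name=model_name)  # ~6 GB download

sess = gpt2.start_tf_sess()
# Becomes unresponsive during finetune, before "Loading checkpoint" is printed.
gpt2.finetune(sess,
              dataset="my_corpus.txt",
              model_name=model_name,
              steps=1000)
```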

Rowlando13 · Aug 02 '20 23:08

I have isolated the bug to line 244: `sess.run(tf.compat.v1.global_variables_initializer())`
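For context, that line runs TensorFlow's global variable initializer over the full 1.5B-parameter graph. A rough standalone sketch of the step that stalls (graph construction for the model itself omitted):

```python
import tensorflow as tf

# By line 244 the 1558M graph has already been built in the default graph.
sess = tf.compat.v1.Session()
# Allocates and initializes every model variable; this is the call that hangs.
sess.run(tf.compat.v1.global_variables_initializer())
```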

Rowlando13 · Aug 03 '20 19:08