
Generating a small number of tokens takes just as long as generating a whole bunch

Open · flarn2006 opened this issue 4 years ago · 1 comment

I'm experimenting with a model I fine-tuned using the "Train a GPT-2 Text-Generating Model w/ GPU For Free" notebook. I was hoping to use it much like Write With Transformer works, quickly generating small sets of short completions and hand-picking from them. But I'm running into a problem: even if I set the `length=` parameter to something really low like 15, generation still appears to take just as long as with the default of 1023. Is there something else I need to do?
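
Roughly, the call I'm making looks like this (the run name and prefix below are just placeholders for my actual values):

```python
import gpt_2_simple as gpt2

sess = gpt2.start_tf_sess()
gpt2.load_gpt2(sess, run_name='run1')  # placeholder: the checkpoint of my fine-tuned run

# Even with length=15, this seems to take about as long as the 1023-token default.
texts = gpt2.generate(sess,
                      run_name='run1',
                      length=15,                 # I only want short completions
                      nsamples=5,
                      batch_size=5,
                      prefix="Once upon a time",  # placeholder prompt
                      return_as_list=True)
```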

flarn2006 · May 17 '20 15:05

Did you make any progress? I've started down a similar path of investigation/benchmarking, in an attempt to speed up generation at smaller lengths.
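
For reference, this is roughly how I'm timing the different lengths: nothing fancy, just wall-clock time around `generate()` (run name is a placeholder for my own checkpoint):

```python
import time
import gpt_2_simple as gpt2

sess = gpt2.start_tf_sess()
gpt2.load_gpt2(sess, run_name='run1')  # placeholder run name

# Compare generation time across a range of requested lengths.
for length in (15, 100, 500, 1023):
    start = time.time()
    gpt2.generate(sess, run_name='run1', length=length,
                  nsamples=1, return_as_list=True)
    print(f"length={length}: {time.time() - start:.1f}s")
```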

greenbeauty · Feb 17 '21 15:02