
avoid expensive initialization

Open mimoo opened this issue 1 year ago • 4 comments

Hello,

I'm using the following:

import { encode, isWithinTokenLimit } from 'gpt-tokenizer/model/text-davinci-003';

which seems to slow down initialization enough that I can't deploy to Cloudflare Workers with this library. Is there a way to lazily initialize things?

mimoo · Jun 20 '23
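
One way to defer that cost is to load the module with a dynamic `import()` the first time it is actually needed, so the tokenizer's setup runs on the first request rather than at worker startup. The sketch below is an assumption-laden illustration, not gpt-tokenizer's own API: the `getTokenizer` and `countTokens` helpers and the promise-caching pattern are invented for this example, and whether a dynamic import of this subpath works in a Cloudflare Worker depends on your bundler configuration.

```ts
// Cache the in-flight module load so initialization happens at most once.
// (Hypothetical helper; the module path is the one from the issue above.)
let tokenizerPromise: Promise<
  typeof import('gpt-tokenizer/model/text-davinci-003')
> | null = null;

function getTokenizer() {
  if (tokenizerPromise === null) {
    // Dynamic import: the expensive setup is deferred until first call.
    tokenizerPromise = import('gpt-tokenizer/model/text-davinci-003');
  }
  return tokenizerPromise;
}

// Example use: the first call pays the initialization cost; later calls
// reuse the cached module.
async function countTokens(text: string): Promise<number> {
  const { encode } = await getTokenizer();
  return encode(text).length;
}
```

The design choice here is to cache the promise rather than the resolved module, so concurrent first requests share a single load instead of each triggering initialization.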