gpt-tokenizer
Huge memory consumption of isWithinTokenLimit
I am seeing a ~200MB increase in memory usage after adding gpt-tokenizer. The only function I use from this library is isWithinTokenLimit. Here is an image of my memory consumption before and after deployment.

Here is how I am using it:
```ts
import { isWithinTokenLimit } from 'gpt-tokenizer'
import type { ChatCompletionRequestMessage } from 'openai'

function getRequestTokenCount(req: ChatCompletionRequestMessage[]) {
  // Fixed per-message overhead added by the chat prompt format
  const extraTokensDueToPromptForEachMessage = 7
  return req.reduce((acc, curr) => {
    // With an Infinity limit, isWithinTokenLimit returns the token count
    // (or false, in which case we fall back to a large sentinel)
    const tokensInText = isWithinTokenLimit(curr.content, Infinity) || 99999
    return acc + tokensInText + extraTokensDueToPromptForEachMessage
  }, 0)
}
```
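For reference, this is roughly how the function gets called (the message array below is a made-up example, not from my actual workload):

```ts
// Hypothetical input, just to show the call shape
const messages: ChatCompletionRequestMessage[] = [
  { role: 'system', content: 'You are a helpful assistant.' },
  { role: 'user', content: 'Summarize this document for me.' },
]

// Total = sum of per-message content tokens + 7 overhead per message
console.log(getRequestTokenCount(messages))
```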