gpt-tokenizer

Huge memory consumption of isWithinTokenLimit

Open · aminsol opened this issue 8 months ago • 0 comments

I am experiencing a 200 MB memory increase after adding gpt-tokenizer; the only function I am using from this library is isWithinTokenLimit. Below is an image of my memory consumption before and after deployment, and here is how I am using it:

// isWithinTokenLimit(text, limit) returns the token count when the text is within
// the limit, and false otherwise; Infinity is passed so it always returns the count,
// with 99999 as a fallback for falsy results (e.g. empty content).
import { isWithinTokenLimit } from 'gpt-tokenizer'

function getRequestTokenCount(req: ChatCompletionRequestMessage[]) {
  const extraTokensDueToPromptForEachMessage = 7
  return req.reduce((acc, curr) => {
    const tokensInText = isWithinTokenLimit(curr.content, Infinity) || 99999
    return acc + tokensInText + extraTokensDueToPromptForEachMessage
  }, 0)
}
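
For comparison, a minimal sketch of counting tokens with the library's encode function instead of calling isWithinTokenLimit with an Infinity limit; the message shape and the per-message overhead constant are assumptions carried over from the snippet above, not part of the original report:

// Sketch: count tokens per message directly with encode() from gpt-tokenizer.
// Assumes each message has a string `content` field, as in the snippet above.
import { encode } from 'gpt-tokenizer'

const EXTRA_TOKENS_PER_MESSAGE = 7 // same per-message prompt overhead as above

function countRequestTokens(messages: { content: string }[]): number {
  return messages.reduce(
    (total, message) => total + encode(message.content).length + EXTRA_TOKENS_PER_MESSAGE,
    0,
  )
}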

[screenshot: memory consumption before and after deployment]

aminsol · Oct 10 '23 04:10