
Fix prompt length calculation

Open alxmiron opened this pull request 1 year ago • 4 comments

The current `_buildMessages()` calculates the prompt token count incorrectly; there is a small difference from the usage amount reported by ChatGPT (in non-stream mode). This PR fixes it, picking up the logic from here. Now our estimated `numTokens` should equal `message.detail.usage.prompt_tokens`.

alxmiron avatar Apr 21 '23 17:04 alxmiron
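
For reference, here is a minimal sketch of the cookbook-style counting the PR description refers to, assuming `js-tiktoken` as the tokenizer; the chatgpt package's actual tokenizer, message shape, and per-model constants may differ:

```ts
// Sketch of OpenAI-cookbook-style prompt token counting (assumption: js-tiktoken,
// cl100k_base encoding; not the chatgpt package's actual implementation).
import { getEncoding } from "js-tiktoken";

interface ChatMessage {
  role: string;
  content: string;
  name?: string;
}

function estimatePromptTokens(messages: ChatMessage[]): number {
  const enc = getEncoding("cl100k_base");

  // Per-message overhead from OpenAI's reference counter; these values are
  // model-dependent (see the next comment and snippet below).
  const tokensPerMessage = 3;
  const tokensPerName = 1;

  let numTokens = 0;
  for (const message of messages) {
    numTokens += tokensPerMessage;
    numTokens += enc.encode(message.role).length;
    numTokens += enc.encode(message.content).length;
    if (message.name) {
      numTokens += enc.encode(message.name).length + tokensPerName;
    }
  }
  // Every reply is primed with <|start|>assistant<|message|>.
  numTokens += 3;
  return numTokens;
}
```
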

It would be better to check the model first and then calculate; gpt-3.5 and gpt-4 seem to count tokens differently.

zhujunsan avatar Apr 22 '23 00:04 zhujunsan
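
For what it's worth, OpenAI's published reference counter makes the per-message overhead model-dependent. A hypothetical helper along those lines (mirroring the cookbook constants, not the chatgpt package's code) might look like:

```ts
// Hypothetical helper: per-model overhead values from OpenAI's cookbook example.
function overheadForModel(model: string): {
  tokensPerMessage: number;
  tokensPerName: number;
} {
  if (model === "gpt-3.5-turbo-0301") {
    // Every message follows <|start|>{role/name}\n{content}<|end|>\n,
    // and a supplied `name` replaces the role rather than adding to it.
    return { tokensPerMessage: 4, tokensPerName: -1 };
  }
  // Later gpt-3.5-turbo snapshots and gpt-4 models.
  return { tokensPerMessage: 3, tokensPerName: 1 };
}
```
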

I also made this change in #546, but your code looks better 😀

zhujunsan avatar Apr 22 '23 00:04 zhujunsan

After testing, at least with gpt-3.5, tokens_per_message is 5. I think there is one missing \n that isn't being counted.

zhujunsan avatar May 02 '23 03:05 zhujunsan
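
One way to check a claim like this is to compare the estimate against `usage.prompt_tokens` from a non-streaming request. A sketch using the official `openai` package (assumption: `estimatePromptTokens` is the hypothetical helper from the earlier snippet):

```ts
// Sketch: compare an estimated prompt token count against the value the API
// reports for a non-streaming chat completion.
import OpenAI from "openai";

async function checkEstimate(
  messages: { role: "system" | "user" | "assistant"; content: string }[]
) {
  const client = new OpenAI(); // reads OPENAI_API_KEY from the environment
  const res = await client.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages,
  });
  const actual = res.usage?.prompt_tokens;
  const estimated = estimatePromptTokens(messages);
  console.log({ estimated, actual, diff: (actual ?? 0) - estimated });
}
```
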

Is there anyone who can review or merge this PR?

zhujunsan avatar Oct 30 '23 07:10 zhujunsan

This project is undergoing a major revamp; closing out old PRs as part of the prep process.

Sorry I never got around to reviewing this PR. The chatgpt package is pretty outdated at this point. I recommend that you use the openai package or the openai-fetch package.

transitive-bullshit avatar Jun 07 '24 05:06 transitive-bullshit