OpenAI Function Tokens Estimator
Estimate OpenAI token usage for chat completions, including functions, with this Python utility!
This package is based on hmarr's openai-chat-tokens. As of September 2023, OpenAI provides no official documentation on how to accurately predict the number of tokens consumed by functions. This package fills that gap: use it to get a precise estimate of the token count for chat completions, including functions, and better manage your OpenAI API usage.
Most often the estimate is correct down to the token.
Installation
- Install the package via pip:
  pip install openai-function-tokens
- Import the estimation function:
  from openai_function_tokens import estimate_tokens
Usage
To use the estimator, call the estimate_tokens function:
estimate_tokens(messages, functions=None, function_call=None)
Pass in the messages, and optionally functions and function_call, to receive a precise token count.
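A minimal sketch of a typical call is shown below. The get_current_weather function definition is the well-known example from OpenAI's function-calling guide, used here purely for illustration; only the estimate_tokens signature comes from this package.

```python
from openai_function_tokens import estimate_tokens

# Messages in the standard OpenAI chat-completion format
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What's the weather like in Boston?"},
]

# Function definitions in the OpenAI function-calling format
# (parameters described with JSON Schema)
functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA",
                },
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
        },
    }
]

# function_call is optional; omit it or pass a value such as "auto"
token_count = estimate_tokens(messages, functions=functions)
print(token_count)  # estimated prompt tokens for this request
```

To sanity-check the estimate, compare it with the prompt_tokens value reported in the usage field of an actual chat-completion response.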
Acknowledgments
Credit to hmarr for the original TypeScript tool. For a better understanding of token counting logic, check out his blog post.
Further Reading
How to call functions with chat models
How to use functions with a knowledge base
Counting tokens (only messages)
Contributing
Feedback, suggestions, and contributions are highly appreciated. Help make this tool even better!