openai-cookbook
[SUPPORT] How to count tokens when I use gpt-4o
When I use a Chinese prompt to call GPT-4o, the actual number of tokens used by the model is lower than with GPT-4 Turbo, which is expected. But how can I calculate the actual number of tokens consumed when calling GPT-4o?
I'd also like to know. Thanks!
ChatGPT's answer:
import tiktoken

# Select the encoding corresponding to the model.
# gpt-4o uses the o200k_base encoding, which is different from the one used by
# gpt-4 / gpt-4-turbo, so passing "gpt-4o" here matters for Chinese text.
# (Requires a recent tiktoken release that knows about gpt-4o.)
encoding = tiktoken.encoding_for_model("gpt-4o")

# Your Chinese prompt
prompt = "你好,世界!"

# Tokenize the prompt
tokens = encoding.encode(prompt)

# Count the number of tokens
num_tokens = len(tokens)
print(f"Number of tokens: {num_tokens}")