
[Feature Request]: Log LLM Token Usage Information

Open kakao-june-kim opened this issue 7 months ago • 0 comments

Do you need to file an issue?

  • [x] I have searched the existing issues and this feature is not already filed.
  • [x] My model is hosted on OpenAI or Azure. If not, please look at the "model providers" issue and don't file a new one here.
  • [x] I believe this is a legitimate feature request, not just a question. If this is a question, please use the Discussions area.

Is your feature request related to a problem? Please describe.

In previous versions, LLM API calls logged `input_tokens` and `output_tokens`, making it possible to estimate API usage. In the current version this information is missing, which makes tracking API usage difficult.

Reference to previous code:

  • https://github.com/microsoft/graphrag/blob/v0.5.0/graphrag/llm/base/rate_limiting_llm.py#L200

Describe the solution you'd like

According to OpenAI's Chat Completion API documentation, the `usage` field in API responses reports token consumption. Logging this information would greatly simplify usage tracking.

Additionally, aggregating this data into statistical usage reports would be highly beneficial.

  • OpenAI Chat Completion API reference: https://platform.openai.com/docs/api-reference/chat/create
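A minimal sketch of what such logging could look like. The `usage` object with `prompt_tokens`, `completion_tokens`, and `total_tokens` is part of OpenAI's documented response schema; the helper name `log_token_usage`, the logger name, and the returned stats dict are illustrative assumptions, not graphrag's actual API.

```python
import logging
from types import SimpleNamespace

logger = logging.getLogger("token_usage")


def log_token_usage(response):
    """Log token counts from a Chat Completions response.

    Reads the `usage` field documented in OpenAI's API reference.
    The helper name and the stats dict shape are illustrative only.
    """
    usage = getattr(response, "usage", None)
    if usage is None:
        return None
    stats = {
        "input_tokens": usage.prompt_tokens,
        "output_tokens": usage.completion_tokens,
        "total_tokens": usage.total_tokens,
    }
    logger.info("llm usage: %s", stats)
    return stats


# Stub response mimicking the OpenAI schema, for illustration:
fake = SimpleNamespace(
    usage=SimpleNamespace(prompt_tokens=120, completion_tokens=35, total_tokens=155)
)
print(log_token_usage(fake))
```

Aggregating the returned dicts across calls would then yield the statistical usage reports described above.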

Additional context

No response

kakao-june-kim · Apr 29 '25 05:04