
[Bug]: token_counting.py fails with 0.10.57

Open ttozser opened this issue 7 months ago • 2 comments

Bug Description

I receive this exception:

File "/usr/local/lib/python3.11/site-packages/llama_index/core/callbacks/token_counting.py", line 56, in get_llm_token_counts
usage = response.raw.get("usage", None)
^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/pydantic/main.py", line 828, in __getattr__
raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')
AttributeError: 'ChatCompletion' object has no attribute 'get'

This change is the cause: https://github.com/run-llama/llama_index/commit/1e361b5c23e1a3048c31287220729fa811b8af4f#diff-494476f823ae151033c01737238aacb8bc381a5e54e2491d81678501a499b2e4L90

The type of the `raw` field in `ChatResponse` was changed from `dict` to `Any`.

Here, the `raw` value is the response itself, which is a `ChatCompletion`: https://github.com/run-llama/llama_index/blob/v0.10.57/llama-index-integrations/llms/llama-index-llms-openai/llama_index/llms/openai/base.py#L417

Pydantic's `BaseModel` used to convert this `ChatCompletion` to a dict, but after this change it remains a `ChatCompletion`.

The `get_llm_token_counts` method expects a dict here: https://github.com/run-llama/llama_index/blob/v0.10.57/llama-index-core/llama_index/core/callbacks/token_counting.py#L56
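A minimal sketch of a defensive fix for that call site: accept either a plain dict (the old behaviour) or an object that exposes `usage` as an attribute (such as OpenAI's `ChatCompletion` Pydantic model). The helper name `get_usage` is mine, not from the library; the stand-in class below merely mimics attribute access so the example is self-contained.

```python
from typing import Any, Optional


def get_usage(raw: Any) -> Optional[Any]:
    """Return the usage info from a raw LLM response.

    Handles both the old dict form and the new object form
    (e.g. a Pydantic ChatCompletion), so `.get` is never
    called on something that doesn't have it.
    """
    if isinstance(raw, dict):
        # old behaviour: raw was a plain dict
        return raw.get("usage", None)
    # new behaviour: raw may be a model object with a `usage` attribute
    return getattr(raw, "usage", None)


class FakeChatCompletion:
    """Stand-in for openai.types.chat.ChatCompletion (attribute access only)."""
    usage = {"total_tokens": 5}
```

With this shape, `usage = response.raw.get("usage", None)` in `token_counting.py` could become `usage = get_usage(response.raw)` and both response types would work.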

Version

0.10.57

Steps to Reproduce

Run a query with token counting enabled.

Relevant Logs/Tracebacks

File "/usr/local/lib/python3.11/site-packages/llama_index/core/callbacks/token_counting.py", line 56, in get_llm_token_counts
usage = response.raw.get("usage", None)
^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/pydantic/main.py", line 828, in __getattr__
raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')
AttributeError: 'ChatCompletion' object has no attribute 'get'

ttozser avatar Jul 23 '24 15:07 ttozser