
AttributeError: 'ChatCompletionChunk' object has no attribute 'get'

Open kowlcode opened this issue 1 year ago • 2 comments

Describe the bug

I am encountering an AttributeError while integrating LlamaIndex with Chainlit. I am trying to stream the response generated by the Chat Engine (GPT-4 and GPT-4o) while using the LlamaIndexCallbackHandler, but when the response message is finished, I get the following error:

AttributeError: 'ChatCompletionChunk' object has no attribute 'get'
2024-07-26 16:30:30 - 'ChatCompletionChunk' object has no attribute 'get'
[...]
  File "...\llama_index\core\callbacks\base.py", line 131, in on_event_end
    handler.on_event_end(event_type, payload, event_id=event_id, **kwargs)
  File "...\chainlit\llama_index\callbacks.py", line 163, in on_event_end
    model = raw_response.get("model", None) if raw_response else None
            ^^^^^^^^^^^^^^^^
[...]

Versions:
- LlamaIndex: 0.10.58
- Chainlit: 1.1.400
- LLMs: GPT-4 and GPT-4o over AzureOpenAI

Workaround

In chainlit/llama_index/callbacks.py, inside LlamaIndexCallbackHandler:

I imported:

from openai.types.chat.chat_completion_chunk import ChatCompletionChunk

and added under on_event_end:

if raw_response:
    if isinstance(raw_response, ChatCompletionChunk):
        model = raw_response.model
    else:
        model = raw_response.get("model", None)
else:
    model = None

instead of this:

model = raw_response.get("model", None) if raw_response else None
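A more general variant of the same idea is to duck-type instead of checking for one concrete class: ChatCompletionChunk is a Pydantic model, so it exposes `model` as an attribute rather than via `.get()`. The sketch below is illustrative, not the actual Chainlit patch; the helper name `extract_model` is made up for this example, and it avoids importing OpenAI types entirely by falling back to `getattr`.

```python
def extract_model(raw_response):
    """Return the "model" field from either a dict-like payload or an
    attribute-based object (e.g. a Pydantic ChatCompletionChunk)."""
    if raw_response is None:
        return None
    if isinstance(raw_response, dict):
        # Plain dict payloads support .get()
        return raw_response.get("model", None)
    # Pydantic/object payloads expose fields as attributes, not .get()
    return getattr(raw_response, "model", None)
```

With this helper, the failing line could read `model = extract_model(raw_response)`, and it would also cover any future payload types that carry a `model` attribute.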

kowlcode avatar Jul 30 '24 14:07 kowlcode