Bo Bao

12 comments of Bo Bao

@sajadghawami do we have any updates on this MR?

Do we have an ETA for enabling this functionality in LangChain?

Tested on 0.0.71 with Azure OpenAI and stream = true: the conversation result is never returned (it hangs on the await). Once streaming is set to false, or when using OpenAI directly, it works.

Tested on 0.0.72, still the same. Using ChatOpenAI with gpt-4-32k.

I think it shouldn't depend on the model; 3.5 should also produce the same result.

```
const chatStreaming = new ChatOpenAI({
  temperature: 0,
  streaming: true,
  azureOpenAIApiKey: azureOpenAIConfig.openApiKey,
  azureOpenAIApiInstanceName: azureOpenAIConfig.accountName,
  azureOpenAIApiDeploymentName: azureOpenAIConfig.deploymentName,
  azureOpenAIApiVersion: '2023-03-15-preview',
});
const response = await chatStreaming.call([new HumanChatMessage(message)], undefined, [
  {
    handleLLMNewToken(token: string)...
```
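For reference, since the snippet above was cut off, here is roughly what the complete call looks like. The callback body, message content, and config values are filled in by me as an illustration, and the import paths follow the langchain 0.0.7x layout, so they may differ in other versions.

```ts
import { ChatOpenAI } from "langchain/chat_models/openai";
import { HumanChatMessage } from "langchain/schema";

// Placeholder config values; substitute your own Azure resource details.
const azureOpenAIConfig = {
  openApiKey: process.env.AZURE_OPENAI_API_KEY ?? "",
  accountName: "my-azure-instance",
  deploymentName: "gpt-4-32k",
};

const chatStreaming = new ChatOpenAI({
  temperature: 0,
  streaming: true,
  azureOpenAIApiKey: azureOpenAIConfig.openApiKey,
  azureOpenAIApiInstanceName: azureOpenAIConfig.accountName,
  azureOpenAIApiDeploymentName: azureOpenAIConfig.deploymentName,
  azureOpenAIApiVersion: "2023-03-15-preview",
});

const message = "Hello";

const response = await chatStreaming.call([new HumanChatMessage(message)], undefined, [
  {
    // Assumed callback body: print each streamed token as it arrives.
    handleLLMNewToken(token: string) {
      process.stdout.write(token);
    },
  },
]);

// With Azure streaming on 0.0.71/0.0.72 this line is never reached -- the await above hangs.
console.log(response.text);
```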

Found the reason. The last message in the Azure stream is:

```json
{"id":"chatcmpl-7DxBV9AgR8GuvqLyrsmRtDav2M9zJ","object":"chat.completion.chunk","created":1683560125,"model":"gpt-4-32k","choices":[{"index":0,"finish_reason":"stop","delta":{}}],"usage":null}
```

It doesn't use the `[DONE]` stop token.
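That would explain the hang: if the stream reader only treats a literal `[DONE]` data line as end-of-stream, Azure's final chunk above never terminates it and the await never resolves. A minimal sketch of the idea (a hypothetical line-based SSE handler, not LangChain's actual parser) would also end the stream when `finish_reason` is `"stop"`:

```ts
// Hypothetical SSE line handler -- a sketch of the workaround idea, not LangChain's internal code.
function handleSseLine(line: string, done: () => void, onDelta: (text: string) => void) {
  if (!line.startsWith("data:")) return;
  const payload = line.slice("data:".length).trim();

  // OpenAI terminates the stream with a literal sentinel line.
  if (payload === "[DONE]") {
    done();
    return;
  }

  const chunk = JSON.parse(payload);
  const choice = chunk.choices?.[0];

  // Azure's last chunk (quoted above) has finish_reason "stop" and an empty delta,
  // so treat that as end-of-stream too instead of waiting forever for "[DONE]".
  if (choice?.finish_reason === "stop") {
    done();
    return;
  }

  if (choice?.delta?.content) onDelta(choice.delta.content);
}
```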

I tested with my local axios implementation; Azure OpenAI does return the "[DONE]" message.
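For reference, a bare-bones version of that kind of local check with axios might look like the sketch below; the instance name, deployment name, and environment variable are placeholders, and the URL follows the standard Azure OpenAI chat-completions endpoint format.

```ts
import axios from "axios";

// Placeholders: substitute your own instance, deployment, and key.
const instance = "my-azure-instance";
const deployment = "gpt-4-32k";
const apiKey = process.env.AZURE_OPENAI_API_KEY ?? "";

const url =
  `https://${instance}.openai.azure.com/openai/deployments/${deployment}` +
  `/chat/completions?api-version=2023-03-15-preview`;

const res = await axios.post(
  url,
  {
    stream: true,
    messages: [{ role: "user", content: "Hello" }],
  },
  {
    headers: { "api-key": apiKey, "Content-Type": "application/json" },
    responseType: "stream",
  }
);

// Print the raw SSE lines so the final "data: [DONE]" (or its absence) is visible.
res.data.on("data", (buf: Buffer) => {
  for (const line of buf.toString("utf8").split("\n")) {
    if (line.trim()) console.log(line);
  }
});
```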