Azure OpenAI o3-mini expects a streamed response, but stream is set to false in the request.
Self Checks
- [x] This is only for bug reports; if you would like to ask a question, please head to Discussions.
- [x] I have searched for existing issues, including closed ones.
- [x] I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
- [x] [FOR CHINESE USERS] Please be sure to submit the issue in English; otherwise it will be closed. Thank you! :)
- [x] Please do not modify this template :) and fill in all the required fields.
Dify version
1.0.1
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
I installed the Azure OpenAI plugin 0.0.8. When using the o3-mini model, `stream` is always set to false in the HTTP request, but the plugin expects a streamed response.
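For reference, the mismatch can be illustrated with the `openai` 1.x SDK directly. This is only a sketch, not the plugin's code; the endpoint, key, API version, and deployment name below are placeholders:

```python
# Hedged illustration of the mismatch described above, not the Dify plugin's actual code.
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key="<azure-openai-key>",                         # placeholder
    api_version="2024-12-01-preview",                     # assumed API version
    azure_endpoint="https://<resource>.openai.azure.com", # placeholder endpoint
)

# A streaming call yields chunks whose delta.content is consumed piece by piece.
stream = client.chat.completions.create(
    model="o3-mini",                                      # deployment name, placeholder
    messages=[{"role": "user", "content": "ping"}],
    stream=True,                                          # what a streaming consumer expects
)
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")

# If the request is instead sent with stream=False (which is what the plugin appears to
# do for o3-mini), the API returns a single ChatCompletion object, and code that consumes
# the result as a stream of deltas breaks, as shown in the traceback below.
```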
2025-03-12 14:57:47.584 ERROR [Thread-411 (_generate_worker)] [app_generator.py:257] - Unknown Error when generating
Traceback (most recent call last):
  File "/app/api/core/app/apps/agent_chat/app_generator.py", line 237, in _generate_worker
    runner.run(
  File "/app/api/core/app/apps/agent_chat/app_runner.py", line 245, in run
    self._handle_invoke_result(
  File "/app/api/core/app/apps/base_app_runner.py", line 271, in _handle_invoke_result
    self._handle_invoke_result_stream(invoke_result=invoke_result, queue_manager=queue_manager, agent=agent)
  File "/app/api/core/app/apps/base_app_runner.py", line 312, in _handle_invoke_result_stream
    text += result.delta.message.content
TypeError: can only concatenate str (not "list") to str
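The failing line is `text += result.delta.message.content`, where `content` arrives as a list of content parts rather than a plain string. Below is a minimal sketch of a defensive normalization; the function name and part shape are assumptions for illustration, not Dify internals:

```python
# Hypothetical sketch, not the actual Dify code: normalize a content field that may be
# either a plain string or a list of OpenAI-style content parts before concatenation.
def normalize_content(content) -> str:
    if isinstance(content, str):
        return content
    if isinstance(content, list):
        # Content parts are typically dicts like {"type": "text", "text": "..."}.
        return "".join(
            str(part.get("text", "")) if isinstance(part, dict) else str(part)
            for part in content
        )
    return str(content or "")


# Example: a delta whose content is a list of parts, as the traceback suggests o3-mini returns.
delta_content = [{"type": "text", "text": "Hello"}, {"type": "text", "text": " world"}]
text = ""
text += normalize_content(delta_content)  # "Hello world" instead of the TypeError above
```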
✔️ Expected Behavior
No response
❌ Actual Behavior
No response