The suggested-questions interface returns a 400 error with some large models
Self Checks
- [X] This is only for bug report, if you would like to ask a question, please head to Discussions.
- [X] I have searched for existing issues, including closed ones.
- [X] I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
- [X] Please do not modify this template :) and fill in all the required fields.
Dify version
0.6.9
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
When the global default model is set to GLM-4, the suggested-questions interface returns a 400 error on the model call, because the GLM-4 API does not accept a temperature of 0.0.
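The rejection can be reproduced outside Dify (a minimal sketch, assuming the official `zhipuai` v2 SDK; the exact exception type raised on a 400 may vary by SDK version):

```python
from zhipuai import ZhipuAI

client = ZhipuAI(api_key="YOUR_API_KEY")  # placeholder key

try:
    client.chat.completions.create(
        model="glm-4",
        messages=[{"role": "user", "content": "Hello"}],
        temperature=0.0,  # GLM-4 rejects 0.0; it requires 0 < temperature < 1
    )
except Exception as exc:
    # The server refuses the request with an HTTP 400 error.
    print(f"Request rejected: {exc}")
```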
The problematic code is in dify/api/core/llm_generator/llm_generator.py at line 83:
```python
response = model_instance.invoke_llm(
    prompt_messages=prompt_messages,
    model_parameters={
        "max_tokens": 256,
        "temperature": 0
    },
    stream=False
)
```
It is recommended to set the default temperature to 0.1, which stays within GLM-4's accepted range while keeping the output near-deterministic. A sketch of the change follows.
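A minimal sketch of the suggested fix at line 83 (only the temperature value differs from the current code):

```python
response = model_instance.invoke_llm(
    prompt_messages=prompt_messages,
    model_parameters={
        "max_tokens": 256,
        # 0.1 keeps generation near-deterministic while staying
        # inside the range GLM-4 accepts
        "temperature": 0.1
    },
    stream=False
)
```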
✔️ Expected Behavior
The suggested questions are returned correctly.
❌ Actual Behavior
The interface catches the exception raised by the large-model invocation and returns an empty suggestion list, so the 400 error is never surfaced.
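A schematic illustration of that masking pattern (not the exact Dify code; `parse_suggested_questions` is a hypothetical helper):

```python
try:
    response = model_instance.invoke_llm(
        prompt_messages=prompt_messages,
        model_parameters={"max_tokens": 256, "temperature": 0},
        stream=False,
    )
    questions = parse_suggested_questions(response)  # hypothetical helper
except Exception:
    # The 400 from GLM-4 is caught here, so the caller only ever
    # sees an empty suggestion list instead of the underlying error.
    questions = []
```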