FIX: fix the maximum temperature value of Ollama models
Description
Set the maximum temperature value to 1 in Ollama models.
Fixes # (issue)
Type of Change
Please delete options that are not relevant.
- [ ] Bug fix (non-breaking change which fixes an issue)
- [ ] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
- [ ] This change requires a documentation update, included: Dify Document
- [ ] Improvement, including but not limited to code refactoring, performance optimization, and UI/UX improvement
- [ ] Dependency upgrade
How Has This Been Tested?
Please describe the tests that you ran to verify your changes. Provide instructions so we can reproduce. Please also list any relevant details for your test configuration.
- [ ] TODO
Suggested Checklist:
- [ ] I have performed a self-review of my own code
- [ ] I have commented my code, particularly in hard-to-understand areas
- [ ] My changes generate no new warnings
- [ ] I ran `dev/reformat` (backend) and `cd web && npx lint-staged` (frontend) to appease the lint gods
- [ ] `optional` I have made corresponding changes to the documentation
- [ ] `optional` I have added tests that prove my fix is effective or that my feature works
- [ ] `optional` New and existing unit tests pass locally with my changes
Where did you get the range of values from? As far as I know, the temperature parameter for most models ranges from 0 to 2.
I have seen the temperature of most large language models range between 0 and 1, but after your feedback I did some more research. I found some useful discussions and links, though there is a lot of confusion.
openai community discussion, ollama issue, ollama issue. If we consider a temperature range of 0 to 2 for Ollama models, then we would also need to change the OpenAI models. Based on these Ollama issues and discussions, I found that Ollama uses the range 0 to 2 and divides the temperature by 2 before processing the prompt. So this PR is not worth it, but we do need to adjust the OpenAI model parameter values and increase the maximum temperature to 2.
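For illustration, the scaling described above can be sketched as a simple mapping between a client-facing range and a model's native range. This is a minimal sketch under stated assumptions: the function name is hypothetical, and the idea that a frontend exposes a 0-1 slider that maps onto Ollama's documented 0-2 range is an assumption, not code from Ollama or Dify.

```python
def scale_temperature(ui_value: float, ui_max: float = 1.0, model_max: float = 2.0) -> float:
    """Map a temperature from a client-facing range onto a model's native range.

    Hypothetical helper: e.g. a UI slider capped at 1.0 mapped onto a 0-2 range,
    which is equivalent to the model halving a 0-2 input before use.
    """
    if not 0.0 <= ui_value <= ui_max:
        raise ValueError(f"temperature {ui_value} outside [0, {ui_max}]")
    return ui_value * (model_max / ui_max)
```

Under this assumption, a UI value of 0.5 would reach the model as 1.0, which is why exposing the raw 0-2 range directly can change behavior for existing users.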
You're right, that's why we set the temperature range in our OpenAI Compatible API provider to 0-2, while the OpenAI provider sets it to 0-1. We follow the official OpenAI provider's range; other providers compatible with the OpenAI API use different ranges for temperature, so we adjusted the compatible provider to 0-2.
I think we need to set the temperature range to 0-2 in the official OpenAI provider as well.
From the OpenAI documentation (docs of openai): "What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic."
Oops, my bad, OpenAI updated the range of temperature values and I didn't notice. Would you mind helping me adjust the range of temperature values for OpenAI?
Sure
Hello @takatost,
Sorry for the late response. I have adjusted the OpenAI max temperature to 2, but when I tried asking some questions it generated some weird responses.
When I decrease the temperature below one, it answers what I want.
To adjust the temperature I followed these simple steps:
```yaml
parameter_rules:
  - name: temperature
    use_template: temperature
    max: 2
```
Also, for testing I updated the version of the OpenAI SDK, but it did not work as expected. Do you have any idea?
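For context, a rule like the YAML above would typically be enforced by clamping user input to the declared bounds before a request is sent. The sketch below is hypothetical (the function name and dict shape are illustrative, not Dify's actual parameter-rule code), assuming rules carry `min` and `max` fields:

```python
def clamp_parameter(value: float, rule: dict) -> float:
    """Clamp a sampling parameter to the [min, max] bounds of a parameter rule.

    Hypothetical helper mirroring a parameter_rules entry such as
    {"name": "temperature", "min": 0.0, "max": 2.0}.
    """
    lo = rule.get("min", 0.0)
    hi = rule.get("max", 1.0)
    return max(lo, min(hi, value))

temperature_rule = {"name": "temperature", "min": 0.0, "max": 2.0}
```

With `max: 2` in the rule, a value of 2.5 would be clamped to 2.0 rather than rejected, so raising the cap immediately exposes the full 0-2 range to users.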
Yeah, we tested it too. Once the temperature goes over 1, all sorts of garbled characters start showing up. Just as expected. 😅
Also, what issues might occur when updating to the latest version of the OpenAI SDK?
The same issue occurs after updating the OpenAI SDK as well.

