
FIX: fix the temperature value of the Ollama model

Open Yash-1511 opened this issue 1 year ago • 8 comments

Description

Set the maximum temperature value to 1 in Ollama models.

Fixes # (issue)

Type of Change

Please delete options that are not relevant.

  • [ ] Bug fix (non-breaking change which fixes an issue)
  • [ ] New feature (non-breaking change which adds functionality)
  • [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • [ ] This change requires a documentation update, included: Dify Document
  • [ ] Improvement, including but not limited to code refactoring, performance optimization, and UI/UX improvement
  • [ ] Dependency upgrade

How Has This Been Tested?

Please describe the tests that you ran to verify your changes. Provide instructions so we can reproduce. Please also list any relevant details for your test configuration.

  • [ ] TODO

Suggested Checklist:

  • [ ] I have performed a self-review of my own code
  • [ ] I have commented my code, particularly in hard-to-understand areas
  • [ ] My changes generate no new warnings
  • [ ] I ran dev/reformat (backend) and cd web && npx lint-staged (frontend) to appease the lint gods
  • [ ] (optional) I have made corresponding changes to the documentation
  • [ ] (optional) I have added tests that prove my fix is effective or that my feature works
  • [ ] (optional) New and existing unit tests pass locally with my changes

Yash-1511 avatar Apr 30 '24 12:04 Yash-1511

Where did you get the range of values from? As far as I know, the temperature parameter for most models ranges from 0 to 2.

takatost avatar May 01 '24 16:05 takatost

I have seen that the temperature values of most large language models range between 0 and 1, but after your feedback I did some more research. I found some useful discussions and links, but there is a lot of confusion.

openai community discussion ollama issue ollama issue If we set temperature 0 to 2 for the Ollama model, then we would need to change it for the OpenAI models too. Based on these Ollama issues and discussions, I found that Ollama uses the range 0 to 2 and should divide the temperature by 2 before processing the prompt, so this PR is not worth it; but we do need to adjust the OpenAI model parameter values and increase the temperature range to 2.
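The "divide by 2" behavior discussed above is just a linear rescaling between two temperature ranges. As a hypothetical sketch (the helper name and the 0-1 UI range are my assumptions for illustration, not Dify's or Ollama's actual code), mapping a 0-1 slider value onto a 0-2 provider range could look like:

```python
def scale_temperature(ui_value: float, ui_max: float = 1.0,
                      provider_max: float = 2.0) -> float:
    """Map a UI-facing temperature in [0, ui_max] onto a provider
    range [0, provider_max].

    Hypothetical helper: illustrates rescaling between the 0-1 range
    many UIs expose and the 0-2 range Ollama reportedly accepts.
    """
    if not 0.0 <= ui_value <= ui_max:
        raise ValueError(f"temperature {ui_value} is outside 0..{ui_max}")
    return ui_value * (provider_max / ui_max)
```

With these defaults, a slider value of 0.5 maps to 1.0 on the provider side, and 1.0 maps to 2.0; the "divide by 2" in the Ollama issues is this same mapping read in the opposite direction.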

Yash-1511 avatar May 01 '24 19:05 Yash-1511

You're right, that's why we set the temperature range in our OpenAI-Compatible API provider to 0-2, while the OpenAI provider sets it to 0-1. We follow the official OpenAI provider's range; other providers compatible with the OpenAI API use different temperature ranges, so for those we adjusted it to 0-2.

takatost avatar May 02 '24 07:05 takatost

I think we need to set the temperature range to 0-2 in the official OpenAI provider as well.

docs of openai This is from the OpenAI documentation: "What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic."

Yash-1511 avatar May 02 '24 08:05 Yash-1511

Oops, my bad, OpenAI updated the range of temperature values, I didn't notice that. Would you mind helping me adjust the range of temperature values for OpenAI?

takatost avatar May 04 '24 05:05 takatost

Sure

Yash-1511 avatar May 04 '24 14:05 Yash-1511

Hello @takatost ,

Sorry for the late response. I have adjusted the OpenAI max temperature to 2, but when I tried asking some questions it generated some weird responses. (screenshot)

When I decrease the temperature below 1, it answers what I want. (screenshot)

To adjust the temperature, I followed these simple steps:

parameter_rules:
  - name: temperature
    use_template: temperature
    max: 2

Also, for testing I updated the version of the openai package, but it did not work as expected. Do you have any idea?
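A parameter rule like the YAML above is ultimately a constraint applied to user input. As a hypothetical sketch (the rule dict and the apply_rule helper are illustrative assumptions, not Dify's actual implementation; the min and default values are assumed), enforcing such a rule could look like:

```python
from typing import Optional

# Illustrative rule mirroring the YAML fragment above (min/default assumed).
TEMPERATURE_RULE = {"name": "temperature", "min": 0.0, "max": 2.0, "default": 1.0}

def apply_rule(rule: dict, value: Optional[float]) -> float:
    """Return the rule's default when the value is unset; otherwise
    reject values outside the declared [min, max] range."""
    if value is None:
        return rule["default"]
    if not rule["min"] <= value <= rule["max"]:
        raise ValueError(
            f"{rule['name']}={value} outside [{rule['min']}, {rule['max']}]"
        )
    return value
```

Under this sketch, raising max from 1 to 2 in the YAML is enough to let values like 1.5 through without any other code changes.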

Yash-1511 avatar May 11 '24 13:05 Yash-1511

Yeah, we tested it too. Once the temperature goes over 1, all sorts of garbled characters start showing up. Just as expected. 😅

And what issues occurred when updating to the latest version of the OpenAI SDK?

takatost avatar May 14 '24 03:05 takatost

Same issue after updating the OpenAI SDK as well.

Yash-1511 avatar May 16 '24 06:05 Yash-1511