NextChat
[Feature] Top_P
- Why are we not using 2 decimal places instead of 1? I mean, like Top_p 0.95.
- Sorry, I'm not good at TypeScript.
<ListItem
  title={Locale.Settings.TopP.Title}
  subTitle={Locale.Settings.TopP.SubTitle}
>
  {/* Two-decimal precision: a step of 0.01 allows values such as 0.95 */}
  <InputRange
    value={(props.modelConfig.top_p ?? 1).toFixed(2)}
    min="0"
    max="1"
    step="0.01"
    onChange={(e) => {
      props.updateConfig(
        (config) =>
          (config.top_p = ModalConfigValidator.top_p(
            e.currentTarget.valueAsNumber,
          )),
      );
    }}
  ></InputRange>
</ListItem>
Because then the next question would be: why use 2 decimal places instead of 1?
A value of 0.95 strikes a good balance between efficiency and effectiveness on coding tasks, making it suitable for smaller models like Qwen 1.5 7B.
It's due to OpenAI's default settings, and I'm not certain whether it supports being adjusted to 2 decimal places, such as Top_p 0.95.
Ref: https://platform.openai.com/docs/api-reference/chat/create#chat-create-top_p
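For what it's worth, one way to check is to send a request with `top_p: 0.95` directly to the Chat Completions endpoint and see whether it is accepted. A minimal sketch (not part of NextChat); the model name, the `checkTopP` helper, and the `OPENAI_API_KEY` environment variable are placeholders:

```ts
// Minimal sketch: verify the Chat Completions API accepts a two-decimal top_p.
// Assumes OPENAI_API_KEY is set; the model name is just an example.
async function checkTopP(): Promise<void> {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo",
      messages: [{ role: "user", content: "Say hi" }],
      top_p: 0.95, // two decimal places
    }),
  });
  console.log(res.status, await res.json());
}

checkTopP();
```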
@H0llyW00dzZ: Sorry for the delay. In my local tests, both 0.9 and 0.95 (which are floating-point values) worked fine. It seems that Chatbox already uses them,
- as a reference: https://github.com/Bin-Huang/chatbox.