
temp not zero during inference

ShengYun-Peng opened this issue on May 10, 2024 · 2 comments

Thanks for your great work! The paper says that temperature and top_p were set to 0 during inference, but the code here sets the temperature to 1. Perhaps top_p = 0 already gives greedy decoding? https://github.com/LLM-Tuning-Safety/LLMs-Finetuning-Safety/blob/8a3b38f11be1c3829e2b0ed379d3661ebc84e7db/llama2/safety_evaluation/question_inference.py#L47

ShengYun-Peng · May 10, 2024

Hi, thanks for pointing this out. I believe you are right: with top_p = 0, decoding is already greedy.

Unispac · May 15, 2024
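
For reference, here is a minimal sketch (not the repo's inference code; the logits are made up) of why top_p = 0 collapses nucleus sampling to greedy decoding regardless of temperature: Hugging Face's `TopPLogitsWarper` always keeps at least one token, so with top_p = 0 only the argmax token survives the filter and sampling becomes deterministic.

```python
# Illustrative sketch only: with top_p = 0, nucleus sampling keeps just the
# single highest-probability token, so the sampled token is the argmax
# no matter what temperature is used.
import torch
from transformers import TopPLogitsWarper

logits = torch.tensor([[2.0, 1.0, 0.5, -1.0]])  # made-up next-token logits
input_ids = torch.tensor([[0]])                 # dummy context (unused by the warper)
warper = TopPLogitsWarper(top_p=0.0)            # keeps at least one (the top) token

for temperature in (0.7, 1.0, 1.5):
    scaled = logits / temperature
    filtered = warper(input_ids, scaled)        # everything except the argmax -> -inf
    probs = torch.softmax(filtered, dim=-1)
    token = torch.multinomial(probs, num_samples=1)
    print(f"temperature={temperature}: sampled token {token.item()}")  # always token 0
```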

Thanks!

ShengYun-Peng · May 15, 2024