LLaVA
[Usage] Deterministic mode is not set in the eval_model() function
Describe the issue
Issue: The internal parameters for deterministic mode are not set in the eval_model() function. In the example code for eval_model(), the temperature parameter is set to 0:
from llava.mm_utils import get_model_name_from_path
from llava.eval.run_llava import eval_model

model_path = "liuhaotian/llava-v1.5-7b"
prompt = "What are the things I should be cautious about when I visit here?"
image_file = "https://llava-vl.github.io/static/images/view.jpg"

args = type('Args', (), {
    "model_path": model_path,
    "model_base": None,
    "model_name": get_model_name_from_path(model_path),
    "query": prompt,
    "conv_mode": None,
    "image_file": image_file,
    "sep": ",",
    "temperature": 0,
    "top_p": None,
    "num_beams": 1,
    "max_new_tokens": 512,
})()

eval_model(args)
I think if we set this parameter to 0, we should also explicitly set these additional flags (note that torch.use_deterministic_algorithms is a function and must be called, not assigned):

torch.use_deterministic_algorithms(True)
torch.backends.cudnn.deterministic = True
torch.backends.cudnn.benchmark = False
Hence, we should add the following conditional logic to the eval_model() function:
if args.temperature == 0:
    torch.use_deterministic_algorithms(True)
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False
else:
    torch.use_deterministic_algorithms(False)
    torch.backends.cudnn.deterministic = False
    torch.backends.cudnn.benchmark = True
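The temperature-to-flags mapping above can also be factored into a small pure helper, which keeps the policy testable without a GPU. This is only a sketch, not part of LLaVA; deterministic_settings is a hypothetical name:

```python
def deterministic_settings(temperature: float) -> dict:
    """Map a sampling temperature to the proposed torch/cuDNN flags.

    temperature == 0 means greedy decoding, so determinism is enforced;
    any other temperature keeps cuDNN autotuning (benchmark) enabled.
    """
    deterministic = (temperature == 0)
    return {
        "use_deterministic_algorithms": deterministic,
        "cudnn_deterministic": deterministic,
        "cudnn_benchmark": not deterministic,
    }
```

Inside eval_model(), the returned values would then be applied with torch.use_deterministic_algorithms(flag) and assignments to torch.backends.cudnn.deterministic / torch.backends.cudnn.benchmark.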