lionday
When training WizardCoder, running according to the training instructions results in an error. (My device is 4×A100.) The log shows: Allowing ninja to set a default number of workers... (overridable by setting the environment...
"I don't want to use 8-bit training. I hope to use fp16 training. After commenting out these two lines, there was an error. How should I modify it? In addition,...
Is there any specific configuration method?

```python
model = AutoModelForCausalLM.from_pretrained(
    checkpoint,
    trust_remote_code=True,
    torch_dtype=torch.float16,
)
```
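For reference, here is a minimal sketch of what an fp16 setup without 8-bit quantization might look like, assuming a standard `transformers` `Trainer` pipeline; the `checkpoint` path, output directory, and hyperparameter values below are placeholders, not the repo's actual training configuration:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments

# Hypothetical checkpoint path; substitute the actual WizardCoder weights.
checkpoint = "path/to/wizardcoder-checkpoint"

# Load the model without load_in_8bit=True and without calling
# prepare_model_for_int8_training. Note: the weights are left in the
# default fp32 here, because the HF Trainer's fp16 mode keeps fp32
# master weights and casts to fp16 during training.
model = AutoModelForCausalLM.from_pretrained(
    checkpoint,
    trust_remote_code=True,
)
tokenizer = AutoTokenizer.from_pretrained(checkpoint, trust_remote_code=True)

# fp16=True enables mixed-precision training, which is usually what
# "fp16 training" means in practice; the other values are illustrative.
training_args = TrainingArguments(
    output_dir="./output",          # hypothetical output directory
    per_device_train_batch_size=1,  # 4 GPUs -> effective batch of 4
    gradient_accumulation_steps=8,
    fp16=True,
    learning_rate=2e-5,
    num_train_epochs=3,
)
```

One caveat worth noting: combining `torch_dtype=torch.float16` at load time with `fp16=True` in `TrainingArguments` can trigger an "Attempting to unscale FP16 gradients" error in some `transformers` versions, since the gradient scaler expects fp32 parameters; loading in the default dtype and letting `fp16=True` handle the casting is the more common pattern.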