finetuned-qlora-falcon7b-medical
How to enable fp16 instead of bf16 when training?
It looks like only newer GPU architectures support bf16, and mine is a bit old. I can change bf16 to fp16 for training, but the model itself looks like a bf16 model. Is that OK?
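A minimal sketch of how this could be handled (my own suggestion, not from the original training script): detect at runtime whether the GPU supports bf16 and fall back to fp16 otherwise. bf16 requires Ampere or newer (compute capability >= 8.0). The variable names here are illustrative assumptions.

```python
import torch

# bf16 needs an Ampere-or-newer GPU; older cards fall back to fp16.
# Short-circuit so is_bf16_supported() is never called without CUDA.
has_cuda = torch.cuda.is_available()
use_bf16 = has_cuda and torch.cuda.is_bf16_supported()
use_fp16 = has_cuda and not use_bf16

# These flags can then be forwarded to transformers.TrainingArguments,
# e.g. TrainingArguments(output_dir="outputs", **precision_kwargs, ...)
precision_kwargs = {"bf16": use_bf16, "fp16": use_fp16}
```

As for the weights: a checkpoint stored in bf16 can generally be loaded and cast, e.g. `from_pretrained(..., torch_dtype=torch.float16)`. There may be small numeric differences from the cast, but for QLoRA fine-tuning this is usually workable.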
Actually, I failed to run the training process due to the error below:
requests.exceptions.HTTPError: 403 Client Error: Forbidden for url: https://huggingface.co/api/repos/create
The above exception was the direct cause of the following exception:
So it tried to create a new repo on the Hugging Face Hub?
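A 403 on `https://huggingface.co/api/repos/create` usually means the Trainer tried to create and push to a Hub repo without a valid write token. A minimal sketch of two possible fixes (assuming the script uses `transformers.TrainingArguments`; the kwarg names below are the standard ones, but the rest is illustrative):

```python
# Option 1: log in first with a write-scope token, e.g. from a shell:
#   huggingface-cli login
#
# Option 2: disable pushing entirely so no repo creation is attempted.
training_args_kwargs = {
    "output_dir": "outputs",   # illustrative path
    "push_to_hub": False,      # skip creating/uploading a Hub repo
}
# TrainingArguments(**training_args_kwargs) would then train locally only.
```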