
How to enable fp16 instead of bf16 when training?

Open nobody4t opened this issue 1 year ago • 0 comments

Newer GPU architectures seem to support bf16, but mine is a bit older, so I can switch from bf16 to fp16 for training. However, the model itself appears to be a bf16 model. Is that OK?
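For context, a minimal sketch of how this choice is usually made (assuming PyTorch-style compute-capability numbers and Trainer-style `fp16`/`bf16` flags; `choose_precision` is a hypothetical helper, not part of this repo): bf16 needs Ampere or newer (compute capability >= 8.0), while fp16 works on older cards.

```python
# Hypothetical helper: pick mutually exclusive mixed-precision flags
# based on the GPU's CUDA compute capability (major, minor).
def choose_precision(major: int, minor: int = 0) -> dict:
    """Return Trainer-style mixed-precision flags for the given GPU."""
    if major >= 8:  # Ampere (A100, RTX 30xx) and newer support bf16 natively
        return {"bf16": True, "fp16": False}
    # Older GPUs (e.g. Turing, Volta): fall back to fp16 mixed precision
    return {"bf16": False, "fp16": True}

# Example: a Turing card (compute capability 7.5) gets fp16
flags = choose_precision(7, 5)
print(flags)  # {'bf16': False, 'fp16': True}
```

On a real setup you could query the capability with `torch.cuda.get_device_capability()` (or call `torch.cuda.is_bf16_supported()` directly) and pass the resulting flags into the training arguments.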

Actually, I failed to run the training process due to the error below:

requests.exceptions.HTTPError: 403 Client Error: Forbidden for url: https://huggingface.co/api/repos/create

The above exception was the direct cause of the following exception:

Did it try to create a new repo on Hugging Face?
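For anyone hitting the same thing: a 403 from `https://huggingface.co/api/repos/create` typically means the Trainer tried to push to the Hub without a valid write token. A hedged sketch of the two usual workarounds (`hub_kwargs` is a hypothetical helper illustrating Trainer-style arguments, not code from this repo):

```python
# Hypothetical helper: build the Hub-related training arguments so the
# 403 on /api/repos/create cannot happen. Either disable pushing, or
# authenticate first (e.g. `huggingface-cli login`, or set the HF_TOKEN
# environment variable to a token with *write* access).
def hub_kwargs(push_to_hub: bool) -> dict:
    """Return Trainer-style kwargs controlling Hub uploads."""
    if not push_to_hub:
        # No repo is ever created on the Hub, so no token is needed.
        return {"push_to_hub": False}
    # Pushing enabled: this assumes you are already logged in with a
    # write token, otherwise repo creation will fail with 403 Forbidden.
    return {"push_to_hub": True}

print(hub_kwargs(push_to_hub=False))  # {'push_to_hub': False}
```

If you only want local training, passing `push_to_hub=False` in the training arguments is the simplest fix.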

nobody4t avatar Aug 08 '23 03:08 nobody4t