
Is BFloat16 model training supported on Arc?

Open · XinyuYe-Intel opened this issue 11 months ago · 8 comments

Describe the issue

Hi, I tried to fine-tune Llama2 on Arc with the BFloat16 data type and the AdamW optimizer via `transformers.Trainer`, but I hit `RuntimeError: parameter in optimizer(Adamw) is not FP32, need check`, raised from `intel_extension_for_pytorch/optim/_functional.py", line 1256, in adamw_step`. So I'd like to confirm: is BFloat16 model training supported on Arc now? If not, could you please add this feature?

PS: I'm using the dev/QLLM branch, commit id d600370d44882ae8e15949452fe1d9f324cf6900.
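For context on why the optimizer insists on FP32 parameters: bfloat16 has only 7 mantissa bits, so a typical small AdamW update underflows when the weight itself is stored in bf16, which is why mixed-precision schemes keep an FP32 master copy of the weights. Below is a purely illustrative, dependency-free sketch (not IPEX code) that emulates bf16 rounding with stdlib `struct` to show the effect:

```python
import struct

def to_bf16(x: float) -> float:
    """Emulate bfloat16 by keeping only the top 16 bits of a
    float32 encoding (round-to-nearest-even on the dropped bits)."""
    bits = struct.unpack("<I", struct.pack("<f", x))[0]
    bits = (bits + 0x7FFF + ((bits >> 16) & 1)) & 0xFFFFFFFF  # round
    bits &= 0xFFFF0000                                        # truncate
    return struct.unpack("<f", struct.pack("<I", bits))[0]

# A typical fine-tuning step: weight w minus a small update (lr * grad term).
w = 1.0
update = 1e-4  # far below bf16's ~2^-8 spacing around 1.0

# FP32 master weight: the update is applied and preserved.
fp32_result = w - update                     # 0.9999

# bf16-stored weight: the update is rounded away, training stalls.
bf16_result = to_bf16(to_bf16(w) - update)   # 1.0

print(fp32_result)  # 0.9999
print(bf16_result)  # 1.0
```

This is the master-weight rationale behind the check in `adamw_step`; in IPEX the usual recipe is to let `ipex.optimize(model, optimizer=optimizer, dtype=torch.bfloat16)` manage the low-precision cast so the optimizer still sees FP32 parameters (hedged: consult the IPEX docs for the exact supported path on Arc).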

XinyuYe-Intel · Feb 28 '24 02:02