
Why the use of FP16 instead of BF16 precision?

Open TheTinyTeddy opened this issue 1 year ago • 2 comments

Hi, thank you for the great work!

I was wondering why the precision used for CogVideoX is FP16, whereas other T2V models such as Open-Sora and Open-Sora-Plan use BF16.

Also, I noticed a commented-out line in pipeline_cogvideox.py, # pipe = CogVideoXPipeline.from_pretrained("THUDM/CogVideoX-2b", torch_dtype=torch.bfloat16).to("cuda"), which uses a different precision from the one in the example (torch.float16). Both calls are shown below for comparison.
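For reference, the two loading calls side by side (copied from the example and the comment; only the torch_dtype differs):

```python
import torch
from diffusers import CogVideoXPipeline

# fp16, as used in the example code
pipe = CogVideoXPipeline.from_pretrained("THUDM/CogVideoX-2b", torch_dtype=torch.float16).to("cuda")

# bf16, as in the commented-out line
# pipe = CogVideoXPipeline.from_pretrained("THUDM/CogVideoX-2b", torch_dtype=torch.bfloat16).to("cuda")
```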

TheTinyTeddy avatar Aug 07 '24 07:08 TheTinyTeddy

We used bf16 during training, but since some GPUs do not support bf16 well, we fine-tuned the open-source version to fp16.
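A minimal sketch (not from the repo) of picking the dtype at runtime based on hardware support, falling back to the released checkpoint's fp16 precision:

```python
import torch
from diffusers import CogVideoXPipeline

# Use bf16 where the GPU supports it (e.g. Ampere and newer);
# otherwise fall back to fp16.
dtype = torch.bfloat16 if torch.cuda.is_bf16_supported() else torch.float16
pipe = CogVideoXPipeline.from_pretrained("THUDM/CogVideoX-2b", torch_dtype=dtype).to("cuda")
```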

yzy-thu avatar Aug 07 '24 08:08 yzy-thu

Thank you for the swift reply!

I was wondering if there is any ongoing plan for releasing a bf16 version?

TheTinyTeddy avatar Aug 07 '24 08:08 TheTinyTeddy

Pro will use BF16

zRzRzRzRzRzRzR avatar Aug 15 '24 09:08 zRzRzRzRzRzRzR

> Pro will use BF16

How should the configuration in the code be modified to fine-tune in bf16?

lith0613 avatar Aug 27 '24 07:08 lith0613
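The thread leaves this question unanswered. As a generic starting point, here is a hypothetical sketch (a placeholder model and optimizer, not the repo's actual fine-tuning config) of the standard PyTorch pattern for training in bf16 with autocast:

```python
import torch
import torch.nn as nn

# Hypothetical sketch: wrap the forward pass in autocast with bf16.
# The model, optimizer, and data here are placeholders for illustration.
model = nn.Linear(16, 1).to("cuda")
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
batch = torch.randn(8, 16, device="cuda")
target = torch.randn(8, 1, device="cuda")

optimizer.zero_grad()
with torch.autocast(device_type="cuda", dtype=torch.bfloat16):
    loss = nn.functional.mse_loss(model(batch), target)
# Unlike fp16, bf16 does not need a GradScaler: it keeps fp32's exponent range.
loss.backward()
optimizer.step()
```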