Is it necessary to use an Ampere GPU?
The environment is Colab with a Tesla T4 GPU, torch 2.0, and CUDA 11.8. Training a model fails with this error: ValueError: Your setup doesn't support bf16/gpu. You need torch>=1.10, using Ampere GPU with cuda>=11.0
Could my 3060 Ti at home run this? It only has 8 GB of VRAM; would it run out of memory?
This is because the T4 does not support bf16. Change bf16 to fp16 in the run command; an Ampere GPU is not required for training.
8 GB of VRAM is not enough to train LLaMA-7B, but you can try smaller models such as Galactica-1.3B.
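A minimal sketch of how one could pick the right mixed-precision flag at runtime, assuming PyTorch is available (`pick_mixed_precision` is a hypothetical helper written for illustration, not part of any library):

```python
import torch

def pick_mixed_precision() -> str:
    """Return the widest mixed-precision mode the current device supports."""
    if not torch.cuda.is_available():
        return "fp32"  # no GPU: plain full precision
    # bf16 requires Ampere or newer (compute capability >= 8.0),
    # e.g. A100 or RTX 30xx. A T4 (Turing, 7.5) falls back to fp16.
    if torch.cuda.is_bf16_supported():
        return "bf16"
    return "fp16"

print(pick_mixed_precision())
```

On a T4 this returns "fp16", so passing fp16 instead of bf16 to the training command avoids the error above; on a 3060 Ti (Ampere) bf16 would work, though the 8 GB VRAM limit still applies.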