Kaiyan Zhang
Run the script -> manually enter the resolution -> unplug the monitor cable -> close and reopen RDM -> done~
I ran into this recently too. Reinstalling the plugin didn't help; in the end, reinstalling Sublime fixed it.
It seems this still isn't resolved...
I'm running into the same issue.
Hello, have you solved this problem? I'm running into it as well :(
Good job!!
I successfully trained the LLaMA-3-70B model using the script from the official PEFT example: [run_peft_qlora_fsdp.sh](https://github.com/huggingface/peft/blob/main/examples/sft/run_peft_qlora_fsdp.sh). However, I'm still encountering this problem when I set `use_dora=True` in the code.
> `pip list | grep nccl` to check if you have two versions, you should remove the unnecessary one

Thx, I love you!
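For anyone who wants to script the same check, here is a minimal sketch using the standard-library `importlib.metadata` instead of shelling out to `pip`; the `nvidia-nccl-cu11`/`nvidia-nccl-cu12` names below are illustrative examples of a conflicting environment, not a claim about yours:

```python
from importlib.metadata import distributions

def find_packages(needle, names=None):
    """Return installed (or explicitly given) package names containing `needle`."""
    if names is None:
        names = [d.metadata["Name"] for d in distributions()]
    return sorted(n for n in names if n and needle in n.lower())

# Same idea as `pip list | grep nccl`: two or more hits usually
# means conflicting NCCL wheels are installed side by side.
print(find_packages("nccl"))

# Illustrative only -- these names stand in for a broken environment:
conflict = find_packages("nccl", ["nvidia-nccl-cu11", "nvidia-nccl-cu12", "torch"])
# `conflict` lists both wheels; `pip uninstall` the one you don't need.
```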
If you fine-tune LLaMA-2-7B, you may need to use bfloat16:

```python
model.to(torch.bfloat16)
```

This works fine for me.