Chinese-LLaMA-Alpaca
Still getting an error during LoRA training after deleting --modules_to_save ${modules_to_save} \ and --gradient_checkpointing \ to save GPU memory
Detailed description of the problem
In the second training stage (fine-tuning the model with LoRA), I deleted --modules_to_save ${modules_to_save} \ and --gradient_checkpointing \ to save GPU memory, but the error still occurs.
Screenshots or logs
Please delete the two lines --modules_to_save ${modules_to_save} \ and --gradient_checkpointing \ entirely (rather than commenting them out), then try again.
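The reason deleting works while commenting out does not: in a shell run script, a trailing backslash continues the command onto the next line, so a commented-out line in the middle of a multi-line command breaks the continuation chain. A minimal sketch of the effect, where echo stands in for the real training launcher and the flag names are illustrative, not the project's actual script:

```shell
#!/bin/sh
# Why the two lines must be deleted rather than commented out.
# "echo" stands in for the real training launcher; flags are illustrative.
#
# Commented-out version (do NOT do this):
#
#   launcher --lora_rank 8 \
#       # --modules_to_save embed_tokens,lm_head \
#       --learning_rate 1e-4
#
# The "\" before the commented line joins it to the comment; the "#" then
# comments out the rest of that logical line, and "--learning_rate 1e-4"
# runs as a separate (nonexistent) command. Deleting the line instead
# keeps the continuation chain intact:

echo --lora_rank 8 \
    --learning_rate 1e-4
```

With the line deleted, every remaining backslash still continues onto a real argument line, so the launcher receives all of the intended flags.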
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your consideration.
Closing the issue, since no updates observed. Feel free to re-open if you need any further assistance.