onenotell

2 comments of onenotell

Piggybacking on this issue to ask about the following guard:

```python
if finetuning_args.stage in ["rm", "ppo"] and finetuning_args.finetuning_type in ["full", "freeze"]:
    can_resume_from_checkpoint = False
    if training_args.resume_from_checkpoint is not None:
        logger.warning("Cannot resume from checkpoint in current stage.")
        training_args.resume_from_checkpoint = None
else:
    ...
```
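For readers unfamiliar with this guard, a minimal self-contained sketch of its effect is below. The dataclasses are simplified stand-ins for illustration, not the real LLaMA-Factory or Hugging Face argument classes:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class FinetuningArgs:  # simplified stand-in, not the real LLaMA-Factory class
    stage: str
    finetuning_type: str


@dataclass
class TrainingArgs:  # simplified stand-in, not the real TrainingArguments
    resume_from_checkpoint: Optional[str] = None


def apply_resume_guard(finetuning_args: FinetuningArgs, training_args: TrainingArgs) -> None:
    # For rm/ppo stages combined with full/freeze tuning, resuming is disabled:
    # any user-supplied checkpoint path is dropped with a warning.
    if finetuning_args.stage in ["rm", "ppo"] and finetuning_args.finetuning_type in ["full", "freeze"]:
        if training_args.resume_from_checkpoint is not None:
            print("Cannot resume from checkpoint in current stage.")
            training_args.resume_from_checkpoint = None


args = TrainingArgs(resume_from_checkpoint="saves/ppo/checkpoint-100")
apply_resume_guard(FinetuningArgs(stage="ppo", finetuning_type="full"), args)
print(args.resume_from_checkpoint)  # -> None: training would restart from scratch
```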

> Reference: [#559](https://github.com/ymcui/Chinese-LLaMA-Alpaca/issues/559#issuecomment-1585948697). Modifying the transformers source at site-packages/transformers/integrations/deepspeed.py solves the problem. ![image](https://private-user-images.githubusercontent.com/30854283/310737599-e4113b01-8401-4f89-ae2b-3f329edf7256.png?jwt=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3MTA0NzAyNjYsIm5iZiI6MTcxMDQ2OTk2NiwicGF0aCI6Ii8zMDg1NDI4My8zMTA3Mzc1OTktZTQxMTNiMDEtODQwMS00Zjg5LWFlMmItM2YzMjllZGY3MjU2LnBuZz9YLUFtei1BbGdvcml0aG09QVdTNC1ITUFDLVNIQTI1NiZYLUFtei1DcmVkZW50aWFsPUFLSUFWQ09EWUxTQTUzUFFLNFpBJTJGMjAyNDAzMTUlMkZ1cy1lYXN0LTElMkZzMyUyRmF3czRfcmVxdWVzdCZYLUFtei1EYXRlPTIwMjQwMzE1VDAyMzI0NlomWC1BbXotRXhwaXJlcz0zMDAmWC1BbXotU2lnbmF0dXJlPTAzZGJjYjJkMDg0N2YyZGY2ZDJlNzE5MWYwN2UxNTM5OWQxNWFkOTZkNzE3ZGRlODVlZDU4YWQxMDVlNzEyNzkmWC1BbXotU2lnbmVkSGVhZGVycz1ob3N0JmFjdG9yX2lkPTAma2V5X2lkPTAmcmVwb19pZD0wIn0.u0kt0svV8wCplYE3Y9ptcA3EQ7XfHnHP8GJqOmOw6Aw)

On python3.10 with transformers 4.28.2 you can also apply the patch directly with shell when packaging; tested and working:

```shell
sed -i 's/self.model_wrapped, resume_from_checkpoint, load_module_strict=not _is_peft_model(self.model)/self.model_wrapped, resume_from_checkpoint, load_module_strict=False/g' /usr/local/lib/python3.10/dist-packages/transformers/trainer.py
```
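For anyone who cannot hard-code the site-packages path, here is a rough Python equivalent of the sed one-liner above. It assumes transformers is importable in the environment being patched, and that the target string (the `deepspeed_load_checkpoint` call with `load_module_strict=not _is_peft_model(self.model)`) exists in that version's `trainer.py`; it edits the installed package in place, so treat it as a stop-gap rather than a real fix:

```python
import pathlib

import transformers

# Locate trainer.py inside the installed transformers package.
trainer_py = pathlib.Path(transformers.__file__).parent / "trainer.py"

src = trainer_py.read_text()
patched = src.replace(
    "load_module_strict=not _is_peft_model(self.model)",
    "load_module_strict=False",
)

if patched != src:
    trainer_py.write_text(patched)
    print(f"Patched {trainer_py}")
else:
    print("Pattern not found; this transformers version may use different code.")
```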