```python
import os

import torch

ckp_path = f'{args.save_dir}/pretrain_{lm_config.dim}.pth'
if os.path.exists(ckp_path):
    Logger(f"Found existing weight file {ckp_path}, loading...")
    state_dict = torch.load(ckp_path, map_location=args.device)
    # Under DDP the actual model is wrapped, so load into .module
    if isinstance(model, torch.nn.parallel.DistributedDataParallel):
        model.module.load_state_dict(state_dict)
    else:
        model.load_state_dict(state_dict)
    Logger("Weights loaded, resuming training")
else:
    Logger("No existing weight file found, training from scratch")
```
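The snippet above depends on names defined elsewhere in the training script (`args`, `lm_config`, `model`, `Logger`), so it does not run on its own. A minimal, torch-free sketch of the same resume-or-start-fresh control flow, using `json` as a stand-in for `torch.save`/`torch.load`, might look like this (all names here are hypothetical):

```python
import json
import os
import tempfile

def load_or_init_state(ckp_path):
    """Return the saved state if a checkpoint file exists, else a fresh one."""
    if os.path.exists(ckp_path):
        with open(ckp_path) as f:
            state = json.load(f)
        print(f"Found existing checkpoint {ckp_path}, resuming")
    else:
        state = {"step": 0}
        print("No checkpoint found, starting from scratch")
    return state

with tempfile.TemporaryDirectory() as d:
    ckp = os.path.join(d, "pretrain_512.json")
    state = load_or_init_state(ckp)   # first call: no file yet, fresh state
    state["step"] = 100               # ... training advances ...
    with open(ckp, "w") as f:
        json.dump(state, f)           # periodically persist the state
    resumed = load_or_init_state(ckp) # second call: picks up where we left off
```

The key design point is the same as in the real snippet: the existence check on the checkpoint path decides between resuming and a cold start, so the script can be killed and restarted without losing progress.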
### Required prerequisites

- [x] I have read the documentation.
- [x] I have searched the [Issue Tracker](https://github.com/camel-ai/camel/issues) and [Discussions](https://github.com/camel-ai/camel/discussions) to confirm that this hasn't already been reported. (+1 or comment there if it has.)
camel is great overall, but it is relatively weak in this area; I hope it can be strengthened.