flying2023

Results 2 comments of flying2023

Hi, the llama-13B version can only run inference on a single A100 (or another large-VRAM machine). If no A100 is available, how can I run it on several smaller GPUs instead, e.g. four V100s?
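One common way to do this, assuming the weights are in Hugging Face Transformers format (the model path below is hypothetical), is accelerate's `device_map="auto"` sharding: layers are placed across all visible GPUs in fp16, so no single card has to hold the full ~26 GB of weights. A minimal sketch:

```python
# Sketch: shard llama-13B inference across four 16 GiB V100s instead of one A100.
# Assumes transformers + accelerate are installed; the checkpoint path is hypothetical.

def v100_memory_map(num_gpus=4, per_gpu="14GiB"):
    """Per-GPU memory budget for accelerate, leaving ~2 GiB headroom for activations."""
    return {i: per_gpu for i in range(num_gpus)}

def load_sharded(model_path="path/to/llama-13b-hf"):  # hypothetical path
    import torch
    from transformers import AutoModelForCausalLM
    # device_map="auto" lets accelerate split the layers over GPUs 0..3,
    # respecting the max_memory budget for each card.
    return AutoModelForCausalLM.from_pretrained(
        model_path,
        torch_dtype=torch.float16,
        device_map="auto",
        max_memory=v100_memory_map(),
    )

print(v100_memory_map())
```

Whether this fits depends on sequence length and batch size; with 4 × 16 GiB the weights alone (~26 GB in fp16) leave only moderate headroom, so a smaller `per_gpu` budget or 8-bit loading may be needed.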

    if ckpt_path:
        print("Load first Checkpoint: {}".format(ckpt_path))
        ckpt = torch.load(ckpt_path, map_location="cpu")
        msg = model.load_state_dict(ckpt['model'], strict=False)
    ckpt_path_2 = cfg.get("ckpt_2", "")
    if ckpt_path_2:
        print("Load second Checkpoint: {}".format(ckpt_path_2))
        ckpt = torch.load(ckpt_path_2, map_location="cpu")
        msg = ...