transformers-bloom-inference
fix checkpoint file list to align with DeepSpeed
With `glob.glob(f"{self.model_path}/*.bin")`, every path in the resulting list carries the `model_path` prefix, whereas passing `model_path` as `root_dir` returns bare file names. The latter matches how DeepSpeed loads the checkpoints (replace_module.py):
```python
sd = [
    torch.load(os.path.join(base_dir1,
                            checkpoint[i]),
               map_location='cpu')
]
```
Here `base_dir1` is the same directory as `model_path`, so joining it with entries that already contain the prefix duplicates the path.
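For illustration, a minimal, self-contained sketch of the difference (the directory and file names below are made up, and `glob`'s `root_dir` argument requires Python 3.10+):

```python
import glob
import os
import tempfile

# Build a throwaway checkpoint directory with one dummy shard, using a
# relative model_path as a user might pass on the command line.
os.chdir(tempfile.mkdtemp())
model_path = "bloom-checkpoints"  # stands in for self.model_path
os.makedirs(model_path)
open(os.path.join(model_path, "pytorch_model_00001-of-00072.bin"), "w").close()

# Current behaviour: every entry already carries the model_path prefix.
prefixed = sorted(glob.glob(f"{model_path}/*.bin"))
print(prefixed[0])   # bloom-checkpoints/pytorch_model_00001-of-00072.bin

# Proposed behaviour (Python 3.10+): root_dir makes glob return file names
# relative to model_path.
relative = sorted(glob.glob("*.bin", root_dir=model_path))
print(relative[0])   # pytorch_model_00001-of-00072.bin

# DeepSpeed's replace_module.py later does os.path.join(base_dir1, checkpoint[i])
# with base_dir1 equal to model_path, so only the relative form joins cleanly.
base_dir1 = model_path
print(os.path.join(base_dir1, relative[0]))  # bloom-checkpoints/pytorch_model_...
print(os.path.join(base_dir1, prefixed[0]))  # bloom-checkpoints/bloom-checkpoints/... (prefix duplicated)
```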
Please help review, @mayank31398. Thanks!
Hi, this repo is no longer maintained.