MotionGPT
safetensors_rust.SafetensorError: Error while deserializing header: HeaderTooLarge
The error is raised at line 3284 of modeling_utils.py, at the following snippet:
with safe_open(resolved_archive_file, framework="pt") as f:
Have you encountered this error before, and do you have any insights into its cause?
I'm getting the same error; a quick Google search says it's an issue with the archive the weights are stored in.
Traceback (most recent call last):
  File "/./diffumask/myenv/lib/python3.8/site-packages/diffusers/models/modeling_utils.py", line 102, in load_state_dict
    return safetensors.torch.load_file(checkpoint_file, device="cpu")
  File "/./diffumask/myenv/lib/python3.8/site-packages/safetensors/torch.py", line 259, in load_file
    with safe_open(filename, framework="pt", device=device) as f:
safetensors_rust.SafetensorError: Error while deserializing header: HeaderTooLarge
Facing the same problem too.
Hello,
The SafetensorError: Error while deserializing header: HeaderTooLarge you've encountered is similar to issue #68 and is typically caused by an incompletely downloaded checkpoint. To address this, use Git LFS to download the necessary large files:
git lfs pull --include="<files to download>" --exclude="<files to exclude>"
Replace <files to download> and <files to exclude> as needed. This downloads the actual file content in place of the LFS pointer stubs, resolving the error.
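If you want to confirm this is the cause before re-downloading, one way is to check whether the checkpoint on disk is actually a Git LFS pointer stub: a stub is a tiny text file beginning with "version https://git-lfs.github.com/spec/v1", whereas real weights are a large binary file. A minimal sketch (the filename and helper name here are illustrative, not part of MotionGPT):

```python
import os

# A Git LFS pointer stub starts with this line; real safetensors
# files start with an 8-byte little-endian header-length instead.
LFS_MAGIC = b"version https://git-lfs"

def is_lfs_pointer(path):
    """Return True if `path` looks like an undownloaded LFS stub."""
    # Pointer stubs are only a few hundred bytes; real weights are huge.
    if os.path.getsize(path) > 1024:
        return False
    with open(path, "rb") as f:
        return f.read(len(LFS_MAGIC)) == LFS_MAGIC
```

If this returns True for your .safetensors file, git lfs pull should fix it; if it returns False, the file was fully downloaded and the problem lies elsewhere.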
Edit: this worked. The issue was an incorrectly downloaded flan-t5 checkpoint, not the MotionGPT checkpoint. Thanks!
On Ubuntu, sudo apt-get install git-lfs (followed by git lfs install) may also be needed first.