Safetensors Support
🚀 Feature
It would ease model usage if the weights of large (>1B parameter) models were available in the .safetensors format.
Motivation
As of now, downloading the unbabel/*-xxl models takes ~1 hour on an A100 machine. Distributing the model weights in other formats could help with this.
The safetensors format loads faster than the PyTorch one, as stated here: https://huggingface.co/docs/safetensors/en/speed.
Additional context
It seems that safetensors support would require:
- converting the weights of the published models (i.e. `save_model` from `safetensors.torch`)
- support within the classes, for example a new function in `CometModel` for loading safetensors, similar to `load_from_checkpoint`
My understanding is that safetensors would not fix the download speed (model size) problem. The link you sent is about loading the models from disk.