DeepSpeedExamples

Why is my model bigger after compression?

Open · rlenain opened this issue 1 year ago · 1 comment

Hello,

I am using this config on a translation model (Helsinki-NLP/opus-mt-zh-en), and I check the size of the model using the following function before and after running init_compression and deepspeed.initialize:

import os

import torch


def print_size_of_model(model, label=""):
    # Serialize the state dict to a temporary file and report its on-disk size
    torch.save(model.state_dict(), "temp.p")
    size = os.path.getsize("temp.p")
    print("model:", label, "\tSize (KB):", size / 1e3)
    os.remove("temp.p")
    return size
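
For context, I call it roughly like this around the compression setup (a minimal sketch; ds_config.json stands in for my DeepSpeed config with the compression section enabled, and the deepspeed.initialize arguments are simplified):

from transformers import AutoModelForSeq2SeqLM
import deepspeed
from deepspeed.compression.compress import init_compression

model = AutoModelForSeq2SeqLM.from_pretrained("Helsinki-NLP/opus-mt-zh-en")
print_size_of_model(model, "before compression")

# ds_config.json is a placeholder for my DeepSpeed config with compression enabled
model = init_compression(model, "ds_config.json")
model_engine, _, _, _ = deepspeed.initialize(model=model, config="ds_config.json")
print_size_of_model(model_engine.module, "after init_compression + deepspeed.initialize")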

Weirdly, the size of the model increases after running init_compression and deepspeed.initialize. Even after I run redundancy_clean at the end of training and save the model to disk, the saved model stays at the size that print_size_of_model reported after init_compression and deepspeed.initialize.
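
The cleanup step at the end of training looks roughly like this (again a sketch; redundancy_clean comes from deepspeed.compression.compress, and the file names are placeholders):

from deepspeed.compression.compress import redundancy_clean
import torch

# Strip the helper parameters introduced for compression before saving to disk
clean_model = redundancy_clean(model_engine.module, "ds_config.json")
torch.save(clean_model.state_dict(), "compressed_model.pt")
print_size_of_model(clean_model, "after redundancy_clean")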

Am I missing something? Can you please explain?

Thanks a lot

rlenain · Oct 20 '22, 16:10