
Model is too big to apply AdaRound

Open yanz0920 opened this issue 1 year ago • 2 comments

What to do when the model is too large to use adaround?

For example, when the model has 6B parameters stored in torch.float32, the memory requirements are roughly: model: 24 GB, quantsim_model: 24 GB.

But I get an OOM error when running AdaRound on an NVIDIA A100, which has 80 GB of GPU memory...

yanz0920 avatar Feb 02 '24 04:02 yanz0920

@quic-hitameht could you help answer this?

quic-mangal avatar Feb 15 '24 01:02 quic-mangal

Hi @yanz0920, during AdaRound optimization we try to keep all of the cached intermediate activation data for a given layer on the GPU for faster optimization whenever possible. In your case, you can disable this behavior by patching the AdaroundOptimizer.enable_caching_acts_data method, as shown in this unit test:

https://github.com/quic/aimet/blob/develop/TrainingExtensions/torch/test/python/test_adaround_weight.py#L889
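For reference, a minimal sketch of that patching approach (not the exact unit test). It assumes AdaroundOptimizer is importable from aimet_torch.adaround.adaround_optimizer (the module path may differ between AIMET versions) and that replacing enable_caching_acts_data with a callable returning False is enough to keep cached activations off the GPU:

```python
# Sketch: force AdaRound to keep cached intermediate activations on CPU
# instead of moving them to the GPU, trading speed for lower GPU memory use.
from unittest.mock import patch

# Import path is an assumption; adjust to match your AIMET version.
from aimet_torch.adaround.adaround_optimizer import AdaroundOptimizer

# enable_caching_acts_data decides whether a layer's cached activation data
# is placed on the GPU; patching it to always return False disables that.
with patch.object(AdaroundOptimizer, "enable_caching_acts_data",
                  return_value=False):
    # Run AdaRound inside the patched context, e.g.:
    # ada_model = Adaround.apply_adaround(model, dummy_input, params,
    #                                     path='./', filename_prefix='adaround')
    ...
```

With caching on the GPU disabled, the optimization will run more slowly because activation data is streamed from host memory, but the per-layer GPU footprint is much smaller.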

Hope this helps. Please let us know if you have further questions.

quic-hitameht avatar Feb 22 '24 16:02 quic-hitameht