
Same amount of VRAM is taken as in AdamW

Open VCasecnikovs opened this issue 1 year ago • 6 comments

One of the main benefits of Lion is that it needs to store less state per parameter. Adam has to keep both a momentum EMA and an RMSProp-style second-moment EMA, while Lion only keeps the momentum EMA. But when I try to use Lion, it takes exactly the same amount of memory as AdamW.

VCasecnikovs avatar Apr 03 '23 11:04 VCasecnikovs
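The expected saving can be estimated with back-of-envelope arithmetic: AdamW keeps two fp32 buffers per parameter (`exp_avg` and `exp_avg_sq`), Lion keeps one. A minimal sketch (the helper name and the 100M parameter count are illustrative, not from this thread):

```python
def optimizer_state_bytes(num_params: int, states_per_param: int,
                          bytes_per_state: int = 4) -> int:
    """Total bytes of per-parameter optimizer state (fp32 = 4 bytes)."""
    return num_params * states_per_param * bytes_per_state

num_params = 100_000_000  # illustrative 100M-parameter model

# AdamW: momentum EMA + second-moment EMA; Lion: momentum EMA only.
adamw_bytes = optimizer_state_bytes(num_params, states_per_param=2)
lion_bytes = optimizer_state_bytes(num_params, states_per_param=1)

print(f"AdamW state: {adamw_bytes / 2**30:.2f} GiB")
print(f"Lion state:  {lion_bytes / 2**30:.2f} GiB")
```

So the difference should be roughly one fp32 copy of the model, assuming fp32 optimizer state. Weights, gradients, and activations are identical between the two optimizers, which is why the relative saving shrinks as activations dominate.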

Hi, what is the model size in your setting? When the model is small, the main memory overhead comes from the activations, so the saving from dropping the second moment may not be noticeable.

xiangning-chen avatar Apr 06 '23 21:04 xiangning-chen

@xiangning-chen 178m parameters, convolutional.

VCasecnikovs avatar Apr 07 '23 09:04 VCasecnikovs

Are you comparing this to AdamW8bit by chance?

feffy380 avatar Apr 07 '23 22:04 feffy380

No, to AdamW

VCasecnikovs avatar May 27 '23 15:05 VCasecnikovs

In my setting, Lion takes less memory than AdamW (9.9 GB vs 10.1 GB), but Lion is slower in terms of steps/sec. Has anyone noticed the same? I'm comparing the Triton Lion implementation against fused AdamW.

konev-artem avatar Jun 01 '23 18:06 konev-artem
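When comparing steps/sec between optimizers, warmup matters (Triton kernels are compiled on first call, and CUDA work is asynchronous). A minimal timing sketch with stdlib only; `step_fn` is a hypothetical callable wrapping one optimizer step, and on GPU you would call `torch.cuda.synchronize()` before reading the clock:

```python
import time

def benchmark_steps(step_fn, warmup: int = 10, iters: int = 100) -> float:
    """Return steps/sec for step_fn, excluding warmup iterations.

    Warmup absorbs one-time costs such as kernel compilation, so the
    measured rate reflects steady-state throughput.
    """
    for _ in range(warmup):
        step_fn()
    # On GPU, synchronize here before starting the timer.
    start = time.perf_counter()
    for _ in range(iters):
        step_fn()
    # On GPU, synchronize again here before stopping the timer.
    elapsed = time.perf_counter() - start
    return iters / elapsed
```

Without warmup and synchronization, a first-run Triton Lion step can look much slower than fused AdamW purely because of compilation overhead.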

Did you solve the problem? I have the same issue.

nicosouth avatar May 14 '24 06:05 nicosouth