
add an 8-bit version with bitsandbytes

lucidrains opened this issue · 5 comments

https://github.com/TimDettmers/bitsandbytes/blob/main/compile_from_source.md

lucidrains · Mar 08 '23 15:03

recent results have tipped me over to recommending this optimizer, provided batch sizes are high (and the architecture is mainstream). will put some more work into it

lucidrains · Mar 08 '23 15:03
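For context on what would need to be quantized, here is a plain-Python sketch of the core Lion update rule from the paper (arXiv:2302.06675) over flat lists of floats. This is an illustrative sketch, not the `lion-pytorch` or bitsandbytes API; the function name and signature are hypothetical.

```python
def lion_step(params, grads, exp_avg, lr=1e-4, betas=(0.9, 0.99), wd=0.0):
    """One Lion update, sketched over flat lists of floats (hypothetical
    helper, not the lion-pytorch API). `exp_avg` is the momentum buffer,
    updated in place. Follows the rule in the Lion paper (arXiv:2302.06675)."""
    beta1, beta2 = betas
    sign = lambda x: (x > 0) - (x < 0)
    for i, (p, g) in enumerate(zip(params, grads)):
        # interpolate momentum and gradient, then keep only the sign
        update = sign(beta1 * exp_avg[i] + (1 - beta1) * g)
        # decoupled weight decay (as in AdamW), then the signed step
        params[i] = p * (1 - lr * wd) - lr * update
        # the momentum buffer uses a second interpolation coefficient
        exp_avg[i] = beta2 * exp_avg[i] + (1 - beta2) * g
    return params
```

Because the update direction is just a sign, Lion only has to keep a single momentum buffer per parameter, which is what makes an 8-bit quantized state (as bitsandbytes does for Adam) an attractive fit.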

https://github.com/TimDettmers/bitsandbytes/pull/188

lucidrains · Mar 09 '23 16:03

if there are any CUDA / 8-bit optimizer experts in the crowd, would welcome a code review over at the bitsandbytes repository

lucidrains · Mar 10 '23 17:03

@lucidrains Is there a reason behind the 64 per_device_train_batch_size threshold recommendation? With consumer 24 GB GPUs, it is almost impossible to train or finetune any meaningful model with that high a batch size, which makes it hard to take advantage of Lion. It would be nice if Lion were applicable to smaller batches. Thanks

Qubitium · May 19 '23 11:05

The original authors note that Lion likely doesn't show significant gains over AdamW in the small-batch (<64) regime. See https://arxiv.org/pdf/2302.06675.pdf

wukevin · Jun 18 '23 19:06
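One common workaround on memory-limited GPUs is gradient accumulation: run K micro-batches of size B and average their gradients before the optimizer step, so the optimizer sees an effective batch of K × B. A minimal sketch of the averaging step, with a hypothetical helper name (in a real PyTorch loop you would scale each micro-batch loss by 1/K, call `backward()` K times, and only then call `optimizer.step()` and `optimizer.zero_grad()`):

```python
def accumulate_grads(micro_batch_grads):
    """Average per-micro-batch gradients (each a flat list of floats).
    Hypothetical helper for illustration: K micro-batches of size B give
    the optimizer one gradient for an effective batch of K * B, e.g. to
    reach the >=64 regime on a 24 GB consumer GPU."""
    k = len(micro_batch_grads)
    n = len(micro_batch_grads[0])
    return [sum(g[i] for g in micro_batch_grads) / k for i in range(n)]
```

Whether an accumulated batch of 64 recovers the gains reported for a true batch of 64 is an empirical question, but it is the standard way to emulate larger batches on a single card.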