StableLM
Colab OOM
Hey, thanks for the code. Ironically, even the 3B model is crashing on Colab. This is after enabling 8-bit loading with fp16 precision.
Did it work for anyone?
Had the same error; it seems that the CPU RAM is not enough to load the model before sending it to the GPU.
Maybe this is a reason - https://github.com/Stability-AI/StableLM/issues/6
Yep, that's why, and there are solutions in that thread!
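For anyone landing here later, a minimal sketch of the kind of fix discussed in that issue: let `accelerate`/`bitsandbytes` stream the quantized weights straight onto the GPU instead of materializing the full model in CPU RAM first. The checkpoint name and exact arguments below are assumptions for illustration, not a verified configuration from this thread.

```python
# Sketch: load StableLM 3B in 8-bit with device_map="auto" so layers are
# placed on the GPU as they load, avoiding a full copy in Colab's CPU RAM.
# Requires: pip install transformers accelerate bitsandbytes
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "stabilityai/stablelm-tuned-alpha-3b"  # assumed checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    device_map="auto",          # shard/offload layers as they are loaded
    load_in_8bit=True,          # 8-bit weights via bitsandbytes
    torch_dtype=torch.float16,  # fp16 for the non-quantized parts
    low_cpu_mem_usage=True,     # skip the full fp32 copy in CPU RAM
)
```

With this kind of loading, peak CPU RAM usage stays well below the full fp32 model size, which is what trips up the standard Colab instance.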