ColossalAI
Optimize the memory usage
I am trying to fine-tune the Llama 13B model using ColossalAI. However, the memory usage is quite high, exceeding 270 GB, and directly causes an OOM error. Is there any way to optimize the memory usage?
Hi, you may test out our strategies such as GeminiDDP as illustrated in the examples.
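To make this concrete, below is a minimal sketch of wrapping a model with the Gemini plugin via the Booster API. This is an illustration, not the repo's exact example: the `placement_policy` and `precision` arguments, and the `your_model`/`your_optimizer` placeholders, are assumptions you should adapt to your setup, and the script must be launched with `torchrun`/`colossalai run` on GPU machines.

```python
# Hypothetical sketch of fine-tuning with ColossalAI's GeminiPlugin,
# which offloads parameter/optimizer state to reduce GPU memory.
# Assumes a recent colossalai version; argument names may differ across releases.
import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import GeminiPlugin

colossalai.launch_from_torch()  # run under torchrun / colossalai run

# Offload to CPU and use fp16 to cut GPU memory (illustrative settings).
plugin = GeminiPlugin(placement_policy="auto", precision="fp16")
booster = Booster(plugin=plugin)

model = your_model()          # placeholder: your Llama 13B model
optimizer = your_optimizer(model.parameters())  # placeholder optimizer

# Booster wraps the model/optimizer with Gemini's chunk-based memory management.
model, optimizer, *_ = booster.boost(model, optimizer)
```

With Gemini, parameters and optimizer states are chunked and dynamically placed between GPU and CPU memory, so a 13B model can train on far less GPU memory than plain DDP requires, at the cost of extra host-device traffic.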
This is the training method we used, but the memory usage is too large to be usable. Any suggestions?
What are your environment settings?