binmakeswell
Hi @l241025097 You are right! Could you please submit a PR to fix it? Thank you very much for your contribution!
This issue was closed due to inactivity. Thanks.
Hi @ziyuwan @taishiciR As mentioned in [Chat example](https://github.com/hpcaitech/ColossalAI/tree/main/applications/Chat/examples), --strategy 'colossalai_gemini' or 'colossalai_zero2' is enough for most cases. TP is not supported for Chat currently. It is relatively low on our...
Hi @evi-Genius @taishiciR As mentioned in [Chat example](https://github.com/hpcaitech/ColossalAI/tree/main/applications/Chat/examples), --strategy 'colossalai_gemini' or 'colossalai_zero2' is enough for most cases. PP is not supported for Chat currently. It is relatively low on our...
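For reference, a minimal sketch of how the recommended strategy is passed when launching the Chat example. Only the `--strategy` values come from the comments above; the script name `train_sft.py` and the other flags are assumptions based on the linked examples directory and may differ between versions.

```bash
# Sketch (assumptions noted above): launch the SFT stage of the Chat example
# with one of the recommended strategies instead of TP/PP.
torchrun --standalone --nproc_per_node=4 train_sft.py \
    --pretrain "facebook/opt-350m" \
    --model opt \
    --strategy colossalai_zero2 \
    --save_path ./sft_ckpt
```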
Hi @jeffra Thank you so much for your contribution!
This issue was closed due to inactivity. If you have further questions, please [open another new issue](https://github.com/hpcaitech/ColossalAI/issues/new/choose) and provide details. Thanks.
> @ver217 I trained opt 66b on the single A100.

Hi @MikeChenfu A 66B model is much larger than the 80GB memory capacity of a single A100. This issue was closed due to...
Hi @guohe369 We have updated a lot. Please check the latest code. This issue was closed due to inactivity. Thanks.
Hi @Pradeep-Vanapalli We have updated a lot. Please check the latest code. This issue was closed due to inactivity. Thanks.
Hi @GeneZC Welcome to share your findings with new issues or discussion. This issue was closed due to inactivity. Thanks.