gpt-neox
MoE Support
This PR introduces MoE (Mixture-of-Experts) support, modeled after the implementation in Megatron-DeepSpeed (https://github.com/microsoft/Megatron-DeepSpeed).
This is part of the effort to add and test upstream DeepSpeed features: https://github.com/EleutherAI/gpt-neox/pull/663
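For context, the core idea behind an MoE layer of the kind Megatron-DeepSpeed provides is top-k gating: a small gate scores each expert for a given input, and only the k highest-scoring experts run, with their outputs combined by the (renormalized) gate probabilities. A minimal sketch of that routing logic, in plain Python (the function and variable names here are illustrative, not from this PR or DeepSpeed's API):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_forward(x, gate_weights, experts, k=1):
    """Route input vector x to the top-k experts by gate score.

    gate_weights: one gate weight vector per expert (dot-product scoring).
    experts: list of callables taking and returning a vector.
    Returns (output vector, indices of the selected experts).
    """
    # Gate scores: dot product of x with each expert's gate vector.
    scores = [sum(xi * wi for xi, wi in zip(x, w)) for w in gate_weights]
    probs = softmax(scores)
    # Select the top-k experts and renormalize their probabilities.
    topk = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:k]
    norm = sum(probs[i] for i in topk)
    # Weighted combination of the selected experts' outputs;
    # the unselected experts never execute.
    out = [0.0] * len(x)
    for i in topk:
        y = experts[i](x)
        w = probs[i] / norm
        out = [o + w * yi for o, yi in zip(out, y)]
    return out, topk

# Toy usage: expert 0 doubles the input, expert 1 negates it; the gate
# vectors make expert 0 win for inputs with a positive first component.
experts = [lambda v: [2.0 * t for t in v], lambda v: [-t for t in v]]
gate_weights = [[1.0, 0.0], [-1.0, 0.0]]
out, selected = moe_forward([1.0, 0.5], gate_weights, experts, k=1)
```

In the real layer the gate is a trained linear projection and tokens are dispatched in batches across data-parallel ranks (with capacity limits and a load-balancing loss), but the per-token routing reduces to the scheme above.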
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you all sign our Contributor License Agreement before we can accept your contribution.
1 out of 2 committers have signed the CLA.
:white_check_mark: Quentin-Anthony
:x: anthony.301
anthony.301 does not appear to be a GitHub user. You need a GitHub account to be able to sign the CLA. If you already have a GitHub account, please add the email address used for this commit to your account.
You have signed the CLA already but the status is still pending? Let us recheck it.