feat: DeepSeekMoE
What does this PR do?
Upstream the custom code from https://huggingface.co/deepseek-ai/deepseek-moe-16b-base/blob/main/modeling_deepseek.py into huggingface/transformers. Note that this is not DeepSeek V2. The newly released DeepSeek-Prover-V1.5, for example, runs on this architecture (though without MoE layers, so it is effectively just Llama).
https://huggingface.co/models?other=deepseek
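For context, here is a minimal sketch of how these checkpoints are loaded today, using the `deepseek-ai/deepseek-moe-16b-base` checkpoint as an assumed example. Currently the remote code shipped with the checkpoint is required (`trust_remote_code=True`); once the architecture is upstreamed, the same `from_pretrained` call should resolve to the native `transformers` implementation without that flag.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/deepseek-moe-16b-base"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    # Needed today to pull in the custom modeling_deepseek.py from the Hub;
    # after this PR the architecture is registered natively in transformers.
    trust_remote_code=True,
)

inputs = tokenizer("The capital of France is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=10)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```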
Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the contributor guideline, Pull Request section?
- [ ] Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
- [x] Did you write any new necessary tests?
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.
If you know how to use git blame, that is the easiest way; otherwise, here is a rough guide of who to tag. Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
Hey @ArthurZucker , should be ready for review now, thanks!
friendly ping
Any update?
Any update?
Sorry, I got busy, will get back to this though
still nice to have! 🤗