ColossalAI
[feat] Add distributed lamb; minor fixes in DeviceMesh and comments
📌 Checklist before creating the PR
- [ ] I have created an issue for this PR for traceability
- [x] The title follows the standard format:
[doc/gemini/tensor/...]: A concise description
- [x] I have added relevant tags if possible for us to better distinguish different PRs
🚨 Issue number
Link this PR to your issue with words like `fixed` to automatically close the linked issue upon merge
e.g.
fixed #1234, closed #1234, resolved #1234
📝 What does this PR do?
- Add distributed Lamb supporting Tensor Parallel and ZeRO stage 2
- Add bias correction to Lamb
- Fix DeviceMesh failing to map a rank to a "squeezable" axis (e.g. axis 1 in mesh shape (4, 1))
- Minor improvements to comments
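For reference, the first two bullets can be illustrated with a minimal single-process sketch of the bias-corrected Lamb rule. This is not the PR's implementation (which additionally shards optimizer states across Tensor Parallel / ZeRO-2 ranks); `lamb_step` is a hypothetical helper written only to show where the bias-correction terms enter:

```python
import numpy as np

def lamb_step(w, g, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
              eps=1e-6, weight_decay=0.01):
    """One Lamb update for a single parameter tensor (illustrative only)."""
    # Adam-style first and second moment estimates
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g * g
    # Bias correction, as in Adam -- the correction this PR adds to Lamb
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Update direction plus decoupled weight decay
    update = m_hat / (np.sqrt(v_hat) + eps) + weight_decay * w
    # Layer-wise trust ratio: ||w|| / ||update||
    w_norm = np.linalg.norm(w)
    u_norm = np.linalg.norm(update)
    trust = w_norm / u_norm if w_norm > 0 and u_norm > 0 else 1.0
    return w - lr * trust * update, m, v
```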
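The DeviceMesh bullet concerns mapping a flat rank to per-axis coordinates. A sketch of the expected behavior, where `rank_to_coords` is a hypothetical stand-in (not the actual DeviceMesh API): an axis of size 1 is "squeezable", so every rank must map to coordinate 0 on it rather than failing the lookup.

```python
import numpy as np

def rank_to_coords(rank, mesh_shape):
    # Map a flat rank to one coordinate per mesh axis. For a size-1
    # ("squeezable") axis, every rank should land on coordinate 0.
    return tuple(int(c) for c in np.unravel_index(rank, mesh_shape))
```

For example, in mesh shape (4, 1), rank 3 should map to coordinates (3, 0).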
💥 Checklist before requesting a review
- [ ] I have linked my PR to an issue (instruction)
- [x] My issue clearly describes the problem/feature/proposal, with diagrams/charts/table/code if possible
- [x] I have performed a self-review of my code
- [x] I have added thorough tests.
- [ ] I have added docstrings for all the functions/methods I implemented
⭐️ Do you enjoy contributing to Colossal-AI?
- [x] 🌝 Yes, I do.
- [x] 🌚 No, I don't.
Tell us more if you don't enjoy contributing to Colossal-AI.