When I use Trainer with Deepspeed, the Number of trainable parameters is 0
The version information is as follows:
- DeepSpeed: 0.8.1
- transformers: 4.26.1
Problem
When I use Trainer with DeepSpeed, the number of trainable parameters is reported as 0, like this:

(screenshot of the Trainer log reporting `Number of trainable parameters = 0`)
It happens when using ZeRO stage 3; with ZeRO stage 2 the problem does not occur.
cc @stas00
Thank you for the report, @noob-ctrl
Please let me know if this fix works for you: https://github.com/huggingface/transformers/pull/22193
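For context, the gist of the issue is that under ZeRO stage 3 DeepSpeed partitions each parameter tensor across ranks, so `p.numel()` returns 0 on ranks that do not hold the data; the full, unpartitioned size is exposed on the parameter as `ds_numel`. The sketch below is a standalone re-implementation of that counting logic, not the actual code from the PR, and it uses a hypothetical `FakeParam` stand-in so it runs without torch or deepspeed installed:

```python
def count_trainable_params(params):
    """Count trainable parameters, handling ZeRO-3 partitioned tensors.

    Under DeepSpeed ZeRO stage 3 a parameter's storage is sharded, so
    p.numel() can return 0; DeepSpeed records the full size as p.ds_numel,
    which we prefer whenever it is present.
    """
    def numel(p):
        return p.ds_numel if hasattr(p, "ds_numel") else p.numel()

    return sum(numel(p) for p in params if p.requires_grad)


# Hypothetical stand-in for a parameter, to illustrate the behaviour
# without importing torch/deepspeed:
class FakeParam:
    def __init__(self, n, requires_grad=True, zero3=False):
        self._n = n
        self.requires_grad = requires_grad
        if zero3:
            self.ds_numel = n  # full size, as DeepSpeed records it

    def numel(self):
        # A ZeRO-3 partitioned tensor reports 0 local elements here.
        return 0 if hasattr(self, "ds_numel") else self._n


params = [FakeParam(10), FakeParam(20, zero3=True)]
print(count_trainable_params(params))  # 30; a naive sum of numel() gives 10
```

Counting with a naive `sum(p.numel() for p in params)` is what produced the "0 trainable parameters" log under ZeRO-3.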
@stas00 Hi, it works now. Thank you!
Thank you for testing, @noob-ctrl - the PR has been merged.