TypeError: Accelerator.__init__() got an unexpected keyword argument 'adjust_scheduler_to_accumulation'
System Info
I just looked into the code and commit history, and it looks like this parameter was suggested in the gradient accumulation guide and examples but never actually made it into the codebase.
https://github.com/search?q=repo%3Ahuggingface%2Faccelerate+adjust_scheduler_to_accumulation&type=commits
You can see that the commit that introduced that line in the documentation doesn't actually add the parameter to `Accelerator.__init__()`, so it is never passed down to the `GradientAccumulationPlugin`.
However, this doesn't seem to be a major issue, since the corresponding parameter in `GradientAccumulationPlugin` defaults to `True`: https://github.com/huggingface/accelerate/blob/bfa74e51d2af08221f5787d281d681ca9bceddd2/src/accelerate/utils/dataclasses.py#LL394C3-L399C6
The fix here would be to properly add the `adjust_scheduler_to_accumulation` option to the `Accelerator` object and pass it down into the `GradientAccumulationPlugin` when it is instantiated, roughly as sketched below.
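A hypothetical sketch of that wiring, using simplified stand-ins rather than the actual accelerate source (the real `GradientAccumulationPlugin` lives in `accelerate.utils.dataclasses`):

```python
from dataclasses import dataclass

# Simplified stand-ins for illustration only -- not the real accelerate code.
@dataclass
class GradientAccumulationPlugin:
    num_steps: int = 1
    adjust_scheduler: bool = True  # mirrors the default linked above

class Accelerator:
    def __init__(
        self,
        gradient_accumulation_steps: int = 1,
        adjust_scheduler_to_accumulation: bool = True,
    ):
        # Forward the documented keyword into the plugin built internally.
        self.gradient_accumulation_plugin = GradientAccumulationPlugin(
            num_steps=gradient_accumulation_steps,
            adjust_scheduler=adjust_scheduler_to_accumulation,
        )
```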
I'm happy to provide a PR for this if you'd like.
Information
- [X] The official example scripts
- [ ] My own modified scripts
Tasks
- [ ] One of the scripts in the `examples/` folder of Accelerate or an officially supported `no_trainer` script in the `examples` folder of the `transformers` repo (such as `run_no_trainer_glue.py`)
- [ ] My own task or dataset (give details below)
Reproduction
- Attempt to instantiate the `Accelerator` object while passing `adjust_scheduler_to_accumulation` as either `True` or `False`, as in the sketch below.
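A minimal repro sketch (the `gradient_accumulation_steps` value here is arbitrary):

```python
from accelerate import Accelerator

# Raises: TypeError: Accelerator.__init__() got an unexpected keyword
# argument 'adjust_scheduler_to_accumulation'
accelerator = Accelerator(
    gradient_accumulation_steps=2,
    adjust_scheduler_to_accumulation=True,
)
```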
Expected behavior
Per the gradient accumulation guide (https://github.com/huggingface/accelerate/blob/bfa74e51d2af08221f5787d281d681ca9bceddd2/docs/source/usage_guides/gradient_accumulation.mdx?plain=1#L117), a user should be able to pass in `adjust_scheduler_to_accumulation` without raising an error.
@iantbutler01 the docs need to be updated here, as we decided to go with the plugin rather than a param on `__init__` (to avoid having too many parameters). A PR clarifying that the plugin should be used would be great!
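For reference, a sketch of the plugin route, assuming the `adjust_scheduler` field shown at the `dataclasses.py` link above:

```python
from accelerate import Accelerator
from accelerate.utils import GradientAccumulationPlugin

# Configure scheduler adjustment through the plugin rather than as a
# keyword on Accelerator.__init__().
plugin = GradientAccumulationPlugin(num_steps=2, adjust_scheduler=True)
accelerator = Accelerator(gradient_accumulation_plugin=plugin)
```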
Got it, I will make that update and open a PR! @muellerzr
@muellerzr https://github.com/huggingface/accelerate/pull/1461 is the doc PR. I also caught and fixed a failing test for bf16 mixed precision on an M1 MacBook; it looks like bf16 isn't supported on MPS.
@iantbutler01 good to close this issue? :)
Yup! Sorry, and thanks for helping me get these doc changes in.