llm-graph-builder
Bump accelerate from 1.7.0 to 1.11.0 in /backend
Bumps accelerate from 1.7.0 to 1.11.0.
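For reference, this bump amounts to a one-line version change in the backend dependency pin. The exact file path is an assumption (Dependabot targets `/backend`); the fragment below is a sketch of the change, not the literal diff from this PR:

```diff
--- backend/requirements.txt
+++ backend/requirements.txt
-accelerate==1.7.0
+accelerate==1.11.0
```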
Release notes
Sourced from accelerate's releases.
v1.10.1: Patchfix
- Feat: add `to_json` by @S1ro1 in huggingface/accelerate#3743
- Protect import for device_mesh by @SunMarc in huggingface/accelerate#3742

Full Changelog: https://github.com/huggingface/accelerate/compare/v1.10.0...v1.10.1
v1.10.0: N-D Parallelism
Training large models across multiple GPUs can be complex, especially when combining different parallelism strategies (e.g., TP, CP, DP). To simplify this process, we've collaborated with Axolotl to introduce an easy-to-use integration that allows you to apply any combination of parallelism strategies directly in your training script. Just pass a `ParallelismConfig` specifying the size of each parallelism type; it's that simple. Learn more about how it works in our latest blogpost.

```python
parallelism_config = ParallelismConfig(
    dp_shard_size=2,
    dp_replicate_size=2,
    cp_size=2,
    tp_size=2,
)

accelerator = Accelerator(
    parallelism_config=parallelism_config,
    ...
)

model = AutoModelForCausalLM.from_pretrained(
    "your-model-name", device_mesh=accelerator.torch_device_mesh
)
model = accelerator.prepare(model)
```
- Parallelism config + TP + HSDP + BYODM (Bring Your Own Device Mesh) by @SalmanMohammadi in huggingface/accelerate#3682
- Feat: context parallel v2.0 by @S1ro1 in huggingface/accelerate#3700
- set default submesh_tp_size to prevent unset local variable error by @winglian in huggingface/accelerate#3687
- Add Parallelism getter property to Accelerator class by @WoosungMyung in huggingface/accelerate#3703
- Fix: prepare works even if nothing except tp specified (rare) by @S1ro1 in huggingface/accelerate#3707
- Set parallelism_config in constructor due to Trainer reset of State by @winglian in huggingface/accelerate#3713
- Fix: tp size wouldn't read from env by @S1ro1 in huggingface/accelerate#3716
- Remove `ParallelismConfig` from `PartialState` by @SunMarc in huggingface/accelerate#3720

FSDP improvements
We've fixed the ignored modules attribute. With this, it is now possible to train PEFT models with MoE layers that contain `q_proj` and `v_proj` parameters. This is especially important for fine-tuning the `gpt-oss` model.
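As a rough illustration of what regex-based module ignoring boils down to (the `select_ignored` helper below is hypothetical, not an accelerate API): module names are matched against patterns, and matching modules are excluded from FSDP wrapping.

```python
import re

def select_ignored(module_names, patterns):
    """Return the module names matching any of the given regex patterns.

    Hypothetical sketch of regex-based ignored-module selection; accelerate's
    actual implementation may differ.
    """
    compiled = [re.compile(p) for p in patterns]
    return [name for name in module_names if any(rx.search(name) for rx in compiled)]

names = [
    "model.layers.0.self_attn.q_proj",
    "model.layers.0.self_attn.v_proj",
    "model.layers.0.mlp.up_proj",
]
# Ignore only the attention q/v projections, e.g. PEFT/LoRA target modules.
print(select_ignored(names, [r"\.(q|v)_proj$"]))
# → ['model.layers.0.self_attn.q_proj', 'model.layers.0.self_attn.v_proj']
```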
- ENH: Allow FSDP ignored modules to be regex by @BenjaminBossan in huggingface/accelerate#3698
- TST Add test for FSDP ignored_modules as str by @BenjaminBossan in huggingface/accelerate#3719

Minor improvements
- feature: CpuOffload pre_forward don't attempt to move if already on device by @JoeGaffney in huggingface/accelerate#3695
- Fix: Ensure environment variable values are case-insensitive in Accelerate by @jp1924 in huggingface/accelerate#3712
- remove use_ipex by @SunMarc in huggingface/accelerate#3721

New Contributors
... (truncated)
Commits
- 9a81156 Release: v1.11.0
- 5998f86 refactor: nit change for get_parameters_from_modules (code debt) (#3815)
- f0313a6 Fix tracking swanlab (#3810)
- df0c187 Bump to python3.10 + update linter (#3809)
- bc2478a fix (#3808)
- 057edec fix (skip) cache flush when original device is `cpu` and offloaded to disk `m...
- 1438331 Remove deprecated FindTiedParametersResult (#3786)
- a737437 Revert "fix: correct dictionary unpacking in recursively_apply function (#376...
- 6997855 rm mlflow (#3783)
- 401075f Add optional typing (#3769)
- Additional commits viewable in compare view
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)