MLServer
build(deps-dev): bump pytorch-lightning from 2.3.3 to 2.4.0 in /runtimes/mlflow
Bumps pytorch-lightning from 2.3.3 to 2.4.0.
Release notes
Sourced from pytorch-lightning's releases.
Lightning v2.4
Lightning AI :zap: is excited to announce the release of Lightning 2.4. This is mainly a compatibility upgrade for PyTorch 2.4 and Python 3.12, with a sprinkle of a few features and bug fixes.
Did you know? The Lightning philosophy extends beyond a boilerplate-free deep learning framework: We've been hard at work bringing you Lightning Studio. Code together, prototype, train, deploy, host AI web apps. All from your browser, with zero setup.
Changes
PyTorch Lightning
- Made saving non-distributed checkpoints fully atomic (#20011)
- Added `dump_stats` flag to `AdvancedProfiler` (#19703)
- Added a flag `verbose` to the `seed_everything()` function (#20108)
- Added support for PyTorch 2.4 (#20010)
- Added support for Python 3.12 (#20078)
- The `TQDMProgressBar` now provides an option to retain prior training epoch bars (#19578)
- Added the count of modules in train and eval mode to the printed `ModelSummary` table (#20159)
- Triggering KeyboardInterrupt (Ctrl+C) during `.fit()`, `.evaluate()`, `.test()` or `.predict()` now terminates all processes launched by the Trainer and exits the program (#19976)
- Changed the implementation of how seeds are chosen for dataloader workers when using `seed_everything(..., workers=True)` (#20055)
- NumPy is no longer a required dependency (#20090)
- Avoid LightningCLI saving hyperparameters with `class_path` and `init_args` since this would be a breaking change (#20068)
- Fixed an issue that would cause too many printouts of the seed info when using `seed_everything()` (#20108)
- Fixed `_LoggerConnector`'s `_ResultMetric` to move all registered keys to the device of the logged value if needed (#19814)
- Fixed `_optimizer_to_device` logic for special 'step' key in optimizer state causing performance regression (#20019)
- Fixed parameter counts in `ModelSummary` when model has distributed parameters (DTensor) (#20163)

(A usage sketch of the new `seed_everything()`, `AdvancedProfiler`, and `TQDMProgressBar` options appears after the truncated notes below.)

Lightning Fabric
... (truncated)
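As a quick illustration of the flags called out in the PyTorch Lightning notes above, here is a minimal sketch; it is not code from this PR. The `verbose` and `dump_stats` names come straight from the release notes, while `leave` is the tqdm-style name that #19578 appears to use for retaining epoch bars, so treat that one, and the illustrative `Trainer` wiring, as assumptions.

```python
# Minimal sketch of the 2.4 additions noted above; not part of this PR.
from lightning.pytorch import Trainer, seed_everything
from lightning.pytorch.callbacks import TQDMProgressBar
from lightning.pytorch.profilers import AdvancedProfiler

# verbose=False suppresses the (previously repeated) seed printout (#20108);
# workers=True also seeds dataloader workers (#20055 changed how those seeds
# are derived).
seed_everything(42, workers=True, verbose=False)

# dump_stats=True additionally writes the raw profiler stats to disk (#19703).
profiler = AdvancedProfiler(dirpath=".", filename="perf", dump_stats=True)

# leave=True (assumed name, per #19578) retains each finished epoch's bar.
trainer = Trainer(
    profiler=profiler,
    callbacks=[TQDMProgressBar(leave=True)],
    max_epochs=2,
)
```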
Commits
- `2129fdf` fix(ci): resolve input str -> num conversion (#20169)
- `cf24a19` fix(docs): remove dead link from readme (#20170)
- `a3e60ad` ci/docs: disable optional cache pkg (#20168)
- `87ffd8c` ci: fix cleaning caches (#20167)
- `b3ee85d` Prepare Lightning 2.4.0 release (#20154)
- `631911c` Add special logic for 'step' in `_optimizer_to_device` (#20019)
- `345450b` Fix parameter count in ModelSummary when parameters are DTensors (#20163)
- `3de60f4` docs: fix typo in `linkcheck_ignore` (#20164)
- `e9d4ef8` Add diffusion example to README (#20161)
- `d4de8e2` Count number of modules in train/eval mode in ModelSummary (#20159)
- Additional commits viewable in compare view
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)