
Update torch requirement from ~=1.13.1 to ~=2.0.1

Open · dependabot[bot] opened this pull request 1 year ago • 5 comments

You can trigger a rebase of this PR by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

> **Note**
> Automatic rebases have been disabled on this pull request as it has been open for over 30 days.

dependabot[bot] · May 09 '23 16:05

We should also upgrade to the latest template: https://github.com/ashleve/lightning-hydra-template/compare/v1.4.0...v2.0.2

Notable changes

  • pytorch_lightning -> lightning.pytorch
  • use lightning.fabric for TPU
  • upgrade to hydra 1.3
  • upgrade the tools in pre-commit
  • rename config datamodule -> data
  • add a cpu trainer config
  • move src.tasks to src
  • stop exporting an extra log after a task exception
  • add aim as a logger
  • split src.utils.utils into several .py files
  • update .gitignore for the aim logger

mzweilin · Jun 02 '23 19:06

Changes in pytorch-lightning ~= 1.6.5 -> lightning ~= 2.0.2 (a rough migration sketch follows the list):

  • import pytorch_lightning as pl -> from lightning import pytorch as pl
  • LightningModule: training_epoch_end(self, outputs) -> on_train_epoch_end(self)
  • LightningModule: remove training_step_end()
  • LightningModule: change the arguments of configure_gradient_clipping()
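
Below is a rough sketch of that migration for a toy LightningModule. The module, tensor shapes, and logged names are illustrative rather than taken from MART, and the configure_gradient_clipping() signature shown (with optimizer_idx dropped) is my reading of the 2.0 API, so it is worth double-checking against the release notes.

```python
# Illustrative lightning ~= 2.0 migration sketch; this is a toy module, not code from this repo.
import torch
from lightning import pytorch as pl  # was: import pytorch_lightning as pl


class ToyClassifier(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.model = torch.nn.Linear(32, 2)
        # on_train_epoch_end() no longer receives `outputs`, so collect them manually.
        self.training_step_outputs = []

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = torch.nn.functional.cross_entropy(self.model(x), y)
        self.training_step_outputs.append(loss.detach())
        return loss

    # was: def training_epoch_end(self, outputs): ...
    def on_train_epoch_end(self):
        epoch_loss = torch.stack(self.training_step_outputs).mean()
        self.log("train/epoch_loss", epoch_loss)
        self.training_step_outputs.clear()

    # Assumption: in 2.0 the optimizer_idx argument is gone from this hook.
    def configure_gradient_clipping(self, optimizer, gradient_clip_val=None, gradient_clip_algorithm=None):
        self.clip_gradients(optimizer, gradient_clip_val=gradient_clip_val,
                            gradient_clip_algorithm=gradient_clip_algorithm)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)
```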

mzweilin · Jun 05 '23 16:06

lightning 2.0.2 depends on torchmetrics<2.0 and >=0.7.0.

However, we want to keep torchmetrics == 0.6.0 because mAP is super slow in later versions.

I hope torchmetrics will change the mAP backend in an upcoming release.

Changes in torchmetrics == 0.6.0 -> torchmetrics == 0.11.4 (a usage sketch follows the list):

  • Accuracy now requires num_classes as an argument
  • torchmetrics.detection.MAP -> torchmetrics.detection.mean_ap.MeanAveragePrecision
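
A small sketch of the corresponding 0.11.x usage, with made-up tensors; the task="multiclass" argument to Accuracy is my understanding of the 0.11 API rather than something stated above.

```python
# Illustrative torchmetrics 0.11.x usage; the values are toy data, not from MART.
import torch
from torchmetrics import Accuracy
# was: from torchmetrics.detection import MAP
from torchmetrics.detection.mean_ap import MeanAveragePrecision

# Accuracy now needs the number of classes (and, in 0.11, a task) up front.
accuracy = Accuracy(task="multiclass", num_classes=3)
print(accuracy(torch.tensor([0, 2, 1]), torch.tensor([0, 1, 1])))

map_metric = MeanAveragePrecision()
preds = [{
    "boxes": torch.tensor([[0.0, 0.0, 10.0, 10.0]]),
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([0]),
}]
targets = [{
    "boxes": torch.tensor([[0.0, 0.0, 10.0, 10.0]]),
    "labels": torch.tensor([0]),
}]
map_metric.update(preds, targets)
print(map_metric.compute()["map"])
```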

mzweilin · Jun 05 '23 17:06

We pinned torchmetrics == 0.6.0 because MeanAveragePrecision is super slow in newer versions. It looks like they're finally going to revert to the original implementation that uses the COCO API: https://github.com/Lightning-AI/torchmetrics/issues/1024.

dxoigmn · Jun 05 '23 17:06

Should this be closed @mzweilin?

dxoigmn · Jul 14 '23 19:07