
Multiple Improvements for mmengine

Open MGAMZ opened this issue 10 months ago • 1 comments

Motivation

While using the mmengine framework extensively, I fixed several subtle issues, hoping to make the project more compatible with the latest PyTorch releases.

Modification

Honor the `disable` parameter in the compile configuration

The current PyTorch compile configuration (PyTorch Compile Doc) includes a `disable` parameter. mmengine's compile implementation does not check it: as long as a compile dict is set, the model is always compiled, even when the user explicitly sets `disable=True`.
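A minimal sketch of the intended guard, assuming a hypothetical helper name `should_compile`; the `disable` key mirrors `torch.compile`'s own keyword, which turns the call into a no-op when True:

```python
# Hypothetical helper sketch: decide whether torch.compile should be
# invoked, based on the user's compile config. The `disable` key
# mirrors torch.compile's own keyword of the same name.
def should_compile(compile_cfg):
    """Return True only when compilation is requested and not disabled."""
    if not compile_cfg:  # None, False, or empty dict: nothing to compile
        return False
    if isinstance(compile_cfg, dict) and compile_cfg.get('disable', False):
        return False  # user explicitly opted out of compilation
    return True
```

With such a guard, a config like `compile=dict(disable=True)` would leave the model uncompiled instead of unconditionally invoking `torch.compile`.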

Fix an optimizer state loading bug when using FSDP

The torch.distributed.fsdp.fully_sharded_data_parallel.FullyShardedDataParallel.optim_state_dict_to_load method requires the following parameters:

  • model
  • optim
  • optim_state_dict
  • is_named_optimizer
  • load_directly
  • group

mmengine's FSDP strategy calls this method with incorrectly ordered arguments, which raises an error.
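One way to make such a call robust is to bind the arguments by keyword, so it survives any positional reordering across PyTorch releases. The sketch below uses a stand-in stub with the parameter list above, since the real method needs a live distributed FSDP setup; `optim_state_dict_to_load_stub` is hypothetical:

```python
# Hypothetical stand-in for
# FullyShardedDataParallel.optim_state_dict_to_load, reproducing only
# its parameter list. A real FSDP model would reshard the state here;
# the stub just echoes what it received so argument binding is visible.
def optim_state_dict_to_load_stub(model, optim, optim_state_dict,
                                  is_named_optimizer=False,
                                  load_directly=False, group=None):
    return dict(model=model, optim=optim, state=optim_state_dict)

# Keyword binding: unambiguous even if the positional order changes.
loaded = optim_state_dict_to_load_stub(
    model='fsdp_model',
    optim='optimizer',
    optim_state_dict={'step': 1},
)
```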

Update GradScaler to align with the latest PyTorch version

`from torch.cuda.amp import GradScaler` now raises a PyTorch deprecation warning; this import path will be removed in a future release.
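A version-safe import sketch, assuming the device-agnostic `torch.amp.GradScaler` is available on recent PyTorch; the fallback keeps older builds working, and the snippet degrades to `None` when torch is absent entirely:

```python
# Hypothetical sketch of a version-safe import: prefer the new
# device-agnostic torch.amp.GradScaler, fall back to the deprecated
# CUDA-specific location, and degrade to None without torch installed.
try:
    from torch.amp import GradScaler  # new, device-agnostic location
except ImportError:
    try:
        from torch.cuda.amp import GradScaler  # deprecated legacy path
    except ImportError:
        GradScaler = None  # torch unavailable in this environment
```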

Update Adafactor to align with the latest PyTorch version

The Adafactor optimizer from transformers is now implemented in PyTorch itself, so it no longer needs to be registered via `OPTIMIZERS.register_module`.
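A sketch of a feature check that could gate the registration; the helper name is hypothetical, and the check assumes `torch.optim.Adafactor` exists on recent PyTorch releases:

```python
# Hypothetical sketch: detect whether torch ships its own Adafactor,
# so the transformers version is registered only as a fallback.
import importlib


def has_native_adafactor():
    """True when torch.optim provides its own Adafactor class."""
    try:
        torch_optim = importlib.import_module('torch.optim')
    except ImportError:
        return False  # torch not installed
    return hasattr(torch_optim, 'Adafactor')
```

mmengine would then register the transformers implementation only when this returns False.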

Add Pure-Python style config for OptimWrapperConstructor

The current mmengine does not support the Pure-Python style config for OptimWrapperConstructor.
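A config fragment sketching what this change enables, assuming mmengine's public `OptimWrapper` and `DefaultOptimWrapperConstructor` classes; in the Pure-Python style, the constructor is referenced by class object rather than by registry string:

```python
# Sketch of a Pure-Python style optim_wrapper config: class objects
# replace registry strings throughout (an assumption of this example).
from torch.optim import AdamW

from mmengine.optim import DefaultOptimWrapperConstructor, OptimWrapper

optim_wrapper = dict(
    type=OptimWrapper,
    optimizer=dict(type=AdamW, lr=1e-4, weight_decay=0.05),
    # Class object instead of the string 'DefaultOptimWrapperConstructor'
    constructor=DefaultOptimWrapperConstructor,
)
```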

Update `torch.load` to align with the latest PyTorch version

`torch.load` will require the `weights_only` parameter to be set explicitly in the future; omitting it currently raises warnings.
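A sketch of the warning-free call, passing `weights_only` explicitly so behaviour does not change when PyTorch flips the default; the torch import is guarded so the snippet degrades gracefully without PyTorch installed:

```python
# Hypothetical sketch: save a tiny checkpoint in memory and load it
# back with an explicit weights_only=True, which restricts unpickling
# to tensors and primitive containers and silences the FutureWarning.
import io

try:
    import torch
except ImportError:
    torch = None  # torch unavailable in this environment

if torch is not None:
    buf = io.BytesIO()
    torch.save({'w': torch.ones(2)}, buf)  # in-memory checkpoint
    buf.seek(0)
    state = torch.load(buf, weights_only=True)  # explicit, warning-free
```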

Add Pure-Python style config for model_wrapper

The current mmengine does not support the Pure-Python style config for model_wrapper.
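A config fragment sketching the intended usage, assuming mmengine's `MMDistributedDataParallel` wrapper class:

```python
# Sketch of a Pure-Python style model_wrapper config: the wrapper is
# referenced by class object (an assumption of this example).
from mmengine.model import MMDistributedDataParallel

model_wrapper_cfg = dict(
    type=MMDistributedDataParallel,  # not the string 'MMDistributedDataParallel'
    find_unused_parameters=True,
)
```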

Improve the warning information in Visualization

The change is minor; it just adds more hints to the warning messages.

MGAMZ avatar Jan 17 '25 16:01 MGAMZ

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you all sign our Contributor License Agreement before we can accept your contribution.
1 out of 2 committers have signed the CLA.

:white_check_mark: MGAMZ
:x: 张贻钦


张贻钦 seems not to be a GitHub user. You need a GitHub account to be able to sign the CLA. If you already have a GitHub account, please add the email address used for this commit to your account.
You have signed the CLA already but the status is still pending? Let us recheck it.

CLAassistant avatar Jan 17 '25 16:01 CLAassistant

This PR is replaced by https://github.com/open-mmlab/mmengine/pull/1665

MGAMZ avatar Oct 17 '25 08:10 MGAMZ