
[Feature] Support STARLoss (CVPR'2023)

Open chg0901 opened this issue 2 years ago • 1 comment

Motivation

Support STAR loss in MMPose.

Referred repo: https://github.com/ZhenglinZhou/STAR/blob/master/lib/loss/starLoss_v2.py

  • Train and test on the WFLW dataset to reproduce the accuracy reported in the paper
  • Train models on other datasets

Modification

Add files

  • configs/face_2d_keypoint/topdown_regression/wflw/resnet_starloss_wflw.md
  • configs/face_2d_keypoint/topdown_regression/wflw/resnet_starloss_wflw.yml
  • configs/face_2d_keypoint/topdown_regression/wflw/td-reg_res50_starloss_8xb64-210e_wflw-256x256.py
  • mmpose/models/heads/heatmap_heads/star_head.py

Modify existing files

  • mmpose/models/heads/heatmap_heads/__init__.py
  • mmpose/models/losses/__init__.py
  • mmpose/models/losses/regression_loss.py (adds STARLoss; a simplified sketch of the loss idea follows this list)
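For reference, the block below is a minimal, simplified sketch of the STAR loss idea; it is not the upstream starLoss_v2.py and not the code added in this PR. The keypoint error is projected onto the principal axes of the predicted heat distribution (eigendecomposition of its 2x2 covariance), scaled by the spread along each axis, and regularized by the eigenvalues. The forward signature (heatmaps plus predicted and ground-truth coordinates) and the registration style are assumptions following MMPose 1.x conventions.

# A minimal, simplified sketch of the STAR loss idea (assumed interface,
# not the exact implementation in this PR or in the upstream repo).
import torch
import torch.nn as nn

from mmpose.registry import MODELS


@MODELS.register_module()
class STARLoss(nn.Module):
    """Self-adapTive Ambiguity Reduction (STAR) loss, simplified sketch.

    Args:
        omega (float): weight of the eigenvalue regularization term
        use_target_weight (bool): whether to weight keypoints individually
    """

    def __init__(self, omega=1.0, use_target_weight=False):
        super().__init__()
        self.omega = omega
        self.use_target_weight = use_target_weight

    def forward(self, heatmaps, pred_coords, gt_coords, target_weight=None):
        # heatmaps:      (B, K, H, W) non-negative predicted heatmaps
        # pred_coords:   (B, K, 2) decoded coordinates in heatmap pixels
        # gt_coords:     (B, K, 2) ground-truth coordinates in heatmap pixels
        # target_weight: (B, K) per-keypoint weights
        B, K, H, W = heatmaps.shape
        ys = torch.arange(H, device=heatmaps.device, dtype=heatmaps.dtype)
        xs = torch.arange(W, device=heatmaps.device, dtype=heatmaps.dtype)
        grid_y, grid_x = torch.meshgrid(ys, xs, indexing='ij')
        grid = torch.stack([grid_x, grid_y], dim=-1).reshape(-1, 2)  # (HW, 2)

        # normalize each heatmap into a discrete 2D distribution
        probs = heatmaps.reshape(B, K, -1)
        probs = probs / probs.sum(dim=-1, keepdim=True).clamp(min=1e-6)

        # mean and 2x2 covariance of every predicted heat distribution
        mean = probs @ grid                                  # (B, K, 2)
        diff = grid[None, None] - mean.unsqueeze(2)          # (B, K, HW, 2)
        cov = torch.einsum('bkn,bkni,bknj->bkij', probs, diff, diff)

        # principal axes (eigenvectors) and spreads (eigenvalues)
        evals, evecs = torch.linalg.eigh(cov)                # (B,K,2), (B,K,2,2)

        # project the coordinate error onto the principal axes and scale each
        # component by the inverse std-dev along that axis, so the ambiguous
        # (high-variance) direction is down-weighted
        err = pred_coords - gt_coords                        # (B, K, 2)
        proj = torch.einsum('bkji,bkj->bki', evecs, err)     # v_i . err
        loss = (proj.abs() / evals.clamp(min=1e-6).sqrt()).sum(dim=-1)

        # eigenvalue regularization discourages degenerate distributions
        loss = loss + self.omega * evals.sum(dim=-1)

        if self.use_target_weight and target_weight is not None:
            loss = loss * target_weight

        return loss.mean()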

Use cases (Optional)

When using STAR loss, both model.neck and model.head need to be changed: the neck is replaced with FeatureMapProcessor so the head receives feature maps rather than pooled features, and the head uses 'STARHead' with 'STARLoss':

# model settings
model = dict(
    # replace GlobalAveragePooling with FeatureMapProcessor
    # to obtain heatmap output
    # neck=dict(type='GlobalAveragePooling'),
    neck=dict(
        type='FeatureMapProcessor',
        concat=True,
    ),
    # using STARLoss with STARHead
    head=dict(
        type='STARHead',
        in_channels=2048,
        out_channels=98,
        deconv_out_channels=None,
        loss=dict(type='STARLoss', use_target_weight=True),
        decoder=codec_star),
    train_cfg=dict(),
    test_cfg=dict(
        flip_test=True,
        shift_coords=True,
    ))
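The snippet above refers to codec_star, which is defined elsewhere in td-reg_res50_starloss_8xb64-210e_wflw-256x256.py. Assuming STARHead decodes coordinates from heatmaps in the same way as MMPose's integral-regression heads, a plausible codec definition would look like the following (the codec type and parameters here are illustrative assumptions; the PR config may differ):

# Assumed codec definition for illustration only; not taken from the PR
codec_star = dict(
    type='IntegralRegressionLabel',
    input_size=(256, 256),
    heatmap_size=(64, 64),
    sigma=2.0,
    normalize=True)

The 256x256 input size matches the config file name, and out_channels=98 in the head corresponds to the 98 WFLW landmarks.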

Current process

  • The main algorithm and data pipeline have been adapted to the WFLW dataset and the STAR loss algorithm

Problems to check and solve

  • Training does not converge; help from the main MMPose contributors is needed

Checklist

Before PR:

  • [x] I have read and followed the workflow indicated in the CONTRIBUTING.md to create this PR.
  • [x] Pre-commit or linting tools indicated in CONTRIBUTING.md are used to fix the potential lint issues.
  • [ ] Bug fixes are covered by unit tests, the case that causes the bug should be added in the unit tests.
  • [ ] New functionalities are covered by complete unit tests. If not, please add more unit tests to ensure correctness.
  • [ ] The documentation has been modified accordingly, including docstring or example tutorials.

After PR:

  • [x] CLA has been signed and all committers have signed the CLA in this PR.

chg0901 · Aug 22 '23 14:08

CLA assistant check
All committers have signed the CLA.

CLAassistant · Aug 22 '23 14:08