
Fix initialization of a pretrained backbone

bvantuan opened this pull request 7 months ago • 3 comments

What does this PR do?

Fixes #38061

When use_pretrained_backbone=True, the backbone is loaded (and therefore initialized) only once, which sets _is_hf_initialized=True. In that case _initialize_weights should do nothing for the backbone, so this PR removes the redundant `and backbone_checkpoint is None` condition.
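For context, a simplified sketch of the guard this relies on (based on PreTrainedModel._initialize_weights in modeling_utils.py; the actual method, including the condition removed here, may contain additional logic):

```python
# Simplified sketch of the _is_hf_initialized guard (not an exact copy of the
# current modeling_utils.py code).
def _initialize_weights(self, module):
    if getattr(module, "_is_hf_initialized", False):
        # Already initialized, e.g. a backbone whose pretrained weights were
        # loaded by load_backbone -- leave it untouched.
        return
    self._init_weights(module)
    module._is_hf_initialized = True
```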

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. @Rocketknight1 @qubvel @NielsRogge

bvantuan avatar Jun 01 '25 07:06 bvantuan

Update: The backbone is initialized twice due to this code.

https://github.com/huggingface/transformers/blob/51d732709e5ae424e8fb6c4e58b72057a3e413c2/src/transformers/models/mask2former/modeling_mask2former.py#L2131-L2136

Since the encoder is already initialized by the load_backbone function, it is sufficient for the Mask2FormerPixelLevelModule branch of _init_weights to iterate only over the decoder's modules. I also added a test to verify the initialization of the pretrained backbone; a sketch of that kind of check is below. WDYT? @Rocketknight1 @qubvel @NielsRogge
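For reviewers, a rough sketch of what such a verification could look like. The backbone checkpoint, config arguments, and attribute paths below are illustrative assumptions, not necessarily what the PR's test uses:

```python
import torch

from transformers import Mask2FormerConfig, Mask2FormerModel
from transformers.utils.backbone_utils import load_backbone

# Build a config that requests a pretrained backbone; "microsoft/resnet-50" is
# just an illustrative checkpoint.
config = Mask2FormerConfig(
    backbone="microsoft/resnet-50",
    backbone_config=None,
    use_pretrained_backbone=True,
    use_timm_backbone=False,
)

# Reference weights loaded directly from the checkpoint.
reference_backbone = load_backbone(config)

# With the fix, constructing the model must not re-initialize the encoder;
# only the pixel decoder's modules should go through _init_weights.
model = Mask2FormerModel(config)

for (name, param), (_, ref_param) in zip(
    model.pixel_level_module.encoder.named_parameters(),
    reference_backbone.named_parameters(),
):
    assert torch.allclose(param, ref_param), f"backbone parameter {name} was re-initialized"
```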

bvantuan avatar Jun 03 '25 03:06 bvantuan

Hi @bvantuan, thanks a lot for the fix.

@Cyrilvallez is the expert on initialization (and many other things).

ydshieh avatar Jun 03 '25 09:06 ydshieh

Cc @Cyrilvallez! Could you please review when convenient?

bvantuan avatar Jun 11 '25 12:06 bvantuan