
Where is the implementation of `tuning_mode`?

Open 2catycm opened this issue 1 year ago • 2 comments

In `train.py`, the argument

parser.add_argument('--tuning-mode', default=None, type=str,
                    help='Method of fine-tuning (default: None)')

is later passed to the `create_model` method:

model = create_model(
        args.model,
        pretrained=args.pretrained,
        num_classes=args.num_classes,
        drop_rate=args.drop,
        drop_connect_rate=args.drop_connect,  # DEPRECATED, use drop_path
        drop_path_rate=args.drop_path,
        drop_block_rate=args.drop_block,
        global_pool=args.gp,
        bn_momentum=args.bn_momentum,
        bn_eps=args.bn_eps,
        scriptable=args.torchscript,
        checkpoint_path=args.initial_checkpoint,
        tuning_mode=args.tuning_mode)

but elsewhere I cannot find any other use of `args.tuning_mode`. However, `create_model` comes from timm:

from timm.models import create_model, safe_model_name, resume_checkpoint, load_checkpoint,\
    convert_splitbn_model, model_parameters

Did you implement SSF's logic inside the timm library?

2catycm avatar Aug 24 '24 08:08 2catycm

Now I have some idea: SSF's logic lives in the `models` folder, and timm passes the additional args into the model's `__init__` method, roughly as sketched below. But I am still confused about why timm calls the manually written model code instead of timm's own model code.
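For what it's worth, here is a simplified sketch (not the actual timm source; `create_model_sketch` is an illustrative stand-in, and the registry internals vary between timm versions) of how `create_model` forwards extra keyword arguments such as `tuning_mode` to whatever entrypoint is registered under the requested model name:

# Simplified illustration of timm's kwargs passthrough; not real timm code.
from timm.models.registry import model_entrypoint

def create_model_sketch(model_name, pretrained=False, **kwargs):
    # Look up whichever function is registered under `model_name`, whether
    # it comes from timm itself or from this repo's models/ package.
    create_fn = model_entrypoint(model_name)
    # Extra keyword arguments such as tuning_mode='ssf' are forwarded
    # unchanged to that entrypoint, which hands them to the model __init__.
    return create_fn(pretrained=pretrained, **kwargs)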

2catycm avatar Aug 24 '24 13:08 2catycm

I had the same question. I think it is because of the `@register_model` decorator in `models/vision_transformer.py` (see the sketch below). Refer to https://blog.csdn.net/weixin_47994925/article/details/129745845 and https://zhuanlan.zhihu.com/p/616239771.
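To make the mechanism concrete, here is a minimal, hypothetical example (the names `ToyViT` and `toy_vit_tiny` are placeholders, not the actual SSF entrypoints; the real logic in `models/vision_transformer.py` is much more involved):

# Hypothetical illustration of @register_model; not the SSF code itself.
import torch.nn as nn
from timm.models.registry import register_model

class ToyViT(nn.Module):
    def __init__(self, num_classes=1000, tuning_mode=None, **kwargs):
        super().__init__()
        # In the real SSF model, tuning_mode='ssf' would switch on the
        # scale/shift parameters inside the transformer blocks.
        self.tuning_mode = tuning_mode
        self.head = nn.Linear(768, num_classes)

@register_model
def toy_vit_tiny(pretrained=False, **kwargs):
    # The decorator adds this function to timm's model registry, so
    # create_model('toy_vit_tiny', tuning_mode='ssf') dispatches here
    # and forwards tuning_mode into the constructor above.
    return ToyViT(**kwargs)

Because `train.py` imports the local `models` package before calling `create_model`, these decorators have already run, so a name registered there resolves to the repo's implementation rather than to a timm built-in (if the name collides with an existing timm model, the local registration replaces it, possibly with an overwrite warning depending on the timm version).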

SydCS avatar Sep 06 '24 12:09 SydCS