
[NeurIPS'22] This is an official implementation of "Scaling & Shifting Your Features: A New Baseline for Efficient Model Tuning".

8 SSF issues

Hi, I am trying your SSF method by inserting it in the ResBlocks of my network after the norm and conv operations. I use a pretrained model, freeze its weights...
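A minimal sketch of that setup, assuming the per-channel scale-and-shift formulation from the paper; the module and layer names below are hypothetical and not taken from this repo:

```python
import torch
import torch.nn as nn

class ScaleShift(nn.Module):
    """Per-channel scale (gamma) and shift (beta) applied to frozen features."""
    def __init__(self, dim):
        super().__init__()
        self.gamma = nn.Parameter(torch.ones(dim))
        self.beta = nn.Parameter(torch.zeros(dim))

    def forward(self, x):  # x: (B, C, H, W)
        return x * self.gamma.view(1, -1, 1, 1) + self.beta.view(1, -1, 1, 1)

class SSFResBlock(nn.Module):
    """Pretrained conv/norm ops (frozen) followed by trainable scale-and-shift layers."""
    def __init__(self, dim):
        super().__init__()
        self.conv1 = nn.Conv2d(dim, dim, 3, padding=1)
        self.norm1 = nn.BatchNorm2d(dim)
        self.ssf1 = ScaleShift(dim)
        self.conv2 = nn.Conv2d(dim, dim, 3, padding=1)
        self.norm2 = nn.BatchNorm2d(dim)
        self.ssf2 = ScaleShift(dim)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.act(self.ssf1(self.norm1(self.conv1(x))))
        out = self.ssf2(self.norm2(self.conv2(out)))
        return self.act(out + x)

# Freeze the pretrained weights; train only the scale/shift parameters
# (and, typically, the classifier head).
model = nn.Sequential(SSFResBlock(64), SSFResBlock(64))
for name, p in model.named_parameters():
    p.requires_grad = ('ssf' in name)
```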

Thank you for your excellent work! I can't find the re-parameterization code. Could you please tell me where it is or share it with us? Thanks a lot.
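For reference, a hedged sketch of what that re-parameterization could look like: since the scale/shift is a linear operation, it can be folded into the preceding linear layer's weight and bias at inference time. The helper name `fold_ssf_into_linear` is hypothetical, not a function from this repo:

```python
import torch
import torch.nn as nn

@torch.no_grad()
def fold_ssf_into_linear(linear: nn.Linear, gamma: torch.Tensor, beta: torch.Tensor) -> nn.Linear:
    """Fold a per-channel scale/shift that follows `linear` into its weight and bias:
        gamma * (W x + b) + beta == (gamma[:, None] * W) x + (gamma * b + beta)
    so no extra parameters or FLOPs remain at inference time."""
    fused = nn.Linear(linear.in_features, linear.out_features, bias=True)
    fused.weight.copy_(gamma.unsqueeze(1) * linear.weight)
    bias = linear.bias if linear.bias is not None else torch.zeros(linear.out_features)
    fused.bias.copy_(gamma * bias + beta)
    return fused

# Sanity check: the fused layer matches scale/shift applied after the original layer.
lin = nn.Linear(8, 16)
gamma, beta = torch.randn(16), torch.randn(16)
x = torch.randn(4, 8)
fused = fold_ssf_into_linear(lin, gamma, beta)
assert torch.allclose(gamma * lin(x) + beta, fused(x), atol=1e-6)
```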

Hello. Congratulations on your great paper! I'm having some trouble downloading the NABirds dataset. Can you release this dataset? Looking forward to your reply.

Hello, congratulations! Thank you for releasing VTAB-1K; it helps me a lot. I tried to experiment on CIFAR-100, but found that the results of SSF and VPT were far worse...

Hi, I tried to reproduce the results for VTAB given in the paper, but I got the following results: they are quite different from the given ones. And here is my...

While reading the paper, I don't quite understand the meaning of re-parameterization. From the code, SSF-ADA is added to different modules. I don't understand the difference and...

Thank you so much for sharing the code. I tried running train_scripts/vit/vtab/dtd/train_ssf.sh and found that the result could not match the number in the paper. At the end of the...

In train.py the argument

```python
parser.add_argument('--tuning-mode', default=None, type=str,
                    help='Method of fine-tuning (default: None')
```

is later passed to the `create_model` method

```python
model = create_model(
    args.model,
    pretrained=args.pretrained,
    num_classes=args.num_classes,
    drop_rate=args.drop,
    drop_connect_rate=args.drop_connect,
    # ...
```
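A hypothetical sketch of how such a tuning-mode flag is typically consumed to decide which parameters stay trainable; the parameter names below are assumptions, not necessarily the ones used in this repo:

```python
def apply_tuning_mode(model, tuning_mode):
    """Set requires_grad on model parameters according to the chosen tuning mode."""
    if tuning_mode == 'ssf':
        # Train only the SSF scale/shift parameters and the classifier head.
        for name, p in model.named_parameters():
            p.requires_grad = ('ssf_scale' in name or 'ssf_shift' in name or 'head' in name)
    elif tuning_mode == 'linear_probe':
        # Train only the classifier head.
        for name, p in model.named_parameters():
            p.requires_grad = name.startswith('head')
    else:
        # None -> full fine-tuning.
        for p in model.parameters():
            p.requires_grad = True
```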