
This is an official implementation for "Self-Supervised Learning with Swin Transformers".

17 Transformer-SSL issues, sorted by most recently updated

When I used moby_main for training, Linux memory grew until it crashed. What is the reason, and how can I solve it? The error is: Traceback (most recent call last): File...

Hi there, I've already trained my Swin Transformer with your proposed SSL method and have the checkpoints saved. I'm now trying to load my model as the backbone of a...
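Loading a self-supervised checkpoint into a downstream backbone is a recurring question, so here is a minimal sketch. The checkpoint layout assumed below (weights nested under a `model` key, with the backbone weights prefixed `encoder.`) is an assumption about the MoBY checkpoint format, not confirmed from the repository:

```python
def extract_backbone_state(state_dict, prefix="encoder."):
    """Keep only the backbone weights and strip the prefix.

    The 'encoder.' prefix is an assumption about how MoBY names its
    online-encoder weights; adjust it to match your checkpoint's keys.
    """
    return {k[len(prefix):]: v for k, v in state_dict.items()
            if k.startswith(prefix)}


# Hypothetical usage (paths and key layout are placeholders):
# ckpt = torch.load("moby_swin_tiny.pth", map_location="cpu")
# state = ckpt.get("model", ckpt)   # some checkpoints nest weights under 'model'
# backbone.load_state_dict(extract_backbone_state(state), strict=False)
```

Using `strict=False` lets loading succeed even though the projector/predictor heads and the momentum encoder have no counterpart in the downstream backbone.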

Hi there, Congrats on the nice work and thanks for providing the code. I have a question about the experiments you conducted on downstream tasks (detection and segmentation). For the...

> from timm.models import vit_deit_small_patch16_224
> ImportError: cannot import name 'vit_deit_small_patch16_224' from 'timm.models' (/home/michuan.lh/miniconda3/envs/moby/lib/python3.7/site-packages/timm/models/__init__.py)

Thanks for your work. When I run your code, I get an error saying that 'vit_deit_small_patch16_224' cannot be imported.

In models/build.py:41, the keyword passed to partial(SwinTransformer, ...) is norm_befor_mlp, while the keyword in SwinTransformer (models/swin_transformer.py:497) is norm_before_mlp. The former is missing the letter 'e' compared with the latter. Therefore, the 'bn'...
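To illustrate why the one-letter misspelling matters, here is a minimal stand-in (hypothetical, not the real SwinTransformer signature): if the constructor accepts a `**kwargs` catch-all, the misspelled `norm_befor_mlp` is swallowed silently and the default stays in effect, so the 'bn' setting never takes hold (without `**kwargs` it would instead raise a TypeError):

```python
from functools import partial


class TinySwin:
    """Minimal stand-in for SwinTransformer (hypothetical, not the real class)."""

    def __init__(self, norm_before_mlp="ln", **kwargs):
        # The **kwargs catch-all means a misspelled keyword is swallowed
        # silently instead of raising a TypeError.
        self.norm_before_mlp = norm_before_mlp


# Passing the misspelled keyword, as build.py does, leaves the default "ln":
broken = partial(TinySwin, norm_befor_mlp="bn")()
# The one-letter fix actually switches the norm to "bn":
fixed = partial(TinySwin, norm_before_mlp="bn")()
```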

Thanks for your work! As shown in the markdown file, we can pretrain Transformer-SSL with 8 GPUs on a single node. Do you have scripts for multi-machine training? I want...
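A multi-node run can usually be assembled from the single-node command by adding the standard torch.distributed.launch rendezvous flags. The script name, config path, and data path below are placeholders inferred from the single-node instructions, not an official recipe:

```shell
# Run on node 0 (the rendezvous master); repeat on node 1 with --node_rank 1.
# <node0-ip> is the reachable address of node 0; ports and paths are placeholders.
python -m torch.distributed.launch \
    --nproc_per_node 8 \
    --nnodes 2 \
    --node_rank 0 \
    --master_addr <node0-ip> \
    --master_port 29500 \
    moby_main.py \
    --cfg configs/moby_swin_tiny.yaml \
    --data-path /path/to/imagenet
```

With 2 nodes of 8 GPUs this yields a world size of 16; learning rate and batch size may need rescaling accordingly.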