Raymond
You can follow the instructions at this link: https://github.com/dragen1860/MAML-Pytorch. Then modify miniimagenet.py line 53 to: names = [f for f in os.listdir(self.dir_path) if f.endswith('.jpg')]
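A minimal sketch of what that one-line change does, assuming the surrounding loader just needs the list of image file names (the function name here is hypothetical; in the repo this is an inline expression inside miniimagenet.py):

```python
import os

def list_image_names(dir_path):
    # Keep only .jpg files so stray entries in the directory
    # (e.g. hidden files like .DS_Store) don't end up in the file list.
    return [f for f in os.listdir(dir_path) if f.endswith('.jpg')]
```

The filter matters because os.listdir returns every entry in the directory, including non-image files, which would otherwise be passed to the image loader.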
What do you mean by Swin-based CycleMLP? The Swin Transformer already has attention blocks. Do you mean replacing the MLP with CycleMLP?
I think this is a good point.
Yes, I think it is a "partially quantized ViT".
I got the same problem; it just gets stuck in the while loop.
Have you fixed it yet?
Thank you, I'll try it.
Yes, this causes an infinite loop, and I think we should manually initialize that value randomly.
Does the random seed get changed in the swin_base model?
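If the concern is reproducibility, one way to check is to pin all the seeds yourself before building the model and see whether results still vary. A hedged sketch (the set_seed helper is hypothetical, and the torch/numpy lines are shown as comments since the thread doesn't confirm which RNGs the model touches):

```python
import random

def set_seed(seed: int = 42):
    # Pin Python's RNG; if numpy / torch are in use, seed them too, e.g.:
    #   np.random.seed(seed)
    #   torch.manual_seed(seed)
    random.seed(seed)
```

If two runs with the same pinned seed still differ, something inside the model is reseeding or using a non-deterministic op.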