robintibor
So with regard to giving the user some error/warning in case batch size < min batch size: should we do this? I think it is not super trivial, probably on...
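One way such a check could look, as a minimal sketch (the helper name and where it would hook into the training code are hypothetical, not from braindecode):

```python
import warnings


def check_batch_size(batch_size, min_batch_size):
    # Hypothetical helper: emit a warning (rather than an error) when the
    # requested batch size is below the minimum the setup supports.
    if batch_size < min_batch_size:
        warnings.warn(
            f"batch_size={batch_size} is below the minimum supported "
            f"batch size ({min_batch_size}); results may be unreliable.",
            UserWarning,
        )
```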
yes feel free to merge then @bruAristimunha
Hi, thanks for the answer, just had a brief test of your suggestion but could not make it work: https://colab.research.google.com/drive/1U9s-UCHac6rwJOh8i9_KaAkUffVvE7YF?usp=sharing

```python
import torch
from sam.sam import SAM
import numpy as...
```
but it would also be possible to change SAM optimizer code to work as you originally planned? to avoid needing a separate classifier class for SAM?
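For context, the two-step update that makes SAM awkward to fit into a standard classifier loop can be sketched in plain NumPy (this is an illustration of the algorithm's logic, not the PyTorch `SAM` class from the linked repo):

```python
import numpy as np


def sam_step(w, grad_fn, lr=0.1, rho=0.05):
    # Sharpness-Aware Minimization needs two gradient evaluations per step:
    # 1) gradient at the current weights, used only to find an ascent direction
    g = grad_fn(w)
    eps = rho * g / (np.linalg.norm(g) + 1e-12)
    # 2) gradient at the perturbed ("sharpness-probing") weights
    g_adv = grad_fn(w + eps)
    # descend from the ORIGINAL weights using the perturbed gradient
    return w - lr * g_adv
```

This double forward/backward pass is why a plain single-`step()` training loop does not work unmodified, and why either the optimizer or the training loop has to be adapted.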
Closing this for now as it is unfinished and nobody is currently motivated to work on it; it can still be reopened if somebody is interested.
Interesting to me could be:
* augmentations, especially channel dropout
* random erasing
* label smoothing
* mixup/cutmix
* EMA or SWA (https://github.com/braindecode/braindecode/issues/239)
also an EEG version of TrivialAugment could be interesting. That would mean selecting a subset of augmentations, creating a discrete set of strengths per augmentation and selecting a random augmentation...
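A minimal sketch of that idea, with made-up example augmentations and a made-up discrete strength grid (none of this is braindecode API; the function names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)


# Hypothetical EEG augmentations on (n_channels, n_times) arrays; `strength`
# indexes a small discrete grid, as in TrivialAugment.
def channel_dropout(x, strength):
    p = [0.1, 0.2, 0.3][strength]  # probability of zeroing each channel
    mask = rng.random(x.shape[0]) >= p
    return x * mask[:, None]


def gaussian_noise(x, strength):
    std = [0.01, 0.05, 0.1][strength]
    return x + rng.normal(0.0, std, x.shape)


def time_reverse(x, strength):
    return x[:, ::-1]


AUGMENTATIONS = [channel_dropout, gaussian_noise, time_reverse]


def trivial_augment_eeg(x):
    # TrivialAugment: per sample, pick ONE augmentation and ONE strength
    # uniformly at random and apply it; no search, no schedules.
    aug = AUGMENTATIONS[rng.integers(len(AUGMENTATIONS))]
    strength = int(rng.integers(3))
    return aug(x, strength)
```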
wow super interesting thanks @bruAristimunha
Yes you can do that @bruAristimunha. I would suggest keeping the old names as keyword arguments with a default value of None, plus a warning that they will be removed,...
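The suggested deprecation pattern could look like this (the function and parameter names are placeholders, not the actual braindecode signature):

```python
import warnings


def create_windows(offset_samples=0, *, offset=None):
    # Hypothetical rename: `offset` is the old name, kept as a keyword
    # argument defaulting to None so existing calls keep working.
    if offset is not None:
        warnings.warn(
            "`offset` is deprecated and will be removed in a future "
            "release; use `offset_samples` instead.",
            DeprecationWarning,
        )
        offset_samples = offset
    return offset_samples
```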
Closing this as it has diverged a bit too much from the current code; still very useful for addressing #544.