
No ReLU Mask Code in your MAT Module.


In your paper you describe the utility of the ReLU Mask: in short, the ReLU Mask, which requires no learnable parameters, performs better than DynaST's learnable MLP. That part should be implemented in models/networks/dynast_transformer.py, but as far as I can tell the code there is unchanged from the original DynaST code. Is that correct? If so, should I simply apply the ReLU function to the output in place of the corresponding MLP code?
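For reference, this is a rough sketch of what I mean, assuming the mask is computed from the raw attention/similarity scores. The names (`attn_scores`, `MLPMask`, the tensor shapes) are hypothetical placeholders for illustration, not your actual implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def relu_mask(attn_scores: torch.Tensor) -> torch.Tensor:
    """Parameter-free ReLU mask as I understand it from the paper:
    negative similarities are clipped to zero, no weights to learn.

    attn_scores: raw similarity logits, e.g. shape (B, heads, N_q, N_k).
    """
    return F.relu(attn_scores)

# Hypothetical DynaST-style learnable alternative, for contrast only
# (not copied from dynast_transformer.py):
class MLPMask(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(dim, dim // 2),
            nn.ReLU(inplace=True),
            nn.Linear(dim // 2, 1),
            nn.Sigmoid(),  # learned confidence in [0, 1]
        )

    def forward(self, feat: torch.Tensor) -> torch.Tensor:
        # feat: per-token features, e.g. shape (B, N, dim)
        return self.mlp(feat)
```

Is replacing the `MLPMask`-style branch with `relu_mask` the change you intended?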

8414sys, Sep 02 '24 08:09