
average attention mask

vparmain opened this issue 2 years ago · 1 comment

Describe the requested improvement
In their paper on L-TAE (https://arxiv.org/pdf/2007.00586.pdf), the authors describe a method called the "attention mask", which aims at identifying the most discriminant sources of data for each class.

Associated sits API function
Is there a way to implement this approach in sits?

vparmain avatar Jan 27 '23 11:01 vparmain

Dear @vparmain

The main difference between the TAE and LightTAE methods proposed by Garnot et al. is that LightTAE uses the average attention mask to reduce the number of parameters in the model. Both TAE and LightTAE are available in sits through the sits_tae() and sits_lighttae() functions.
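As a rough sketch of how both classifiers can be used (assuming a samples tibble called `samples` with labelled time series and a regularized data cube called `cube`; these object names are illustrative, not part of this thread):

```r
library(sits)

# train a classifier with the full Temporal Attention Encoder
# `samples` is a hypothetical sits tibble of labelled training time series
tae_model <- sits_train(samples, ml_method = sits_tae())

# train with the Lightweight Temporal Attention Encoder,
# which uses the average attention mask to reduce the number of parameters
ltae_model <- sits_train(samples, ml_method = sits_lighttae())

# classify a (hypothetical) regularized data cube with the trained model
probs <- sits_classify(cube, ml_model = ltae_model)
```

Please check the sits documentation for the current argument names and defaults of these functions.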

To implement sits_lighttae(), we used the papers by Vivien Garnot and the reference code at https://github.com/VSainteuf/lightweight-temporal-attention-pytorch

We also used the code made available by Maja Schneider at https://github.com/maja601/RC2020-psetae

Please let us know if this is what you requested.

gilbertocamara avatar Jan 30 '23 12:01 gilbertocamara