
Incorrectly implemented MultiHeadAttention in PseTae

Open · sbgeophd opened this issue · 2 comments

The implementation of the PseTae MultiHeadAttention appears to have a mistake.

The original implementation applies two fully connected layers to the query tensor (see here: https://github.com/VSainteuf/pytorch-psetae/blob/master/models/tae.py#L133).

In the eo-flow implementation, however, both fully connected layers are defined, but the second one (defined here: https://github.com/sentinel-hub/eo-flow/blob/master/eoflow/models/pse_tae_layers.py#L50) is never applied where it would be expected (here: https://github.com/sentinel-hub/eo-flow/blob/master/eoflow/models/pse_tae_layers.py#L66); in fact, it is not used anywhere in the code.
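For illustration, here is a minimal sketch of the query path the reference PyTorch code follows, written in TensorFlow/Keras since that is what eo-flow uses. The class and attribute names (fc1_q, fc2, n_head, d_k) are illustrative assumptions and do not reproduce the actual eo-flow layer; the point is only that the pooled query passes through a second fully connected layer before attention.

```python
import tensorflow as tf

class MultiHeadAttentionSketch(tf.keras.layers.Layer):
    """Hypothetical sketch of the intended query projection (not the eo-flow code)."""

    def __init__(self, n_head, d_k, **kwargs):
        super().__init__(**kwargs)
        self.n_head = n_head
        self.d_k = d_k
        # First projection of the inputs into per-head queries
        self.fc1_q = tf.keras.layers.Dense(n_head * d_k)
        # Second projection: the layer the issue reports as defined but never called
        self.fc2 = tf.keras.layers.Dense(n_head * d_k)

    def call(self, inputs):
        # inputs: (batch, seq_len, d_in)
        q = self.fc1_q(inputs)            # (batch, seq_len, n_head * d_k)
        q = tf.reduce_mean(q, axis=1)     # temporal mean pooling -> (batch, n_head * d_k)
        q = self.fc2(q)                   # the second fully connected layer in question
        q = tf.reshape(q, (-1, self.n_head, self.d_k))
        # keys, values and the scaled dot-product attention itself are omitted here
        return q
```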

sbgeophd · Oct 05 '21

Hi @sbgeophd !

Yes, you are correct, the second fully connected layer is not used. I can't recall whether that was done on purpose, but I'd guess we simply overlooked this part.

Do you have the capacity to make a pull request fixing this issue? If not, we could do it, but it might not happen very soon.

Thanks for reporting the issue.

devisperessutti · Oct 06 '21

In theory I'm happy to make a pull request, but although I think I've fixed this part, the model still doesn't work for me. I'll submit a pull request if/when I get it working.

sbgeophd · Oct 06 '21