
Double-Attention-Network

This is a PyTorch implementation of "A^2-Nets: Double Attention Networks" by Y. Chen et al., NIPS 2018.

The layer can be used as a drop-in block when building models. Currently, the output tensor has shape (B, c_n, H, W); the original shape (B, c, H, W) can be restored with a single line of PyTorch code (e.g., a 1x1 convolution).
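The two attention steps from the paper (feature gathering, then feature distribution) can be sketched as below. This is a minimal illustration, not the repo's exact code: the class name, argument names, and the choice to give the feature branch c_n channels (so the output matches the (B, c_n, H, W) shape noted above) are assumptions; the paper itself denotes the two inner dimensions c_m and c_n.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DoubleAttention(nn.Module):
    """Sketch of a double attention block (A^2-Nets, Chen et al., NIPS 2018).

    Maps (B, c, H, W) -> (B, c_n, H, W). Hypothetical naming: the feature
    branch has c_n channels, the attention branches have c_m channels.
    """

    def __init__(self, c, c_n, c_m):
        super().__init__()
        self.conv_feat = nn.Conv2d(c, c_n, kernel_size=1)  # feature maps
        self.conv_attn = nn.Conv2d(c, c_m, kernel_size=1)  # gathering attention
        self.conv_dist = nn.Conv2d(c, c_m, kernel_size=1)  # distribution attention

    def forward(self, x):
        b, _, h, w = x.shape
        feats = self.conv_feat(x).view(b, -1, h * w)            # (B, c_n, HW)
        # Step 1: feature gathering -- each attention map (softmax over the
        # spatial positions) pools the features into a global descriptor.
        attn = F.softmax(self.conv_attn(x).view(b, -1, h * w), dim=-1)  # (B, c_m, HW)
        globals_ = torch.bmm(feats, attn.transpose(1, 2))       # (B, c_n, c_m)
        # Step 2: feature distribution -- each position (softmax over the c_m
        # descriptors) selects its own mixture of the global descriptors.
        dist = F.softmax(self.conv_dist(x).view(b, -1, h * w), dim=1)   # (B, c_m, HW)
        out = torch.bmm(globals_, dist)                         # (B, c_n, HW)
        return out.view(b, -1, h, w)
```

Restoring the original channel count is then the one-liner mentioned above, e.g. `restore = nn.Conv2d(c_n, c, kernel_size=1)` applied to the block's output.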

Layer architecture

  • Two attention steps: feature gathering followed by feature distribution