
Residual Attention Network for Image Classification

18 residual-attention-network issues

Hi, I would like to know how to plot the feature map with attention, similar to the figures in your paper. Would you be willing to share the code for plotting those...

Hello, when I build Caffe I hit this problem:

> ./include/caffe/common.cuh(9): error: function "atomicAdd(double *, double)" has already been defined
>
> 1 error detected in the compilation of "/tmp/tmpxft_000040a6_00000000-11_interp.compute_61.cpp1.ii"....
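This is a known clash between older Caffe forks and CUDA 8: when nvcc targets compute capability 6.0 or higher (the error mentions compute_61), CUDA already ships a native atomicAdd(double*, double), so a hand-written fallback in common.cuh collides with it. Below is a minimal sketch of the usual architecture guard; the exact contents of this repo's common.cuh may differ.

```
// include/caffe/common.cuh -- sketch of the architecture guard.
// CUDA 8 provides a native atomicAdd(double*, double) for compute
// capability >= 6.0, so only define the software fallback below that.
#ifndef CAFFE_COMMON_CUH_
#define CAFFE_COMMON_CUH_

#include <cuda.h>

#if !defined(__CUDA_ARCH__) || __CUDA_ARCH__ >= 600
// Native double-precision atomicAdd is available; define nothing.
#else
static __inline__ __device__ double atomicAdd(double* address, double val) {
  // Emulate the atomic add with a compare-and-swap loop on the bit pattern.
  unsigned long long int* address_as_ull = (unsigned long long int*)address;
  unsigned long long int old = *address_as_ull, assumed;
  do {
    assumed = old;
    old = atomicCAS(address_as_ull, assumed,
                    __double_as_longlong(val + __longlong_as_double(assumed)));
  } while (assumed != old);  // retry if another thread updated the value
  return __longlong_as_double(old);
}
#endif

#endif  // CAFFE_COMMON_CUH_
```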

Hello, I was wondering if you could share the solver files you used to train your networks. Thanks!

In your residual-attention-network Caffe, the end of interp.cpp is:

> INSTANTIATE_CLASS(InterpLayer);
> //REGISTER_LAYER_CLASS(InterpLayer);

which means the registration of InterpLayer is commented out, while in your provided Caffe (https://github.com/fwang91/caffe/blob/master/src/caffe/layers/interp.cpp), the end of interp.cpp is:...
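For context, a Caffe layer only becomes usable from a prototxt once it is registered with the layer factory. Below is a minimal sketch of the end of interp.cpp with registration enabled, assuming upstream Caffe's macro conventions (REGISTER_LAYER_CLASS takes the type name without the "Layer" suffix, which the macro appends itself):

```
// End of src/caffe/layers/interp.cpp -- sketch, not the repo's exact code.
// INSTANTIATE_CLASS emits the float/double template instantiations;
// REGISTER_LAYER_CLASS adds the layer to the factory so that a prototxt
// entry with type: "Interp" can construct it.
INSTANTIATE_CLASS(InterpLayer);
REGISTER_LAYER_CLASS(Interp);  // note: not REGISTER_LAYER_CLASS(InterpLayer)
```

With the registration line commented out, upstream Caffe fails at net-parsing time with an "Unknown layer type: Interp" error.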

According to your paper, the "Interp" layer does bilinear interpolation to upsample its previous layer's output. But the "Interp" layer has two inputs in your implementation. I'm not very familiar...
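One common design, offered here only as an assumption about why the layer has two bottoms, is that the second input acts as a shape reference: bottom[0] is resized to the spatial size of bottom[1], whose values are never read. That is convenient in an hourglass-style mask branch, where an upsampled map must exactly match the corresponding skip-connection feature map. Below is a minimal CPU sketch of such bilinear upsampling; bilinear_resize and the raw-pointer layout are illustrative, not this repo's API:

```
#include <algorithm>

// Bilinearly resize one channel of shape (in_h, in_w) to (out_h, out_w),
// where (out_h, out_w) would come from the reference blob's shape.
static void bilinear_resize(const float* src, int in_h, int in_w,
                            float* dst, int out_h, int out_w) {
  // Map output pixels onto the source grid, aligning the corner pixels.
  const float scale_h = (out_h > 1) ? float(in_h - 1) / (out_h - 1) : 0.f;
  const float scale_w = (out_w > 1) ? float(in_w - 1) / (out_w - 1) : 0.f;
  for (int y = 0; y < out_h; ++y) {
    const float fy = y * scale_h;
    const int y0 = int(fy);
    const int y1 = std::min(y0 + 1, in_h - 1);
    const float wy = fy - y0;
    for (int x = 0; x < out_w; ++x) {
      const float fx = x * scale_w;
      const int x0 = int(fx);
      const int x1 = std::min(x0 + 1, in_w - 1);
      const float wx = fx - x0;
      // Weighted average of the four nearest source pixels.
      dst[y * out_w + x] =
          (1 - wy) * ((1 - wx) * src[y0 * in_w + x0] + wx * src[y0 * in_w + x1]) +
          wy * ((1 - wx) * src[y1 * in_w + x0] + wx * src[y1 * in_w + x1]);
    }
  }
}
```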

Interested in your paper, I found some bugs when compiling your Caffe. My environment is CentOS 7.5, CUDA 8.0, cuDNN 5.1. The following is the fix:

```
diff --git a/include/caffe/common.cuh...
```