
Confused about the Interp layer

Open Queequeg92 opened this issue 7 years ago • 2 comments

According to your paper, the "Interp" layer does a bilinear interpolation to upsample its previous layer's output. But the "Interp" layer has two inputs in your implementation. I'm not very familiar with Caffe. Could you provide some documentation of the "Interp" layer in Caffe? Are there any alternatives in TensorFlow or PyTorch?

Queequeg92 avatar Sep 25 '17 09:09 Queequeg92

The first input to the layer is the tensor to be resized; the second input dictates the target size. In TensorFlow, you can use tf.image.resize_images with bilinear interpolation.
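For reference, here is a framework-free sketch of what such a bilinear upsampling does, assuming align_corners-style coordinate mapping (which, to my understanding, matches the Caffe Interp layer's behavior); the function name and signature below are illustrative, not from the repo. In PyTorch, the analogous call would be torch.nn.functional.interpolate(x, size=..., mode='bilinear', align_corners=True).

```python
import numpy as np

def bilinear_resize(x, size):
    """Bilinearly resize a 2-D array `x` to `size` = (out_h, out_w).

    Uses align_corners-style mapping: the corner pixels of the input
    and output grids coincide, so src = dst * (in - 1) / (out - 1).
    """
    in_h, in_w = x.shape
    out_h, out_w = size
    out = np.empty((out_h, out_w), dtype=np.float64)
    scale_h = (in_h - 1) / (out_h - 1) if out_h > 1 else 0.0
    scale_w = (in_w - 1) / (out_w - 1) if out_w > 1 else 0.0
    for i in range(out_h):
        for j in range(out_w):
            # Map the output coordinate back into the input grid.
            y, xq = i * scale_h, j * scale_w
            y0, x0 = int(np.floor(y)), int(np.floor(xq))
            y1, x1 = min(y0 + 1, in_h - 1), min(x0 + 1, in_w - 1)
            dy, dx = y - y0, xq - x0
            # Weighted average of the four neighboring input pixels.
            out[i, j] = (x[y0, x0] * (1 - dy) * (1 - dx)
                         + x[y0, x1] * (1 - dy) * dx
                         + x[y1, x0] * dy * (1 - dx)
                         + x[y1, x1] * dy * dx)
    return out

# Upsample a 2x2 grid to 3x3; the second argument plays the role of
# the Interp layer's second input (it only supplies the target size).
small = np.array([[0.0, 1.0], [2.0, 3.0]])
big = bilinear_resize(small, (3, 3))
```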

ondrejbiza avatar Sep 26 '17 08:09 ondrejbiza

Got it! Thanks!

Queequeg92 avatar Sep 26 '17 09:09 Queequeg92