MVSS-Net

Implementation details of the Sobel layer?

Uthory opened this issue on Jul 30 '21 · 5 comments

Thanks for your work. As the paper shows, the Sobel layer consists of four sublayers, but I wonder how the Sobel results of x and y are fused together. Is the code below right?

def forward(self, x):
    # Directional Sobel responses of the input feature map
    sobel_x = F.conv2d(x, sobel_kernel_x, padding=1, bias=False)
    sobel_y = F.conv2d(x, sobel_kernel_y, padding=1, bias=False)
    # Gradient magnitude: sqrt(Gx^2 + Gy^2)
    sobel_rs = torch.pow(torch.pow(sobel_x, 2) + torch.pow(sobel_y, 2), 0.5)
    sobel_rs = F.normalize(sobel_rs, p=2)
    sobel_rs = self.bn(sobel_rs)
    sobel_rs = torch.sigmoid(sobel_rs)
    # Edge-guided attention on the input
    return x * sobel_rs
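
(For context, sobel_kernel_x and sobel_kernel_y are not defined in the snippet above; a minimal sketch of how they might be set up for a single-channel input is below. For a C-channel feature map they would typically be repeated per channel and applied with groups=C. This is an assumption, not the released MVSS-Net code.)

import torch

# Hypothetical 3x3 Sobel kernels, shaped (out_channels, in_channels, kH, kW) for F.conv2d
sobel_kernel_x = torch.tensor([[-1., 0., 1.],
                               [-2., 0., 2.],
                               [-1., 0., 1.]]).view(1, 1, 3, 3)
sobel_kernel_y = torch.tensor([[-1., -2., -1.],
                               [ 0.,  0.,  0.],
                               [ 1.,  2.,  1.]]).view(1, 1, 3, 3)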

Uthory · Jul 30 '21 08:07

As the Sobel conv is the fundamental operation of your ESB, I'm still waiting for your response. Thanks in advance.

Uthory · Aug 03 '21 09:08

Almost there, except for the order of the L2 norm and BatchNorm2d, which is corrected in the latest version (see https://arxiv.org/abs/2104.06832).

Our implementation of the Sobel layer, together with the rest of the modules, is still under internal review.

dong03 · Aug 03 '21 14:08

Sorry, but I still have a question. What is the purpose of the L2 norm? Your latest version shows that the L2 norm is appended after the BN layer, but the output of the Sobel layer and BN is (B, 1, H, W), so wouldn't the L2 norm turn all the values into 1? Besides, how are the Sobel results of x and y fused? Note that sqrt(x^2 + y^2) may cause gradient explosion.
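
(A quick way to check that concern, assuming F.normalize is applied with its default dim=1: on a (B, 1, H, W) tensor it divides each element by its own absolute value, so every non-zero value indeed collapses to ±1.)

import torch
import torch.nn.functional as F

t = torch.randn(2, 1, 4, 4)
out = F.normalize(t, p=2, dim=1)   # dim=1 has size 1, so each element is divided by |itself|
print(out.abs().unique())          # tensor([1.])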

Uthory · Aug 04 '21 03:08

In fact, we apply the L2 norm as the fusion function of x and y, which is why the channel dimension is 1:

sobel_rs = torch.sqrt(torch.pow(bn(sobel_x), 2) + torch.pow(bn(sobel_y), 2))

(where bn is a BatchNorm2d layer). The code will be released soon.
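
(A minimal, self-contained sketch of that fusion as a module, assuming a single-channel Sobel response and a shared BN for both directions; the class and attribute names here are placeholders, not the released MVSS-Net code.)

import torch
import torch.nn as nn
import torch.nn.functional as F

class SobelEdgeBlock(nn.Module):
    """Sketch: BatchNorm each directional Sobel response, fuse them with an L2 norm, gate the input."""
    def __init__(self, in_channels):
        super().__init__()
        kx = torch.tensor([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]])
        # Single output channel: the same 3x3 kernel is applied to (and summed over) every input channel
        self.register_buffer('kernel_x', kx.expand(1, in_channels, 3, 3).clone())
        self.register_buffer('kernel_y', kx.t().expand(1, in_channels, 3, 3).clone())
        # One shared BN, as in the snippet above; separate BNs per direction would also work
        self.bn = nn.BatchNorm2d(1)

    def forward(self, x):
        gx = F.conv2d(x, self.kernel_x, padding=1)   # (B, 1, H, W)
        gy = F.conv2d(x, self.kernel_y, padding=1)   # (B, 1, H, W)
        # L2-norm fusion; the small eps keeps sqrt differentiable at zero
        # (the gradient-explosion concern raised earlier in the thread)
        edge = torch.sqrt(self.bn(gx) ** 2 + self.bn(gy) ** 2 + 1e-6)
        # Sigmoid gating of the input, as in the snippet from the question
        return x * torch.sigmoid(edge)

# Example: a 64-channel feature map keeps its shape after gating
# SobelEdgeBlock(64)(torch.randn(2, 64, 32, 32)).shape == (2, 64, 32, 32)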

Chenxr1999 · Aug 04 '21 05:08

Hello there! I have been training my model these days; could you please roughly tell me the final loss on CASIA_v2? I want to know whether it has converged.

Uthory · Aug 11 '21 07:08