MVSS-Net
Implement details of Sobel Layer?
Thanks for your work.
As the paper shows, it consists of four sublayers. But I wonder how the Sobel results of x and y are fused together?
Is the code below correct?
```python
def forward(self, x):
    # Sobel gradients along x and y
    sobel_x = F.conv2d(x, sobel_kernel_x, padding=1)
    sobel_y = F.conv2d(x, sobel_kernel_y, padding=1)
    # gradient magnitude
    sobel_rs = torch.sqrt(torch.pow(sobel_x, 2) + torch.pow(sobel_y, 2))
    sobel_rs = F.normalize(sobel_rs, p=2)
    sobel_rs = self.bn(sobel_rs)
    # sigmoid gate applied back onto the input
    sobel_rs = torch.sigmoid(sobel_rs)
    return x * sobel_rs
```
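For completeness, the snippet above leaves `sobel_kernel_x` and `sobel_kernel_y` undefined. A minimal sketch for a single-channel input, using the textbook 3x3 Sobel filters (these values are the standard kernels, not taken from the paper's code):

```python
import torch
import torch.nn.functional as F

# Standard 3x3 Sobel kernels, shaped (out_channels=1, in_channels=1, 3, 3).
# Assumption: single-channel input; for C channels one would typically
# repeat the kernel per channel and use a grouped convolution.
sobel_kernel_x = torch.tensor([[[[-1., 0., 1.],
                                 [-2., 0., 2.],
                                 [-1., 0., 1.]]]])
sobel_kernel_y = sobel_kernel_x.transpose(2, 3)

x = torch.randn(2, 1, 8, 8)
gx = F.conv2d(x, sobel_kernel_x, padding=1)
gy = F.conv2d(x, sobel_kernel_y, padding=1)
# Small eps keeps the sqrt gradient finite where the magnitude is zero.
magnitude = torch.sqrt(gx ** 2 + gy ** 2 + 1e-12)
print(magnitude.shape)  # torch.Size([2, 1, 8, 8])
```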
Since the Sobel conv is the fundamental operation of your ESB, I'm looking forward to your response. Thanks in advance.
Almost there, except for the order of the L2 norm and BatchNorm2d, which is corrected in the latest version (see https://arxiv.org/abs/2104.06832).
Our implementation of the Sobel layer, together with the remaining modules, is still under internal review.
Sorry, but I still have a question. What is the meaning of the L2 norm? Your latest version shows the L2 norm appended after the BN layer, but the output of the Sobel layer and BN has shape (B, 1, H, W). Won't the L2 norm set every value to 1? Besides, how are the Sobel results of x and y fused? Note that sqrt(x^2 + y^2) may cause gradient explosion.
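The single-channel concern above is easy to verify: normalizing a (B, 1, H, W) tensor along the channel dimension divides each element by its own absolute value, collapsing the map to ±1. A minimal check (assuming `dim=1` is the normalization axis):

```python
import torch
import torch.nn.functional as F

x = torch.randn(2, 1, 4, 4)
# With a size-1 channel dim, the per-position L2 norm is just |x|,
# so normalization reduces every element to its sign (+1 or -1).
y = F.normalize(x, p=2, dim=1)
print(torch.allclose(y.abs(), torch.ones_like(y)))  # True
```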
In fact, we apply the L2 norm as the fusion function of x and y, so the channel dimension becomes 1:

```python
# self.bn_x and self.bn_y are nn.BatchNorm2d modules
sobel_rs = torch.sqrt(torch.pow(self.bn_x(sobel_x), 2) + torch.pow(self.bn_y(sobel_y), 2))
```
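Putting the thread together, the described order (Sobel conv → per-direction BN → magnitude fusion → sigmoid gate on the input) can be sketched as a module. This is an illustrative sketch only, not the official MVSS-Net code: the names `bn_x`/`bn_y`, the depthwise multi-channel handling, and the eps in the sqrt are all assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SobelLayer(nn.Module):
    """Sketch: Sobel conv (x/y) -> per-direction BatchNorm ->
    gradient-magnitude fusion -> sigmoid attention on the input."""

    def __init__(self, channels: int):
        super().__init__()
        kx = torch.tensor([[-1., 0., 1.],
                           [-2., 0., 2.],
                           [-1., 0., 1.]])
        # One fixed Sobel filter per channel, applied depthwise.
        self.register_buffer("kernel_x", kx.expand(channels, 1, 3, 3).clone())
        self.register_buffer("kernel_y", kx.t().expand(channels, 1, 3, 3).clone())
        self.bn_x = nn.BatchNorm2d(channels)
        self.bn_y = nn.BatchNorm2d(channels)
        self.channels = channels

    def forward(self, x):
        gx = F.conv2d(x, self.kernel_x, padding=1, groups=self.channels)
        gy = F.conv2d(x, self.kernel_y, padding=1, groups=self.channels)
        # BN each direction first, then fuse; eps keeps the sqrt
        # gradient bounded where the magnitude is zero.
        mag = torch.sqrt(self.bn_x(gx) ** 2 + self.bn_y(gy) ** 2 + 1e-12)
        return x * torch.sigmoid(mag)

layer = SobelLayer(4)
out = layer(torch.randn(2, 4, 16, 16))
print(out.shape)  # torch.Size([2, 4, 16, 16])
```

Registering the kernels as buffers (rather than parameters) keeps them fixed during training while still moving them with the module across devices.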
The code will be released soon.
Hello there! I have been training my model these days. Could you please roughly tell me the final loss on CASIA_v2? I want to know whether it has converged.