pytorch-recurrent-ae-siggraph17
Something weird in LoG loss
```python
import numpy as np
import torch
import torch.nn.functional as func


def LoG(img):
    # 5x5 discrete approximation of the Laplacian-of-Gaussian (LoG) kernel
    weight = [
        [0, 0, 1, 0, 0],
        [0, 1, 2, 1, 0],
        [1, 2, -16, 2, 1],
        [0, 1, 2, 1, 0],
        [0, 0, 1, 0, 0],
    ]
    weight = np.array(weight)

    weight_np = np.zeros((1, 1, 5, 5))
    weight_np[0, 0, :, :] = weight
    # tile the kernel across the input's channel and batch dimensions
    weight_np = np.repeat(weight_np, img.shape[1], axis=1)
    weight_np = np.repeat(weight_np, img.shape[0], axis=0)

    weight = torch.from_numpy(weight_np).type(torch.FloatTensor).to('cuda:0')

    # note: padding=1 with a 5x5 kernel shrinks each spatial dim by 2
    return func.conv2d(img, weight, padding=1)


def HFEN(output, target):
    # high-frequency error norm: relative squared L2 distance of LoG responses
    return torch.sum(torch.pow(LoG(output) - LoG(target), 2)) / torch.sum(torch.pow(LoG(target), 2))
```
Hello,
I only see the Gaussian filter in this implementation, but the loss function in your paper also uses a Laplacian filter, which confuses me.
In addition, I trained the network on my own dataset, but at test time there is a blue (or purple) tint along the edges. I don't know if it's related to the point above.
Hope you can help me, thank you!
I also ran into the same artifacts on the edges. I wonder if there is a way to fix it.
Hope you can help us, thank you!
Hi,
@LemonMi: Sorry for such a late reply; I don't know how I missed this issue in my inbox. And thank you, @deadmarston, for re-opening this!
The weight matrix I have defined is the 2-D Laplacian of Gaussian (LoG) matrix; see http://fourier.eng.hmc.edu/e161/lectures/gradient/node8.html. Refer to page 6 of the paper, which uses HFEN with a LoG kernel. Maybe there is a better way to do this in PyTorch, but at the time this was all I could think of.
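For reference, the continuous 2-D LoG that this 5×5 matrix approximates is

$$\mathrm{LoG}(x, y) = -\frac{1}{\pi \sigma^4}\left(1 - \frac{x^2 + y^2}{2\sigma^2}\right) e^{-\frac{x^2 + y^2}{2\sigma^2}}$$

i.e. the Laplacian applied to a Gaussian of standard deviation σ, so a single kernel does both the smoothing and the second-derivative step.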
About the issue on the edges, I am not sure what could be causing it. I did not encounter such issues during training. Could you both please provide an example?
Thanks, and again, sorry for the extremely late reply.
Hi,
Sorry for my late reply. I think the problem is caused by the HFEN loss. In the paper, they mention that they used a LoG kernel with σ = 1.5. After switching to that LoG kernel, the blue (or purple) tint on the edges is greatly reduced.
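In case it is useful to anyone else, here is a minimal sketch of how such a kernel could be built and dropped into the existing `LoG`/`HFEN` functions. The helper name `log_kernel`, the 15×15 window size, and the depthwise (`groups`) convolution are my own choices here, not from the paper; only σ = 1.5 comes from it:

```python
import numpy as np
import torch
import torch.nn.functional as func


def log_kernel(sigma=1.5, size=15):
    # Sample the continuous LoG on an integer grid centred on the kernel.
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    r2 = xx ** 2 + yy ** 2
    k = (-1.0 / (np.pi * sigma ** 4)) \
        * (1.0 - r2 / (2.0 * sigma ** 2)) \
        * np.exp(-r2 / (2.0 * sigma ** 2))
    k -= k.mean()  # force a zero-sum kernel so flat regions respond with exactly 0
    return k.astype(np.float32)


def LoG(img, sigma=1.5, size=15):
    c = img.shape[1]
    k = torch.from_numpy(log_kernel(sigma, size)).to(img.device)
    weight = k.view(1, 1, size, size).repeat(c, 1, 1, 1)  # one copy of the kernel per channel
    # groups=c filters each channel independently (depthwise convolution),
    # and padding=size//2 keeps the spatial size unchanged for odd kernel sizes
    return func.conv2d(img, weight, padding=size // 2, groups=c)


def HFEN(output, target):
    return torch.sum(torch.pow(LoG(output) - LoG(target), 2)) / torch.sum(torch.pow(LoG(target), 2))
```

Compared with the 5×5 integer matrix above, this kernel actually decays to near zero at the borders for σ = 1.5, and the depthwise convolution avoids summing the response across colour channels, which might be related to the colour shift on the edges.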