perceptual-reflection-removal

exclusion loss

crowaffle opened this issue 4 years ago · 0 comments

According to the formulation of the exclusion loss in your paper, you define the normalization factors Lambda_T and Lambda_R.

I guess the function `compute_exclusion_loss` in main.py computes the exclusion loss.

In that function, it seems that `alphax` and `alphay` are calculated for the normalization.

However, after that line, I think they are used only as Lambda_R. Where is Lambda_T, which should normalize the absolute value of T's gradient?
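To make the question concrete, here is a minimal NumPy sketch (not the repo's actual TensorFlow code) of the symmetric normalization I would expect from the paper's formula Psi(T, R) = tanh(Lambda_T * |grad T|) ⊙ tanh(Lambda_R * |grad R|), where each Lambda is a cross-normalization ratio of mean gradient magnitudes. The function names `gradients` and `exclusion_term` and the `2.0 * mean(...) / mean(...)` scaling are my assumptions for illustration only:

```python
import numpy as np

def gradients(img):
    # Simple forward differences along x and y (illustrative, not the repo's op).
    gx = img[:, 1:] - img[:, :-1]
    gy = img[1:, :] - img[:-1, :]
    return gx, gy

def exclusion_term(grad_t, grad_r, eps=1e-8):
    # Hypothetical symmetric version: Lambda_T scales T's gradient by the
    # ratio of R's mean magnitude to T's, and Lambda_R does the reverse,
    # so BOTH normalization factors from the paper are applied.
    lam_t = 2.0 * np.mean(np.abs(grad_r)) / (np.mean(np.abs(grad_t)) + eps)
    lam_r = 2.0 * np.mean(np.abs(grad_t)) / (np.mean(np.abs(grad_r)) + eps)
    psi = np.tanh(lam_t * np.abs(grad_t)) * np.tanh(lam_r * np.abs(grad_r))
    return np.linalg.norm(psi)  # Frobenius norm of the element-wise product
```

In `compute_exclusion_loss`, by contrast, the `alpha` factors appear to be multiplied onto only one of the two gradient maps before the sigmoid, which is what prompted the question above.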

crowaffle · May 23 '20 12:05