perceptual-reflection-removal
Exclusion loss
According to the formulation of the exclusion loss in your paper, you define the normalization factors lambda_T and lambda_R.
I assume the function `compute_exclusion_loss` in `main.py` computes this exclusion loss.
In that function, it seems that `alphax` and `alphay` are calculated for the normalization.
However, after that line, they appear to be used only as lambda_R. Where is lambda_T, which should normalize the absolute value of T's gradient?
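For context, here is a minimal NumPy sketch of how I understand the exclusion loss from the paper: tanh-rescaled gradient magnitudes of T and R are multiplied element-wise, with lambda_T and lambda_R balancing the two gradient magnitudes. Note this is my own illustration, not the repo's code: the function names, the forward-difference gradient, and the choice of lambda as a ratio of mean absolute gradients are all my assumptions, not necessarily what `compute_exclusion_loss` does.

```python
import numpy as np

def gradients(img):
    # Forward differences in x and y (illustrative; not the repo's exact op).
    gx = img[:, 1:] - img[:, :-1]
    gy = img[1:, :] - img[:-1, :]
    return gx, gy

def exclusion_loss(T, R, eps=1e-8):
    """Sketch of the exclusion loss at a single scale.

    Both lambda_T and lambda_R appear, each normalizing the gradient of
    one layer; the question above is where the lambda_T term lives in
    the released code.
    """
    loss = 0.0
    for gT, gR in zip(gradients(T), gradients(R)):
        # Assumed normalization: balance the mean absolute gradient
        # magnitudes of the two layers (lambda_T scales T, lambda_R scales R).
        lam_T = np.mean(np.abs(gR)) / (np.mean(np.abs(gT)) + eps)
        lam_R = np.mean(np.abs(gT)) / (np.mean(np.abs(gR)) + eps)
        psi = np.tanh(lam_T * np.abs(gT)) * np.tanh(lam_R * np.abs(gR))
        loss += np.linalg.norm(psi)  # Frobenius norm of the product
    return loss
```

With this reading, the gradients of T and R are treated symmetrically, which is why I expected a lambda_T counterpart to `alphax`/`alphay` in the code.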