
Question about the setting of lambd (constant) for grad_reverse

fuyimin96 opened this issue 3 years ago • 2 comments

In the official OSBP code, the constant `p` is fixed to 1, and the gradient is reversed by multiplying it by -1. In your code, however, the constant seems to converge to 1 over training, via lines 88-89 of train.py:

```python
p = global_step / total_steps
constant = 2. / (1. + np.exp(-10 * p)) - 1
```

What is the reason for this operation, and what is the meaning of `grl-rampup-epochs`? Thanks for your reply!

fuyimin96 avatar Oct 09 '21 06:10 fuyimin96

It is a common operation when using gradient reversal: see Eq. 14 of Ganin & Lempitsky, "Unsupervised Domain Adaptation by Backpropagation" (https://arxiv.org/pdf/1409.7495.pdf).
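For reference, a minimal sketch of how such a ramp-up is typically wired into a gradient reversal layer, assuming a PyTorch setup (the names `grl_constant` and `gamma` are illustrative, not this repo's actual API; `grad_reverse` is the function named in the issue title):

```python
import numpy as np
import torch
from torch.autograd import Function


class GradReverse(Function):
    """Identity in the forward pass; scales the gradient by -constant
    in the backward pass (the gradient reversal layer, GRL)."""

    @staticmethod
    def forward(ctx, x, constant):
        ctx.constant = constant
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Reverse (and scale) the incoming gradient; `constant` itself
        # receives no gradient, hence the None.
        return -ctx.constant * grad_output, None


def grad_reverse(x, constant=1.0):
    return GradReverse.apply(x, constant)


def grl_constant(global_step, total_steps, gamma=10.0):
    """Ramp-up schedule from Eq. 14 of Ganin & Lempitsky:
    lambda_p = 2 / (1 + exp(-gamma * p)) - 1, where p in [0, 1] is the
    fraction of training completed. It starts at 0 and saturates at 1,
    so the noisy reversed signal is suppressed early in training."""
    p = global_step / total_steps
    return 2.0 / (1.0 + np.exp(-gamma * p)) - 1.0


# Example: the reversed-gradient strength grows smoothly toward 1.
if __name__ == "__main__":
    features = torch.randn(4, 8, requires_grad=True)
    constant = grl_constant(global_step=500, total_steps=1000)  # ~0.987
    grad_reverse(features, constant).sum().backward()
    print(constant, features.grad[0, 0].item())  # grad is -constant
```

Presumably `grl-rampup-epochs` controls how long this ramp-up lasts (i.e., it determines `total_steps` in the schedule above), though that is an assumption about this repo's code rather than something stated in the thread.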

YU1ut avatar Oct 09 '21 07:10 YU1ut

> It is a common operation when using gradient reversal: see Eq. 14 of Ganin & Lempitsky, "Unsupervised Domain Adaptation by Backpropagation" (https://arxiv.org/pdf/1409.7495.pdf).

Thanks for your reply.

fuyimin96 avatar Oct 09 '21 07:10 fuyimin96