openset-DA
Question about the setting of lambd (constant) for grad_reverse
In the official code of OSBP, the constant p is set to 1 and the gradient is multiplied by -1 to reverse it, but in your code the constant seems to converge to 1, per lines 88-89 in train.py:

```python
p = global_step / total_steps
constant = 2. / (1. + np.exp(-10 * p)) - 1
```

What is the reason for this operation, and what is the meaning of grl-rampup-epochs? Thanks for your reply!
It is a common operation when using gradient reversal. See Eq. 14 in the following paper. https://arxiv.org/pdf/1409.7495.pdf
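To illustrate, here is a minimal sketch (assuming PyTorch; `GradReverse` and `grl_lambda` are illustrative names, not this repo's exact API) of a gradient reversal layer driven by that ramp-up schedule: lambd starts near 0, which suppresses the noisy reversed gradient early in training, and approaches 1 as training progresses.

```python
# Minimal sketch of gradient reversal with the ramp-up schedule from
# Eq. 14 of https://arxiv.org/pdf/1409.7495.pdf (assumed PyTorch code;
# the names GradReverse and grl_lambda are illustrative).
import numpy as np
import torch


class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; scales the gradient by -lambd backward."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Reverse and scale the incoming gradient; no gradient w.r.t. lambd.
        return -ctx.lambd * grad_output, None


def grl_lambda(global_step, total_steps, gamma=10.0):
    """Ramp lambd from 0 toward 1: 2 / (1 + exp(-gamma * p)) - 1."""
    p = global_step / total_steps
    return 2.0 / (1.0 + np.exp(-gamma * p)) - 1.0


# Example: lambd grows from 0 toward 1 as training progresses.
x = torch.randn(4, 8, requires_grad=True)
for step in (0, 500, 5000):
    lambd = grl_lambda(step, total_steps=5000)
    y = GradReverse.apply(x, lambd).sum()
    print(step, round(lambd, 3))
```

Setting the constant to a fixed 1 (as in the OSBP code you mention) and ramping it up this way are two variants of the same gradient reversal trick; the schedule just eases in the adversarial signal while the feature extractor is still unstable.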
Thanks for your reply.