da-faster-rcnn-PyTorch
why set the alpha to 0.1?
https://github.com/tiancity-NJU/da-faster-rcnn-PyTorch/blob/47cd8a80f4811a504d4cb57d2b21401ccd2b1151/lib/model/da_faster_rcnn/DA.py#L29 With alpha=0.1 and arg.lamda=0.1, doesn't the total scale come out to 0.01?
lambda=0.1 scales the domain-adaptation loss itself, while alpha=0.1 is the gradient reversal layer's scale, which only applies during backpropagation. So the forward loss is weighted by 0.1, but the gradient flowing back through the GRL is scaled by lambda*alpha=0.01.
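A minimal sketch of how the two factors combine (assumption: a simplified gradient reversal layer standing in for the repo's GRL in DA.py; the names `GradReverse` and `alpha` here are illustrative, not copied from the repo):

```python
import torch

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; negates and scales the gradient by alpha in backward."""

    @staticmethod
    def forward(ctx, x, alpha):
        ctx.alpha = alpha
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Gradient is reversed and scaled by alpha on the way back.
        return grad_output.neg() * ctx.alpha, None

x = torch.ones(3, requires_grad=True)
lam, alpha = 0.1, 0.1

# lambda weights the loss in the forward pass.
loss = lam * GradReverse.apply(x, alpha).sum()
loss.backward()

# The gradient reaching x has magnitude lambda * alpha = 0.01 (and flipped sign).
print(x.grad)  # tensor([-0.0100, -0.0100, -0.0100])
```

So the 0.01 only describes the gradient scale on the backbone features, not the reported loss value, which is still weighted by 0.1.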