Universal-Domain-Adaptation

Why does normalize_weight need to divide by torch.mean(x)?

Open · alpc91 opened this issue 4 years ago • 1 comment

```python
def normalize_weight(x):
    min_val = x.min()
    max_val = x.max()
    x = (x - min_val) / (max_val - min_val)
    x = x / torch.mean(x)
    return x.detach()
```

According to the paper, x should lie in (0, 1), so why does normalize_weight need to divide by torch.mean(x)?
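For illustration, here is a minimal sketch of what the extra division does (the tensor values are made up, not from the repo): min-max scaling maps the weights into [0, 1], and the subsequent division by the mean rescales them so their mean is exactly 1, which lets individual weights exceed 1.

```python
import torch

def normalize_weight(x):
    # Min-max scale into [0, 1], then rescale so the mean is 1.
    min_val = x.min()
    max_val = x.max()
    x = (x - min_val) / (max_val - min_val)
    x = x / torch.mean(x)
    return x.detach()

# Hypothetical per-sample weights, purely for illustration.
w = torch.tensor([0.2, 0.5, 0.9])
print(normalize_weight(w))
# After min-max scaling: [0.0000, 0.4286, 1.0000]  (mean ~0.476)
# After dividing by the mean: [0.0000, 0.9000, 2.1000]  (mean is 1, values can exceed 1)
```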

alpc91 · Jun 02 '21 03:06

```python
ce = nn.CrossEntropyLoss(reduction='none')(predict_prob_source, label_source)
```

And why is after_softmax(predict_prob_source) fed to nn.CrossEntropyLoss? This criterion already combines LogSoftmax and NLLLoss.
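As a minimal sketch of the concern (my own example, not code from the repo): nn.CrossEntropyLoss applies LogSoftmax internally, so passing already-softmaxed probabilities applies softmax a second time and changes the loss value.

```python
import torch
import torch.nn as nn

ce = nn.CrossEntropyLoss(reduction='none')

# Hypothetical logits for a single 3-class sample, ground-truth class 0.
logits = torch.tensor([[2.0, 0.5, -1.0]])
label = torch.tensor([0])

# Intended usage: raw logits; the criterion applies LogSoftmax itself.
loss_from_logits = ce(logits, label)   # ~0.24

# Feeding probabilities applies softmax a second time, flattening the
# distribution and inflating the loss (and damping its gradients).
probs = torch.softmax(logits, dim=1)
loss_from_probs = ce(probs, label)     # ~0.70

print(loss_from_logits.item(), loss_from_probs.item())
```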

alpc91 · Jun 02 '21 04:06