mathor

Results 8 comments of mathor

> Sorry, I'm just curious to know what the `request Header` is?

You can press F12 and then look at the "Network" tab.

Maybe you can look at this code: https://github.com/wmathor/nlp-tutorial/blob/master/4-2.Seq2Seq(Attention)/Seq2Seq(Attention)-Torch.ipynb

> Excuse me, at https://github.com/graykode/nlp-tutorial/blob/master/1-1.NNLM/NNLM-Torch.py#L50 the comment may be wrong. It should be `X = X.view(-1, n_step * m) # [batch_size, n_step * m]`
>
> Sorry for disturbing you....

Hey, I found this error too. I think you're right.
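To see why the corrected comment is right, here is a minimal shape check, using NumPy's `reshape` as a stand-in for torch's `view` and made-up sizes (`batch_size=2`, `n_step=3`, `m=4` are hypothetical, not from the original code):

```python
import numpy as np

# Hypothetical sizes for illustration only
batch_size, n_step, m = 2, 3, 4

# In NNLM, X after the embedding lookup has shape [batch_size, n_step, m]
X = np.zeros((batch_size, n_step, m))

# The reshape in question: -1 infers batch_size, so the result is
# [batch_size, n_step * m] -- i.e. (2, 12) here -- matching the fixed comment
X_flat = X.reshape(-1, n_step * m)
print(X_flat.shape)  # (2, 12)
```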

Increasing the weight of the target class relies on the $(1-\hat{y})^{\gamma}$ term, while decreasing the weight of the background class relies on the $\hat{y}^{\gamma}$ term. But as training iterates, these two weighting terms invert the balance: the target class now ends up with an especially large weight, so the target class has to be down-weighted by an additional small scaling factor $\alpha$. That is, the target class should be multiplied by 0.25 and the background class by 0.75. I saw this in Su Jianlin's blog, [article link](https://kexue.fm/archives/4733); a screenshot of the relevant content is below.

> `www.___.com` This is a puzzle; can you help solve it?
>
> (If anyone has resources, please share them and benefit everyone~)

magazinelib