Liu Zhihui

45 comments by Liu Zhihui

@ShomyLiu @JankinXu May I take a look at your code? Do you use L2 norm regularization?

@ShomyLiu Do you normalize your pretrained word embeddings? I didn't normalize them in my code. Also, make sure you add L2 regularization to your `out_linear`.
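
For what it's worth, row-normalizing a pretrained embedding matrix is a one-liner in NumPy. The sketch below is only an illustration (the file name comes from this repo; everything else is a placeholder), not the code either of us actually uses:

```python
import numpy as np

# Minimal sketch (not the repo's actual code): L2-normalize each row of a
# pretrained embedding matrix of shape (vocab_size, word_dim).
embed = np.load('embed300.trim.npy')
norms = np.linalg.norm(embed, axis=1, keepdims=True)
norms[norms == 0] = 1.0          # keep an all-zero padding row unchanged
embed = embed / norms
```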

@ShomyLiu In my experience, L2 regularization on the `out_linear` layer helps a lot.
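
To make the suggestion concrete, here is a minimal sketch of what I mean, assuming a PyTorch-style model; the layer sizes, coefficient, and tensors are placeholders rather than anyone's actual code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Placeholder sizes, not the repo's configuration.
hidden_dim, num_classes, batch = 230, 19, 8
out_linear = nn.Linear(hidden_dim, num_classes)

features = torch.randn(batch, hidden_dim)          # stand-in for sentence features
labels = torch.randint(0, num_classes, (batch,))

logits = out_linear(features)
task_loss = F.cross_entropy(logits, labels)

# Explicit L2 penalty on the output layer weights; the coefficient is hypothetical.
l2_coef = 1e-3
l2_loss = l2_coef * out_linear.weight.pow(2).sum()

total = task_loss + l2_loss
total.backward()
```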

Yes, you're right. It's a mistake. Thanks.

`total_loss = task_loss + adv_loss + diff_loss + l2_loss`

There is a function `flip_gradient` that is used to maximize the `adv_loss`.
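
For readers unfamiliar with the trick: a gradient-reversal layer acts as the identity in the forward pass and negates the gradient in the backward pass, so the optimizer minimizes `total_loss` while the shared encoder is effectively pushed to maximize `adv_loss`. Below is a generic PyTorch-style sketch of the idea, not the repo's actual `flip_gradient` implementation; the names and the scaling factor are illustrative.

```python
import torch

class FlipGradient(torch.autograd.Function):
    """Identity in the forward pass; negates (and scales) the gradient in the
    backward pass, so minimizing the adversarial loss downstream maximizes it
    with respect to the shared encoder."""

    @staticmethod
    def forward(ctx, x, lambda_=1.0):
        ctx.lambda_ = lambda_
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambda_ * grad_output, None

def flip_gradient(x, lambda_=1.0):
    return FlipGradient.apply(x, lambda_)

# Usage sketch: shared features pass through the reversal layer before the
# task discriminator that produces adv_loss.
shared = torch.randn(8, 230, requires_grad=True)   # placeholder shared features
reversed_feat = flip_gradient(shared, lambda_=0.05)
```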

Why do you need to run `build_data`? I believe I already generated the data. Can't it run by following the steps in the README?

`embed300.trim.npy` is trimmed from [google news word2vec](https://drive.google.com/file/d/0B7XkCwpI5KDYNlNUTTlSS21pQmM/edit?usp=sharing). The original file is too large, so I didn't upload it.
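
If you want to regenerate a trimmed file yourself, the general recipe is to load the full GoogleNews vectors, keep only the task vocabulary, and dump the matrix with NumPy. The sketch below assumes gensim and uses a placeholder vocabulary; it is not necessarily the exact script behind `embed300.trim.npy`.

```python
import numpy as np
from gensim.models import KeyedVectors

# Load the full 300-dim GoogleNews vectors (large download).
w2v = KeyedVectors.load_word2vec_format('GoogleNews-vectors-negative300.bin',
                                         binary=True)

vocab = ['located', 'company', 'founder']      # placeholder task vocabulary
dim = 300
embed = np.zeros((len(vocab), dim), dtype=np.float32)
for i, word in enumerate(vocab):
    if word in w2v:
        embed[i] = w2v[word]
    else:
        # Random init for out-of-vocabulary words.
        embed[i] = np.random.uniform(-0.25, 0.25, dim)

np.save('embed300.trim.npy', embed)
```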

You can use `--word_dim=50` to use the pre-trained 50-dim SENNA embeddings.
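
Just to illustrate how such a flag typically works (a hypothetical sketch, not the repo's actual argument parsing): the `--word_dim` value can simply select which pretrained embedding file gets loaded; the file names below are placeholders.

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--word_dim', type=int, default=300, choices=[50, 300])
args = parser.parse_args()

# Placeholder file names; pick the embedding file that matches the dimension.
embed_file = 'embed50.senna.npy' if args.word_dim == 50 else 'embed300.trim.npy'
print('loading pretrained embeddings from', embed_file)
```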

I think it is a hyper-parameter. I chose the one with the higher F1 score.