
How to reproduce the results in the paper

MaCYbupt opened this issue 7 years ago • 4 comments

I used parameter_search to find the best parameters, but I can't reach the best results reported in your paper. Is there any guide to help me reproduce them?

MaCYbupt avatar Oct 30 '18 07:10 MaCYbupt

Hi, sure, I'd love to help. Can you give me a bit more information about the system you're running the experiments on, the embeddings you use, and your current results?

jerbarnes avatar Nov 13 '18 12:11 jerbarnes

I ran the experiment on torch 0.2.0_3, using google.txt for the English word embeddings, sg-300-es for the Spanish word embeddings, and the bilingual dictionary from Bing Liu. But the best result I get is F1 = 0.540, with batch size 80 and alpha 0.8.

MaCYbupt avatar Nov 17 '18 12:11 MaCYbupt

Besides this, I think I should tell you that I always get the best results at epoch 1. Is that weird?

MaCYbupt avatar Nov 17 '18 12:11 MaCYbupt

> Besides this, I think I should tell you that I always get the best results at epoch 1. Is that weird?

I had the same problem trying to run blse.py, and I think I found the main reason this happens.

This line: `out = F.softmax(self.clf(x_proj))` should be replaced by this one: `out = self.clf(x_proj)`

The criterion being used for calculating the loss is `nn.CrossEntropyLoss()`, and according to the PyTorch docs:

> This criterion combines `nn.LogSoftmax()` and `nn.NLLLoss()` in one single class

This means softmax is effectively applied twice to the model outputs, which flattens the probabilities and shrinks the gradients, making it hard for any learning to occur. The sketch below illustrates the effect.
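To make this concrete, here is a minimal, self-contained sketch in modern PyTorch (not the actual blse.py code; the tensors, sizes, and seed here are made up for illustration) comparing the gradients with and without the extra softmax:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Stand-in for model outputs: a batch of 4 examples, 2 classes.
logits = torch.randn(4, 2, requires_grad=True)
targets = torch.tensor([0, 1, 1, 0])
criterion = nn.CrossEntropyLoss()

# Correct: feed raw logits; CrossEntropyLoss applies log-softmax internally.
loss_correct = criterion(logits, targets)
loss_correct.backward()
grad_correct = logits.grad.abs().mean().item()

# Buggy: an extra softmax before the loss, as in the forward pass above.
logits.grad = None
loss_buggy = criterion(F.softmax(logits, dim=1), targets)
loss_buggy.backward()
grad_buggy = logits.grad.abs().mean().item()

# The extra softmax squashes the loss inputs into [0, 1], so the
# gradients reaching the model parameters are much smaller.
print(f"mean |grad|, raw logits:    {grad_correct:.4f}")
print(f"mean |grad|, extra softmax: {grad_buggy:.4f}")
```

An alternative, if you want the model to output probabilities, would be to return `F.log_softmax(...)` and train with `nn.NLLLoss()` instead; but with `nn.CrossEntropyLoss()` the model should output raw logits.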

I'd be happy to hear if you think I am missing something, @jbarnesspain. Thank you for the article and the implementation!

UriSha avatar Jan 16 '19 14:01 UriSha