ESIM
What is the accuracy?
I just want to know the accuracy of your ESIM model. Have you got 88%?
It is hard to get 88%; I didn't get an accuracy above 75%. Is it a problem with the model implementation?
This code can't reach 88%. There are some problems with this version of the code (the mask on the attention weights [Model.py, lines 160-170] and the mask for the mean and max pooling [Model.py, lines 220-225]), but even after I fixed these bugs it only reached an accuracy of 84.+% (maybe partly due to different hyperparameters), so I didn't update the code in this repo.
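For anyone hitting the same issue: both bugs come down to applying the sequence-length mask, once to the attention weights before the softmax and once in the mean/max pooling. Below is a minimal NumPy sketch of that idea; the function and variable names are illustrative only and are not the actual code in Model.py.

```python
import numpy as np

def masked_softmax(scores, mask):
    """Softmax over the last axis that ignores padded positions.

    scores: attention scores, e.g. (batch, len_a, len_b)
    mask:   0/1 array broadcastable to scores, 0 at padded positions
    """
    scores = np.where(mask.astype(bool), scores, -1e9)    # padded positions get ~zero weight
    scores = scores - scores.max(axis=-1, keepdims=True)  # numerical stability
    exp = np.exp(scores)
    return exp / exp.sum(axis=-1, keepdims=True)

def masked_mean_max(states, mask):
    """Mean and max pooling over the time axis, excluding padded timesteps.

    states: (batch, time, dim) BiLSTM outputs
    mask:   (batch, time) with 1 for real tokens, 0 for padding
    """
    m = mask[..., None].astype(states.dtype)                          # (batch, time, 1)
    mean = (states * m).sum(axis=1) / np.maximum(m.sum(axis=1), 1e-8)
    maxed = np.where(m.astype(bool), states, -np.inf).max(axis=1)
    return mean, maxed
```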
How can the bugs above be fixed? Updated code would be appreciated, thanks.
OK, I will update it soon. The code on the server has been modified for another research task; I will change it back and test the results during my winter vacation.
By adding the mask op, I got the best accuracy as follows:
2019-01-16 15:01:00 Epoch:23; train: acc:0.9003, macro-f1:0.9003
2019-01-16 15:01:11 Epoch:23; dev: acc:0.8659, macro-f1:0.8657
2019-01-16 15:01:21 Epoch:23; test: acc:0.8649, macro-f1:0.8647

The main hyperparameters are set as follows:
self.embedding_normalize = 1
self.with_length_mask = True
self.pad_max_length = pad_len  # 25
self.hidden_size = 200  # For BiLSTM hidden_size, feedforward hidden_size
self.learning_rate = 0.05
self.clip_value = None
self.l2_lambda = 0.0
Next I will run the fixed code with the hyperparameters from your paper.
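Side note, in case it helps with reproduction: `with_length_mask` and `pad_max_length` suggest the mask is derived from each sequence's true token count after padding to a fixed length. A tiny sketch of that construction, with illustrative names rather than the repo's actual ones:

```python
import numpy as np

def build_length_mask(lengths, pad_max_length=25):
    """0/1 mask of shape (batch, pad_max_length) from true sequence lengths."""
    positions = np.arange(pad_max_length)[None, :]  # (1, pad_max_length)
    return (positions < np.asarray(lengths)[:, None]).astype(np.float32)

# build_length_mask([3, 5], pad_max_length=6)
# -> [[1, 1, 1, 0, 0, 0],
#     [1, 1, 1, 1, 1, 0]]
```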
Hi, I added the mask op and trained the model with the hyperparameters you provided above, but I didn't get the same results: I got test acc 68.89 and f1 68.91. I want to know how you edited the code; could I have a look at your version? My email is [email protected]. Looking forward to your reply.
The code has been sent to your email; please check it.
Hi, I'm a beginner in NLP. I trained the model with my own hyperparameters but couldn't get good results; the test accuracy is only 65%. So I also want to know how you fixed the code, and whether I could have a look at your version too. My email is [email protected]. Looking forward to your reply, thank you!
I also want to know how you fixed the code. My email is [email protected]. Thank you.
I also want to know how you fixed the code. I tried the code, but ........ My email is [email protected]. Thank you.