
TensorFlow implementation of the ESIM model (Enhanced LSTM for natural language inference)

8 ESIM issues

I just want to know the accuracy of your ESIM model. Did you reach 88%?

https://github.com/HsiaoYetGun/ESIM/blob/master/Model.py#L169

```python
attentionSoft_b = tf.nn.softmax(tf.transpose(attentionWeights))
```

After `attentionWeights` is transposed here, the resulting tensor has shape (seq_length, seq_length, batch_size). The softmax is then applied to that result, and `tf.nn.softmax` defaults to the last axis. Doesn't that mean the softmax is being taken over the batch dimension? Any advice would be appreciated.
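To make the concern concrete, here is a minimal NumPy sketch (NumPy stands in for TensorFlow; the `softmax` helper and the shapes are illustrative, not from the repo). A bare `transpose` of a 3-D tensor reverses all axes, so a last-axis softmax afterwards normalizes over the batch dimension rather than over a sequence axis:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax, mirroring tf.nn.softmax's default behavior.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

batch, len_a, len_b = 2, 3, 4
w = np.random.randn(batch, len_a, len_b)   # attention weights

# A full transpose reverses all axes: (len_b, len_a, batch).
wt = np.transpose(w)
print(wt.shape)  # (4, 3, 2)

# A last-axis softmax on wt therefore sums to 1 across the *batch* axis:
s = softmax(wt)
print(np.allclose(s.sum(axis=-1), 1.0))  # True, but over the batch dimension

# The intended normalization for the b-side attention is over len_a (axis 1):
intended = softmax(w, axis=1)
print(np.allclose(intended.sum(axis=1), 1.0))  # True
```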

```
Traceback (most recent call last):
  File "Train.py", line 165, in train()
  File "Train.py", line 94, in train
    _, batch_loss, batch_acc = sess.run([model.train, model.loss, model.acc], feed_dict=feed_dict)
  File "/home/prf/anaconda3/envs/prfenv/lib/python3.7/site-packages/tensorflow/python/client/session.py", line 929, in...
```

The LSTM expects input of shape [seq_len, batch_size, emb_size]; the code needs a transpose.
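The conversion being suggested is a batch-major to time-major transpose. A minimal NumPy sketch (NumPy standing in for the TensorFlow op `tf.transpose(x, perm=[1, 0, 2])`; the concrete sizes are illustrative):

```python
import numpy as np

batch_size, seq_len, emb_size = 32, 100, 300
x = np.zeros((batch_size, seq_len, emb_size))  # batch-major, as produced by embedding lookup

# Time-major layout expected by the cell: [seq_len, batch_size, emb_size].
# Swap the first two axes and leave the embedding axis in place.
x_time_major = np.transpose(x, (1, 0, 2))
print(x_time_major.shape)  # (100, 32, 300)
```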

```
ub16c9@ub16c9-gpu:/media/ub16c9/fcd84300-9270-4bbd-896a-5e04e79203b7/ub16_prj/ESIM-tf$ python3.6 Train.py
Using TensorFlow backend.
CMD : python3 Train.py --num_epochs 300 --batch_size 32 --dropout_keep_prob 0.5 --clip_value 10 --learning_rate 0.0004 --l2 0.0 --seq_length 100 --optimizer adam --early_stop_step 5000000 --threshold 0...
```

Hi there, thanks for sharing the code. For the attention part in **model.py**, your code is:

```python
attentionSoft_b = tf.nn.softmax(tf.transpose(attentionWeights))
attentionSoft_b = tf.transpose(attentionSoft_b)
```

while I feel like it should be:...
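One way the transpose trick can be written so that the softmax runs over the intended sequence axis is to swap only the two sequence axes (perm=[0, 2, 1] in `tf.transpose` terms) rather than reversing all axes. A NumPy sketch under that assumption (the `softmax` helper is illustrative), verifying it matches a direct axis-1 softmax:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

w = np.random.randn(2, 3, 4)  # (batch, len_a, len_b)

# Swap only the sequence axes, take a last-axis softmax, then swap back.
fixed = np.transpose(softmax(np.transpose(w, (0, 2, 1))), (0, 2, 1))

# Equivalent to normalizing directly over axis 1 (the len_a axis):
print(np.allclose(fixed, softmax(w, axis=1)))  # True
```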

Thanks for sharing, but I found something puzzling in Utils.py (line 193). Is it a bug? ![image](https://user-images.githubusercontent.com/40337309/50218638-8ff3b700-03c7-11e9-82aa-34bafba07d7a.png)

Hello! What is the problem here? ![image](https://user-images.githubusercontent.com/45895061/50547877-46715a80-0c58-11e9-89e6-f82984d810c2.png)