Siamese-LSTM
Hi, when I set training=True the error below is raised. Is this a type mismatch?
mike@Vostro-3653:~/works/Siamese-LSTM$ python main.py
Loading Word2Vec
Traceback (most recent call last):
File "main.py", line 6, in <module>
sls=lstm("bestsem.p",load=True,training=True)
File "/home/mike/works/Siamese-LSTM/lstm.py", line 299, in __init__
self.f_grad_shared, self.f_update = adadelta(lr, tnewp, grads,emb11,mask11,emb21,mask21,y, cost)
File "/home/mike/works/Siamese-LSTM/lstm.py", line 188, in adadelta
name='adadelta_f_grad_shared')
File "/home/mike/.local/lib/python2.7/site-packages/theano/compile/function.py", line 320, in function
output_keys=output_keys)
File "/home/mike/.local/lib/python2.7/site-packages/theano/compile/pfunc.py", line 442, in pfunc
no_default_updates=no_default_updates)
File "/home/mike/.local/lib/python2.7/site-packages/theano/compile/pfunc.py", line 207, in rebuild_collect_shared
raise TypeError(err_msg, err_sug)
TypeError: ('An update must have the same type as the original shared variable (shared_var=1lstm1_U_rgrad2, shared_var.type=TensorType(float32, matrix), update_val=Elemwise{add,no_inplace}.0, update_val.type=TensorType(float64, matrix)).', 'If the difference is related to the broadcast pattern, you can call the tensor.unbroadcast(var, axis_to_unbroadcast[, ...]) function to remove broadcastable dimensions.')
Please set the Theano flags, e.g. THEANO_FLAGS=floatX=float32,device=gpu (etc.).
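
For reference, one way to apply that suggestion is a minimal sketch like the one below: setting THEANO_FLAGS in the environment (or in ~/.theanorc) before Theano is imported makes float32 the default dtype, so the adadelta update expressions keep the same dtype as the shared variables they update, which is exactly what the TypeError above complains about. The device=gpu value is taken from the reply and assumes the old GPU backend; drop it or use device=cpu otherwise.

import os

# Sketch: must run before theano is imported anywhere in the process,
# because theano.config reads THEANO_FLAGS at import time.
os.environ['THEANO_FLAGS'] = 'floatX=float32,device=gpu'

import theano
assert theano.config.floatX == 'float32'

Equivalently, the flags can be passed on the command line for a single run, e.g. THEANO_FLAGS=floatX=float32 python main.py.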