Attention_Network_With_Keras
Duplicate node name in graph
Hi,
Thank you for your example.
I'm trying to use this Attention example in my LSTM model. However, in the attention layer definition, the line

h = Lambda(lambda X: K.zeros(shape=(K.shape(X)[0], n_h)))(X)

raises the error ValueError: Duplicate node name in graph: 'lambda_16/zeros/packed'.
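For reference, here is a simplified sketch of how that line is typically used to build a batch-sized zero initial state; everything apart from the Lambda call itself (layer sizes, surrounding layers) is a placeholder, not the notebook's exact code:

```python
# Simplified sketch -- layer sizes and surrounding layers are placeholders,
# not the notebook's exact code; only the Lambda line is taken from the issue.
from keras import backend as K
from keras.layers import Input, Lambda, LSTM

n_h = 64            # placeholder hidden-state size
Tx, n_x = 30, 37    # placeholder sequence length / feature count

X = Input(shape=(Tx, n_x))
# Batch-sized all-zeros tensors used as the LSTM's initial hidden and cell states.
h = Lambda(lambda x: K.zeros(shape=(K.shape(x)[0], n_h)))(X)
c = Lambda(lambda x: K.zeros(shape=(K.shape(x)[0], n_h)))(X)
out = LSTM(n_h, return_sequences=True)(X, initial_state=[h, c])
```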
Do you know how to solve it? Thanks!
Sorry for not responding earlier - have you resolved this issue? I personally have not used TF in over a year (having switched to PyTorch), though I vaguely recall this being an issue with re-running the network setup without restarting the Jupyter notebook.
If you're still interested in resolving this and have a set of steps to reproduce it, I can probably solve it much more quickly.
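In case it helps, a minimal sketch of that workaround, assuming the error does come from rebuilding the model in the same Jupyter kernel: clearing the Keras backend session resets the default graph and the auto-generated layer names, so re-running the model-construction cell no longer produces clashing 'lambda_*/zeros/packed' node names. Here build_model is a hypothetical stand-in for the notebook's model-construction code, not a function the notebook actually defines.

```python
# Possible workaround sketch, assuming the error comes from rebuilding the model
# in the same Jupyter kernel. build_model is a hypothetical stand-in for the
# notebook's model-construction code.
from keras import backend as K

K.clear_session()      # drop the old default graph and reset auto-generated layer names
model = build_model()  # re-running this cell no longer accumulates duplicate nodes

# Optionally, give the zero-state Lambda an explicit, descriptive name so the
# generated graph nodes are easier to track, e.g.:
# h = Lambda(lambda x: K.zeros(shape=(K.shape(x)[0], n_h)), name='init_state_h')(X)
```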
Hello, I have also run into this problem. I searched a lot but could not solve it; I would sincerely appreciate your help.