
a1_seq2seq_attention_train

tangdouer opened this issue 6 years ago · 1 comment

When running a1_seq2seq_attention_train.py I hit the error below. I would appreciate your help.

ValueError: Variable W_initial_state1 already exists, disallowed. Did you mean to set reuse=True in VarScope? Originally defined at:

File "/home/qiu/PycharmProjects/text_classification-master/a06_Seq2seqWithAttention/a1_seq2seq_attention_model.py", line 173, in instantiate_weights self.W_initial_state1 = tf.get_variable("W_initial_state1", shape=[self.hidden_size, self.hidden_size*2], initializer=self.initializer) File "/home/qiu/PycharmProjects/text_classification-master/a06_Seq2seqWithAttention/a1_seq2seq_attention_model.py", line 40, in init self.instantiate_weights() File "/home/qiu/PycharmProjects/text_classification-master/a06_Seq2seqWithAttention/a1_seq2seq_attention_model.py", line 233, in test vocab_size, embed_size,hidden_size, is_training,decoder_sent_length=decoder_sent_length,l2_lambda=l2_lambda)

tangdouer · Jun 27 '18 11:06

Check whether the test method in model.py is being invoked. The traceback shows test() (line 233) constructing the model, and building it a second time in the same graph re-creates W_initial_state1, which raises this error; a sketch of two workarounds follows.
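A minimal sketch of two common workarounds, assuming TensorFlow 1.x (the scope name and shape below are illustrative):

```python
import tensorflow as tf

# Workaround 1: clear the default graph before constructing the model
# again, e.g. if both training and test() build a model in one process.
tf.reset_default_graph()

# Workaround 2: allow repeated construction to reuse existing variables
# instead of failing when the name is already taken.
with tf.variable_scope("seq2seq_attention", reuse=tf.AUTO_REUSE):
    w = tf.get_variable("W_initial_state1", shape=[100, 200])
```

Either way, the key point is that the same graph must not try to create W_initial_state1 twice without reuse enabled.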


brightmart · Jun 27 '18 15:06