
Some problems when running rnncapsule.

Status: Open · ZhengDanYang1 opened this issue 4 years ago · 2 comments

When running rnncapsule, the following error occurred.

ValueError                                Traceback (most recent call last)
in
----> 1 model = Gru_Capsule_Model(word_seq_len, word_embedding, classification)

in Gru_Capsule_Model(sent_length, embeddings_weight, class_num)
     25 embed = SpatialDropout1D(0.2)(embedding(content))
     26 x = Bidirectional(CuDNNGRU(200, return_sequences=True))(embed)
---> 27 capsule = Capsule(num_capsule=Num_capsule, dim_capsule=Dim_capsule, routings=Routings, share_weights=True)(x)
     28 capsule = Flatten()(capsule)
     29 x = Dense(1000)(capsule)

~/anaconda3/lib/python3.7/site-packages/keras/engine/base_layer.py in __call__(self, inputs, **kwargs)
    487     # Actually call the layer,
    488     # collecting output(s), mask(s), and shape(s).
--> 489     output = self.call(inputs, **kwargs)
    490     output_mask = self.compute_mask(inputs, previous_mask)
    491

~/ZDY/2018-daguan-competition-master/biGruModel/glove/util.py in call(self, u_vecs)
     95     outputs = self.activation(K.batch_dot(c, u_hat_vecs, [2, 2]))
     96     if i < self.routings - 1:
---> 97         b = K.batch_dot(outputs, u_hat_vecs, [2, 2])
     98
     99     return outputs

~/anaconda3/lib/python3.7/site-packages/keras/backend/tensorflow_backend.py in batch_dot(x, y, axes)
   1497     str(x_shape) + ' and ' + str(y_shape) +
   1498     ' with axes=' + str(axes) + '. x.shape[%d] != '
-> 1499     'y.shape[%d] (%d != %d).' % (axes[0], axes[1], d1, d2))
   1500
   1501     # backup ndims. Need them later.

ValueError: Can not do batch_dot on inputs with shapes (None, 10, 10, 16) and (None, 10, None, 16) with axes=[2, 3]. x.shape[2] != y.shape[3] (10 != 16).
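For reference, the two routing contractions around lines 95–97 can be expressed with explicit einsum subscripts, which pins down exactly which axes are contracted regardless of how `K.batch_dot` numbers them in a given Keras release. A minimal NumPy sketch of those contractions (the shapes below are illustrative assumptions, not taken from the repo; `input_num_capsule` is set to 5 just for the demo):

```python
import numpy as np

batch, num_capsule, input_num_capsule, dim_capsule = 2, 10, 5, 16

# c:     routing coefficients, (batch, num_capsule, input_num_capsule)
# u_hat: prediction vectors,   (batch, num_capsule, input_num_capsule, dim_capsule)
c = np.random.rand(batch, num_capsule, input_num_capsule)
u_hat = np.random.rand(batch, num_capsule, input_num_capsule, dim_capsule)

# Line 95 intent: contract the input_num_capsule axis of c against u_hat
outputs = np.einsum('bij,bijd->bid', c, u_hat)
assert outputs.shape == (batch, num_capsule, dim_capsule)

# Line 97 intent: contract the dim_capsule axis of outputs against u_hat;
# with einsum the contracted axes are written out, so there is no ambiguity
# about which axis index each argument uses
b = np.einsum('bid,bijd->bij', outputs, u_hat)
assert b.shape == (batch, num_capsule, input_num_capsule)
```

The traceback shows `outputs` arriving at line 97 as a 4-D tensor even though the code expects 3-D, which is why the contracted lengths no longer match (10 != 16); spelling the contraction out as above avoids depending on `batch_dot`'s version-specific squeezing behavior.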

ZhengDanYang1 · Dec 02 '19 14:12


Could this be a Keras version mismatch?
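A quick way to follow up on the version question is to print what is actually installed and compare it against what the repo was developed with (judging by the `tensorflow_backend.py` paths in the traceback, a TF-1.x-era standalone Keras; that is an inference from the paths, not something the repo states). A small sketch:

```python
import importlib

def installed_version(pkg):
    """Return the package's __version__, or a marker if absent/uninstalled."""
    try:
        mod = importlib.import_module(pkg)
        return getattr(mod, "__version__", "unknown")
    except ImportError:
        return "not installed"

for pkg in ("keras", "tensorflow"):
    print(pkg, installed_version(pkg))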

hecongqing · Dec 03 '19 02:12

Thanks, I'll give it a try.

ZhengDanYang1 · Dec 03 '19 03:12