NARRE

Running train.py

Open vr25 opened this issue 6 years ago • 3 comments

I am using TensorFlow 0.12.1 and Python 2.7.12 (as mentioned) but I am still running into the following issue:

:~/Downloads/NARRE-master/model$ python train.py

Parameters:
ALLOW_SOFT_PLACEMENT=True
BATCH_SIZE=100
DROPOUT_KEEP_PROB=0.5
EMBEDDING_DIM=300
FILTER_SIZES=3
L2_REG_LAMBDA=0.001
LOG_DEVICE_PLACEMENT=False
NUM_EPOCHS=40
NUM_FILTERS=100
PARA_DATA=../data/music/music.para
TRAIN_DATA=../data/music/music.train
VALID_DATA=../data/music/music.test
WORD2VEC=../data/google.bin

Loading data... 5541 3568 16 446 32 446

Traceback (most recent call last):
  File "train.py", line 144, in <module>
    n_latent=32)
  File "/home/rawtev/Downloads/NARRE-master/model/NARRE.py", line 122, in __init__
    tf.einsum('ajk,kl->ajl', self.h_drop_u, Wau) + tf.einsum('ajk,kl->ajl', self.iid_a, Wru) + bau),
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/ops/special_math_ops.py", line 212, in einsum
    axes_to_sum)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/ops/special_math_ops.py", line 341, in _einsum_reduction
    product = _reshape_if_necessary(product, uncompacted_shape)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/ops/special_math_ops.py", line 366, in _reshape_if_necessary
    return array_ops.reshape(tensor, new_shape)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/ops/gen_array_ops.py", line 2448, in reshape
    name=name)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/framework/op_def_library.py", line 493, in apply_op
    raise err
TypeError: Expected binary or unicode string, got None

Thanks a lot!

vr25 · Sep 22 '18 16:09

@vr25 Hello, have you solved this problem?

sshzhang · Oct 23 '18 01:10

@vr25 I solved it; you can replace it with the following code. In fact, this may be a Python version problem.

# Original line (NARRE.py, around line 122) that raised the TypeError:
# self.u_j = tf.einsum('ajk,kl->ajl', tf.nn.relu(
#     tf.einsum('ajk,kl->ajl', self.h_drop_u, Wau) + tf.einsum('ajk,kl->ajl', self.iid_a, Wru) + bau),
#     Wpu) + bbu  # None, review_num_u, 1

# Replacement: express each batched einsum as reshape + matmul
sm111 = tf.reshape(self.h_drop_u, shape=[-1, num_filters_total])
sm11 = tf.matmul(sm111, Wau)
sm1 = tf.reshape(sm11, shape=[-1, review_num_u, attention_size])
sm2 = tf.reshape(tf.matmul(tf.reshape(self.iid_a, shape=[-1, embedding_id]), Wru),
                 shape=[-1, review_num_u, attention_size])
sm3 = tf.nn.relu(sm1 + sm2 + bau)
self.u_j = tf.reshape(tf.matmul(tf.reshape(sm3, shape=[-1, attention_size]), Wpu),
                      shape=[-1, review_num_u, 1]) + bbu  # None, review_num_u, 1
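For reference, here is a minimal standalone NumPy sketch (shapes are made up; none of these values come from NARRE.py) checking that the reshape + matmul form computes the same contraction as the original einsum. It looks like the TypeError comes from TF 0.12's einsum helper putting the unknown (None) batch dimension into the shape it passes to reshape, which the explicit reshape with -1 avoids.

import numpy as np

# Placeholder sizes standing in for (batch, review_num_u, num_filters_total, attention_size)
batch, review_num_u, num_filters_total, attention_size = 2, 5, 8, 4

h_drop_u = np.random.rand(batch, review_num_u, num_filters_total).astype(np.float32)
Wau = np.random.rand(num_filters_total, attention_size).astype(np.float32)

# Original formulation: contract the last axis of h_drop_u with the first axis of Wau
via_einsum = np.einsum('ajk,kl->ajl', h_drop_u, Wau)

# Workaround: flatten the leading axes, do a plain matmul, then restore the shape
via_matmul = np.matmul(h_drop_u.reshape(-1, num_filters_total), Wau) \
    .reshape(batch, review_num_u, attention_size)

assert np.allclose(via_einsum, via_matmul, atol=1e-5)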

sshzhang · Oct 25 '18 07:10

@sshzhang It works, but there is another problem at line 145 with the same issue, so I wrote replacement code for that error as well.

# Original line (NARRE.py, around line 145) with the same einsum problem:
# self.i_j = tf.einsum('ajk,kl->ajl', tf.nn.relu(
#     tf.einsum('ajk,kl->ajl', self.h_drop_i, Wai) + tf.einsum('ajk,kl->ajl', self.uid_a, Wri) + bai), Wpi) + bbi

# Same reshape + matmul replacement, for the item side
sm111 = tf.reshape(self.h_drop_i, shape=[-1, num_filters_total])
sm11 = tf.matmul(sm111, Wai)
sm1 = tf.reshape(sm11, shape=[-1, review_num_i, attention_size])
sm2 = tf.reshape(tf.matmul(tf.reshape(self.uid_a, shape=[-1, embedding_id]), Wri),
                 shape=[-1, review_num_i, attention_size])
sm3 = tf.nn.relu(sm1 + sm2 + bai)
self.i_j = tf.reshape(tf.matmul(tf.reshape(sm3, shape=[-1, attention_size]), Wpi),
                      shape=[-1, review_num_i, 1]) + bbi  # None, review_num_i, 1
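Since the user side and the item side follow the same pattern, one could also factor the workaround into a small helper. This is only a hypothetical sketch, not part of the original repo; the function name and argument order are my own.

import tensorflow as tf

def attention_scores(h_drop, id_emb, W_h, W_id, b_att, W_p, b_out,
                     review_num, num_filters_total, embedding_id, attention_size):
    # h_drop: (None, review_num, num_filters_total); id_emb: (None, review_num, embedding_id)
    s1 = tf.reshape(tf.matmul(tf.reshape(h_drop, [-1, num_filters_total]), W_h),
                    [-1, review_num, attention_size])
    s2 = tf.reshape(tf.matmul(tf.reshape(id_emb, [-1, embedding_id]), W_id),
                    [-1, review_num, attention_size])
    s3 = tf.nn.relu(s1 + s2 + b_att)
    # Returns attention scores of shape (None, review_num, 1)
    return tf.reshape(tf.matmul(tf.reshape(s3, [-1, attention_size]), W_p),
                      [-1, review_num, 1]) + b_out

# Hypothetical usage matching the two blocks above:
# self.u_j = attention_scores(self.h_drop_u, self.iid_a, Wau, Wru, bau, Wpu, bbu,
#                             review_num_u, num_filters_total, embedding_id, attention_size)
# self.i_j = attention_scores(self.h_drop_i, self.uid_a, Wai, Wri, bai, Wpi, bbi,
#                             review_num_i, num_filters_total, embedding_id, attention_size)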

sh0416 · Jan 03 '20 06:01