R-net

TypeError: An op outside of the function building code is being passed a Graph tensor

Open · khaledmustafa91 opened this issue 4 years ago · 0 comments

Hello, I took this code and ran it on Colab. When I run it I get `TypeError: An op outside of the function building code is being passed a Graph tensor`. I didn't change anything in the model code, and I noticed that `sv.should_stop()` changes from `False` to `True` after this line:

        if init: sess.run(model.emb_assign, {model.word_embeddings_placeholder:glove})

Because `should_stop()` becomes `True`, the training loop hits the `break` on its first iteration and exits without doing any training.
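Note: `Supervisor.managed_session` catches exceptions raised inside its block and flips `should_stop()` to `True`, so the silent break is hiding the real failure from the first `sess.run`. Since this is TF1-style graph/session code (Supervisor, placeholders, `sess.run`) running under TF 2.x on Colab, a minimal sketch of the usual workaround, assuming TF 2.x is installed, is to disable eager execution before the graph is built:

```python
import tensorflow as tf

# TF 2.x runs eagerly by default, which is incompatible with TF1-style
# graph/session code (tf.compat.v1.train.Supervisor, placeholders, sess.run).
# Disabling eager execution before building the graph restores the v1 workflow.
tf.compat.v1.disable_eager_execution()

print(tf.executing_eagerly())  # False once eager execution is disabled
```

This call must run before any graphs or models are constructed, so it belongs at the top of the script or notebook.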

    with model.graph.as_default():
        config = tf.compat.v1.ConfigProto()
        config.gpu_options.allow_growth = True
        sv = tf.compat.v1.train.Supervisor(logdir=Params.logdir,
                                           save_model_secs=0,
                                           global_step=model.global_step,
                                           init_op=model.init_op)
        print("\n" + str(sv.should_stop()) + "\n")
        with sv.managed_session(config=config) as sess:
            print("\n before sess Here \n")
            if init: sess.run(model.emb_assign, {model.word_embeddings_placeholder: glove})
            print("\n after sess Here \n")
            print("\n" + str(sv.should_stop()) + "\n")
            print("\n" + str(init) + "\n")
            for epoch in range(1, Params.num_epochs + 1):
                if sv.should_stop():
                    print("\n break \n")
                    break
                for step in tqdm(range(model.num_batch), total=model.num_batch, ncols=70, leave=False, unit='b'):
                    sess.run(model.train_op)
                    if step % Params.save_steps == 0:
                        gs = sess.run(model.global_step)
                        print("\n Global step = " + str(gs) + "\n")
                        sv.saver.save(sess, Params.logdir + '/model_epoch_%d_step_%d' % (gs // model.num_batch, gs % model.num_batch))
                        sample = np.random.choice(dev_ind, Params.batch_size)
                        feed_dict = {data: devdata[i][sample] for i, data in enumerate(model.data)}
                        index, dev_loss = sess.run([model.output_index, model.mean_loss], feed_dict=feed_dict)
                        F1, EM = 0.0, 0.0
                        print("\n before batch")
                        for batch in range(Params.batch_size):
                            print("\n inside batch")
                            f1, em = f1_and_EM(index[batch], devdata[8][sample][batch], devdata[0][sample][batch], dict_)
                            F1 += f1
                            EM += em
                        print("\n after batch")
                        F1 /= float(Params.batch_size)
                        EM /= float(Params.batch_size)
                        sess.run(model.metric_assign, {model.F1_placeholder: F1, model.EM_placeholder: EM, model.dev_loss_placeholder: dev_loss})
                        print("\nDev_loss: {}\nDev_Exact_match: {}\nDev_F1_score: {}".format(dev_loss, EM, F1))

The error:

    TypeError: An op outside of the function building code is being passed a "Graph" tensor.
    It is possible to have Graph tensors leak out of the function building context by
    including a tf.init_scope in your function building code. For example, the following
    function will fail:

        @tf.function
        def has_init_scope():
            my_constant = tf.constant(1.)
            with tf.init_scope():
                added = my_constant * 2

    The graph tensor has name: global_step:0
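The key clue is the last line: `global_step:0` is a tensor created inside `model.graph`, but under TF 2.x something outside that graph (eager/`tf.function` machinery) is trying to consume it. A minimal sketch of the situation, assuming TF 2.x with eager execution on (no R-net model needed; only the graph/tensor relationship matters):

```python
import tensorflow as tf

# A value created inside an explicit v1-style graph is symbolic: it only has
# meaning within that graph and a session running it. Handing it to eager ops
# outside the graph is what produces the "Graph tensor" TypeError above.
graph = tf.Graph()
with graph.as_default():
    global_step = tf.compat.v1.train.get_or_create_global_step()

print(global_step.graph is graph)  # the global step belongs to the v1 graph
# tf.add(global_step, 1)  # consuming it outside the graph/session fails in eager mode
```

This is why the fix has to keep everything (graph construction, `Supervisor`, `sess.run`) inside one consistent v1-style execution mode rather than mixing graph tensors into eager code.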

khaledmustafa91 · Mar 31 '20 12:03