
tf.train.import_meta_graph

Open zhuantouer opened this issue 8 years ago • 1 comment

Hi @dennybritz, when I use tf.train.import_meta_graph to evaluate the trained model on new samples, loading the graph is very slow. Do you know why? Is there another way to load the trained model? The official tutorial uses tf.train.Saver(), but that is complicated: you have to redefine the variables and then restore them. Loading the meta graph is very convenient, but it is so slow. Do you have any suggestions? Thanks
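
For context, here is a minimal sketch of the import_meta_graph approach I mean (the checkpoint path and tensor names below are just placeholders, not necessarily the ones in this repo):

    import tensorflow as tf

    # Rebuild the graph from the .meta file and restore the weights from the
    # checkpoint. Convenient (no need to redefine the model), but slow to load.
    checkpoint = 'runs/checkpoints/model-44000'  # placeholder path
    with tf.Session() as sess:
        saver = tf.train.import_meta_graph(checkpoint + '.meta')
        saver.restore(sess, checkpoint)
        # look up tensors by name (names depend on how the model was built)
        input_x = sess.graph.get_operation_by_name('input_x').outputs[0]
        dropout_keep_prob = sess.graph.get_operation_by_name('dropout_keep_prob').outputs[0]
        predictions = sess.graph.get_operation_by_name('output/predictions').outputs[0]
        # run predictions for new samples x_test (already mapped to vocabulary ids)
        # preds = sess.run(predictions, {input_x: x_test, dropout_keep_prob: 1.0})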

zhuantouer avatar Nov 15 '16 12:11 zhuantouer

Yes, import_meta_graph requires loading the .meta file and then restoring the model from the checkpoint. That is useful if you want to continue training from the checkpoint, but for inference/testing it will be slow. Instead, you can use import_graph_def on a frozen graph. Here is an example:

import tensorflow as tf

graph = tf.Graph()
with graph.as_default():
    # the frozen .pb is a binary protobuf, so read it in binary mode
    with open('frozen_graph44000.pb', 'rb') as myfile:
        print("loading graphDef in memory")
        bindata = myfile.read()
    output_graph_def = tf.GraphDef()
    output_graph_def.ParseFromString(bindata)
    _ = tf.import_graph_def(output_graph_def, name="")
    sess = tf.Session(config=session_conf)  # session_conf: your tf.ConfigProto
    with sess.as_default():
        # do your inference stuff here
        pass

ref: https://www.tensorflow.org/versions/r0.10/how_tos/tool_developers/index.html#freezing
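
For completeness, here is a rough sketch of how such a frozen .pb could be produced from a checkpoint. The checkpoint prefix and output node name are assumptions; also note that in older TF releases convert_variables_to_constants lives under graph_util in the Python framework package, while in TF 1.x it is tf.graph_util:

    import tensorflow as tf

    checkpoint = 'model-44000'  # assumed checkpoint prefix; adjust to your run
    with tf.Session() as sess:
        saver = tf.train.import_meta_graph(checkpoint + '.meta')
        saver.restore(sess, checkpoint)
        # fold variable values into constants so the graph file is self-contained
        frozen_def = tf.graph_util.convert_variables_to_constants(
            sess, sess.graph.as_graph_def(), ['output/predictions'])
        with open('frozen_graph44000.pb', 'wb') as f:
            f.write(frozen_def.SerializeToString())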

dhwajraj avatar Dec 07 '16 08:12 dhwajraj