Save model for inference later
Thanks for the example code. I experimented with text classification using character_rnn (https://github.com/baidu-research/tensorflow-allreduce/blob/master/tensorflow/examples/learn/text_classification_character_rnn.py).
How can I write a serving_input_fn for it? I want to save and restore this model.
I extended the code to save the model, but I'm getting an error. Please help:
import tensorflow as tf
from tensorflow.contrib.learn.python.learn.utils import input_fn_utils
feature_spec = {"feature":tf.FixedLenFeature([100],tf.int64)}
serving_input_fn = input_fn_utils.build_parsing_serving_input_fn(feature_spec)
and then
classifier.export_savedmodel(export_dir_base='model', serving_input_receiver_fn=serving_input_fn)
and I get this error:
TypeError: Failed to convert object of type <class 'dict'> to Tensor. Contents: {'feature': <tf.Tensor 'ParseExample/ParseExample:0' shape=(?, 100) dtype=int64>}. Consider casting elements to a supported type.
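For context on what I think is going on (a guess, not a confirmed diagnosis): a parsing serving input fn always hands the model a dict of tensors keyed by the feature_spec keys, while a training input_fn that yields a bare tensor makes the model graph expect a single Tensor, so at export time TensorFlow tries to convert the whole dict to one Tensor and fails. A minimal sketch below, using the TF 2.x-style names tf.io.FixedLenFeature / tf.io.parse_example (same parsing behavior as the build_parsing_serving_input_fn path, as far as I understand it), shows that parsing serialized tf.train.Example protos yields a dict that has to be unpacked with the "feature" key:

```python
import tensorflow as tf

# Build one serialized tf.train.Example carrying a 100-long int64 "feature" list,
# mimicking what a client would send to the exported model.
example = tf.train.Example(features=tf.train.Features(feature={
    "feature": tf.train.Feature(
        int64_list=tf.train.Int64List(value=[0] * 100))
}))
serialized = tf.constant([example.SerializeToString()])

# Same spec as in the question.
feature_spec = {"feature": tf.io.FixedLenFeature([100], tf.int64)}

# parse_example returns a dict {"feature": <Tensor shape (batch, 100)>},
# NOT a single Tensor -- this dict is what the error message is printing.
parsed = tf.io.parse_example(serialized, feature_spec)
x = parsed["feature"]  # shape (1, 100); the model graph needs this tensor, not the dict
```

So the likely fixes are either to make the training input_fn return its features as a dict under the same "feature" key, or to use a custom serving_input_fn that unpacks `parsed["feature"]` before handing it to the model; I haven't verified which the character_rnn example needs.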