
Support exporting models

Open · iislucas opened this issue 6 years ago · 0 comments

By default, an Estimator restores from the most recent checkpoint; see https://www.tensorflow.org/get_started/checkpoints. Note that while checkpoints store only model weights, the whole graph plus weights (i.e. a SavedModel) can also be restored. That looks like the right abstraction, and it may obviate the need for build_parsing_serving_input_receiver_fn, which exports a model that takes a TF.Example proto as input.
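
For reference, a minimal sketch of how that default is resolved (TF 1.x API; the model_dir path below is a hypothetical placeholder): the Estimator restores the newest checkpoint recorded in its model directory, which can also be queried directly.

```python
import tensorflow as tf

# Hypothetical model directory where an Estimator has written checkpoints.
model_dir = '/tmp/my_estimator_model'

# tf.train.latest_checkpoint returns the path of the most recent checkpoint
# (e.g. '/tmp/my_estimator_model/model.ckpt-10000'), which is what the
# Estimator restores by default.
print(tf.train.latest_checkpoint(model_dir))
```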

Something like (Thanks to @dborkan for the pointers!):

import tensorflow as tf

feature_spec = {'sentence': tf.FixedLenFeature(dtype=tf.string, shape=1)}
serving_input_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)

# Note: `estimator` below is an instance of the TF Estimator class.
estimator.export_savedmodel(<destination_directory>, serving_input_fn)
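
For completeness, a hedged sketch of consuming the result (TF 1.x; the export path is a hypothetical placeholder): export_savedmodel writes a timestamped SavedModel under the destination directory, which can be loaded with tf.contrib.predictor and fed serialized TF.Example protos.

```python
import tensorflow as tf

# Hypothetical path to one timestamped export written by export_savedmodel.
export_dir = '/tmp/exports/1530230400'

# Build a callable from the SavedModel's serving signature.
predict_fn = tf.contrib.predictor.from_saved_model(export_dir)

# The parsing serving input fn expects serialized tf.Example protos; the
# input key is 'examples' by default (verify with `saved_model_cli show`
# if the signature differs).
example = tf.train.Example(features=tf.train.Features(feature={
    'sentence': tf.train.Feature(
        bytes_list=tf.train.BytesList(value=[b'some input sentence']))
}))
print(predict_fn({'examples': [example.SerializeToString()]}))
```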

This seems to fit naturally into the base_model.py abstraction. Still to be figured out: what is the right way to specify which checkpoint to export?
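
One possible answer, assuming the TF 1.x Estimator API: export_savedmodel accepts an optional checkpoint_path argument, which defaults to the latest checkpoint in model_dir, so base_model.py could simply surface that parameter. A sketch, with hypothetical placeholder paths:

```python
# Export from an explicit checkpoint rather than the most recent one.
# Both paths below are hypothetical placeholders.
estimator.export_savedmodel(
    '/tmp/exports',
    serving_input_fn,
    checkpoint_path='/tmp/my_estimator_model/model.ckpt-5000')
```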

iislucas · Jun 29 '18