Functionality with TensorFlow Serving
Do you have any intuition as to how this model could be served using TensorFlow Serving? The part I'm struggling with is converting the incoming serialized data (that was encoded as a sentence, say "That movie was the best one I have ever seen.") back into a regular string to be passed into getSentenceMatrix(), and then through the network.
All the examples of Serving in action that I can find use images rather than text; the file I am (sort of) replicating is: https://github.com/tensorflow/serving/blob/master/tensorflow_serving/example/inception_saved_model.py.
In particular this part:
# Input transformation.
serialized_tf_example = tf.placeholder(tf.string, name='tf_example')
feature_configs = {'image/encoded': tf.FixedLenFeature(shape=[], dtype=tf.string)}
tf_example = tf.parse_example(serialized_tf_example, feature_configs)
jpegs = tf_example['image/encoded']
images = tf.map_fn(preprocess_image, jpegs, dtype=tf.float32)
# ^ Essentially this step but relating to strings instead of an image (.jpeg)
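Not the author's actual serving code, but a hypothetical sketch of the string analogue of that input transformation. The Python side is runnable as-is; `word_to_index` and `max_seq_length` are toy stand-ins for whatever vocabulary and fixed sequence length the model was trained with, and the TF 1.x graph wiring (which would need a real session/graph) is shown in comments.

```python
import numpy as np

# Toy stand-ins for the model's real vocabulary and sequence length.
max_seq_length = 10
word_to_index = {"that": 4, "movie": 7, "was": 2, "the": 1, "best": 9}

def sentence_to_matrix(sentence_bytes):
    """Roughly what getSentenceMatrix() does: decode the incoming
    serialized string and map words to vocabulary indices.

    Tensors of dtype tf.string arrive in Python as bytes, so the first
    step is decoding back to a regular string. Unknown words fall back
    to index 0 here, and the result is padded to a fixed length.
    """
    sentence = sentence_bytes.decode("utf-8")
    words = sentence.lower().split()
    ids = [word_to_index.get(w.strip(".,!?"), 0) for w in words[:max_seq_length]]
    ids += [0] * (max_seq_length - len(ids))  # pad to fixed length
    return np.array(ids, dtype=np.int32)

# In the serving graph (TF 1.x), this Python function could be wrapped
# with tf.py_func so it runs on each parsed tf.string element, mirroring
# the tf.map_fn(preprocess_image, jpegs, ...) step in the inception example:
#
#   serialized_tf_example = tf.placeholder(tf.string, name='tf_example')
#   feature_configs = {'sentence': tf.FixedLenFeature(shape=[], dtype=tf.string)}
#   tf_example = tf.parse_example(serialized_tf_example, feature_configs)
#   sentences = tf_example['sentence']
#   ids = tf.map_fn(
#       lambda s: tf.py_func(sentence_to_matrix, [s], tf.int32),
#       sentences, dtype=tf.int32)
```

One caveat with `tf.py_func`: it embeds a Python callable in the graph, so the exported SavedModel is only usable in a process that has that Python code available, which partly defeats the point of TensorFlow Serving; a fully graph-native vocabulary lookup (e.g. a lookup table) avoids that.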
Any help is much appreciated! Thank You!!
Hey man, I've definitely been meaning to learn more about serving pretrained models, but unfortunately I don't have much experience with it, so I'm not sure about the answer to your question.
I'll keep this issue open in case anyone else knows and I'll let you know if I find anything once I learn more about it LOL.
(Btw, I do have some code where I show how to load a pretrained model into a Flask server if you'd like to see that https://github.com/adeshpande3/Chatbot-Flask-Server. Doesn't use Tensorflow Serving per se, but it could shed light on some other approaches you could take)
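As a rough illustration of that Flask approach (not the code in the linked repo): load the model once at startup and expose an endpoint that accepts the raw sentence as JSON, so no tf.Example serialization is needed at all. `predict_sentiment` is a stub standing in for the real getSentenceMatrix() + restored-graph pipeline.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def predict_sentiment(sentence):
    # Stub: in a real server this would call getSentenceMatrix(sentence)
    # and feed the matrix through the restored TensorFlow graph.
    return "positive" if "best" in sentence.lower() else "negative"

@app.route("/predict", methods=["POST"])
def predict():
    # The client sends plain JSON, e.g. {"sentence": "That movie was the best"},
    # so the string arrives ready to preprocess -- no parse_example step needed.
    payload = request.get_json(force=True)
    sentence = payload.get("sentence", "")
    return jsonify({"sentence": sentence,
                    "sentiment": predict_sentiment(sentence)})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

The trade-off versus TensorFlow Serving: you lose batching, versioning, and the gRPC interface, but you sidestep the serialized-input problem entirely because preprocessing stays in ordinary Python.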
Hi, did you ever succeed in deploying this model with TensorFlow Serving? I'd like to know more.