Docker container for integrating tf-io with tf-serving
Using the tf.io ops in the tf.serving ecosystem would be a large development convenience and likely decrease inference latency. Can there be an official docker build or documentation to integrate tensorflow-io with tensorflow-serving?
Related issue: tensorflow/io #414
It's a reasonable request. For the short term, you can build the ops into ModelServer using this guide. However, as you noted, the dev experience would be much neater if you could just download a Docker image with the ops already built in. We're looking into how we want to support custom ops internally. Two questions: are you interested in particular ops in tf.io, or all of them? And why do you expect decreased inference latency depending on whether you link them into the model server yourself versus us releasing a package with them linked in?
@unclepeddy there wouldn't be decreased inference latency from linking with the model server. Before custom ops were available in the model server, we would parse files into protobufs on the client and send gRPC requests to tf-serving (quite inefficient; a rough sketch of that flow is shown below). This would mainly be a dev experience improvement.
Ideally, all tf.io ops would be supported. My main interest is in the DICOM operations.
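For readers unfamiliar with the client-side workaround described above, a rough sketch of it might look like the following; the model name, input key, host, and port are illustrative assumptions rather than details from this thread.

```python
import grpc
import tensorflow as tf
import tensorflow_io as tfio
from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2_grpc

# Decode the DICOM file on the client -- the step this thread is about
# moving into the model server by linking in the tf.io kernels.
raw = tf.io.read_file("sample.dcm")
pixels = tf.cast(tfio.image.decode_dicom_image(raw), tf.float32)

channel = grpc.insecure_channel("localhost:8500")
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

request = predict_pb2.PredictRequest()
request.model_spec.name = "dicom_model"               # hypothetical model name
request.model_spec.signature_name = "serving_default"
# Ship the already-decoded pixel tensor over gRPC: a large payload plus
# extra client-side work, which is the inefficiency described above.
request.inputs["image"].CopyFrom(tf.make_tensor_proto(pixels.numpy()))

response = stub.Predict(request, 10.0)
print(response.outputs)
```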
I see - so in order to officially support tf.io ops, we need to get a better understanding of their policies (I've asked a question in the issue you opened in their repo).
To parallelize the work, though, I can try to see whether we would run into any technical challenges if we were to do this (for example, integrating tf.text ops into Serving has proven fairly challenging because of issues we run into when linking all of their dependencies statically). Could you provide me with a test model that uses tf.io ops, along with example requests?
@unclepeddy yes, I can send you a small test model - thanks for the help. I'll send it to your email [email protected] unless there is another email you'd like me to use.
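For anyone else who wants to reproduce this, a minimal sketch of such a test model might look like the code below; the export path, signature name, and output key are assumptions for illustration, not details of the actual model exchanged here.

```python
import tensorflow as tf
import tensorflow_io as tfio


class DicomModule(tf.Module):
    """Wraps the tfio DICOM decode op behind a serving signature."""

    @tf.function(input_signature=[tf.TensorSpec(shape=[], dtype=tf.string)])
    def serve(self, dicom_bytes):
        # Decoding happens inside the graph, so the serving binary must have
        # the tensorflow-io DICOM kernels linked in to execute this signature.
        image = tfio.image.decode_dicom_image(dicom_bytes)
        return {"pixels": tf.cast(image, tf.float32)}


module = DicomModule()
tf.saved_model.save(
    module,
    "/tmp/dicom_model/1",  # version subdirectory, as TF Serving expects
    signatures={"serving_default": module.serve},
)
```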
That works. Please be sure to include example requests as well. Thanks @Ouwen!
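An example request against a model exported like the sketch above could send the raw DICOM bytes so that decoding happens server side; again, the model name and input key are assumptions.

```python
import grpc
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2_grpc

with open("sample.dcm", "rb") as f:
    dicom_bytes = f.read()

channel = grpc.insecure_channel("localhost:8500")
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

request = predict_pb2.PredictRequest()
request.model_spec.name = "dicom_model"
request.model_spec.signature_name = "serving_default"
# Send the raw file bytes; the server-side graph performs the DICOM decoding.
request.inputs["dicom_bytes"].CopyFrom(
    tf.make_tensor_proto(dicom_bytes, dtype=tf.string, shape=[]))

response = stub.Predict(request, 10.0)
print(response.outputs["pixels"])
```

Sending raw bytes like this is what makes the DICOM kernels a hard requirement on the serving side: if they are not linked into ModelServer, the model will fail to load or serve because the op is not registered in the binary.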
@marcelolerendegui
> It's a reasonable request. For the short term, you can build the ops into ModelServer using this guide.
Hi @unclepeddy. Thanks for the documentation referral. Unfortunately, despite the guidance I have not been able to successfully include tf-io in a custom model server build. I think the problems I've had are related to the particular way tf-io is built (as described in their README). If you have some time, could you look into how this can be accomplished and report back? I'm sure it would be much appreciated by all.
Cheers.
Is there any update on this? Has anyone managed to create a tensorflow-serving Dockerfile with tfio support?
@Ouwen,
Are you still looking for a resolution? We are planning to prioritise issues based on community interest. As a workaround, you can follow the TF Serving custom ops guide to create a custom TF Serving build with tensorflow-io support. Thanks.
This issue has been marked stale because it has had no recent activity for 7 days. It will be closed if no further activity occurs. Thank you.
Why would you "modularize" components but then not support their ops in tensorflow-serving?