
Docker Container for Support between tf-io and tf-serving.

Ouwen opened this issue on Aug 08 '19 · 11 comments

Using tf.io ops in the tf.serving ecosystem would be a large development convenience and would likely decrease inference latency. Could there be an official Docker build, or documentation, for integrating tensorflow-io with tensorflow-serving?

Related issue: tensorflow/io#414

Ouwen avatar Aug 08 '19 19:08 Ouwen

It's a reasonable request. In the short term, you can build the ops into ModelServer using this guide. However, as you noted, the dev experience would be much neater if you could just download a Docker image with the ops already built in. We're looking into how we want to support custom ops internally. Two questions: are you interested in particular ops in tf.io, or all of them? And why do you expect inference latency to depend on whether you link them into the model server yourself versus us releasing a package with them linked in?
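For reference, that guide essentially boils down to adding the op libraries to the model server's Bazel deps; a minimal sketch, where the tensorflow-io repository name and target label are assumptions that would need to be confirmed against the tensorflow/io workspace:

```starlark
# tensorflow_serving/model_servers/BUILD (sketch)
SUPPORTED_TENSORFLOW_OPS = [
    # ...existing op libraries...
    # Hypothetical tensorflow-io dependency: the repository name and
    # label are assumptions and must match how the tensorflow/io repo
    # is wired into the workspace.
    "@org_tensorflow_io//tensorflow_io/core:ops",
]
```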

misterpeddy avatar Aug 08 '19 19:08 misterpeddy

@unclepeddy there wouldn't be decreased inference latency from linking into the model server itself. Before model-server custom ops, we would parse files into protobufs client-side and send gRPC requests to tf-serving (quite inefficient). This would mainly be a dev-experience improvement.
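For context, the old flow looked roughly like this; a minimal sketch, where `parse_dicom_somehow`, the model name, and the input key are all hypothetical:

```python
import grpc
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2, prediction_service_pb2_grpc

# All parsing happens client-side: decode the DICOM file to a pixel
# array before the request ever reaches tf-serving.
pixels = parse_dicom_somehow("scan.dcm")  # hypothetical helper

channel = grpc.insecure_channel("localhost:8500")
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

request = predict_pb2.PredictRequest()
request.model_spec.name = "my_model"  # hypothetical model name
request.model_spec.signature_name = "serving_default"
# Serialize the already-parsed pixels into a TensorProto for the request.
request.inputs["image"].CopyFrom(tf.make_tensor_proto(pixels))

response = stub.Predict(request, 10.0)  # 10 second timeout
```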

Ideally, all tf.io ops could be supported. My main interest is in the DICOM operations.
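For concreteness, a model using those ops could be exported along these lines; a minimal sketch built on `tfio.image.decode_dicom_image` (the export path and output names are illustrative):

```python
import tensorflow as tf
import tensorflow_io as tfio

class DicomModel(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([], tf.string)])
    def serve(self, dicom_bytes):
        # Decode raw DICOM file bytes to pixel data inside the graph,
        # so clients can send file contents directly to the server.
        image = tfio.image.decode_dicom_image(dicom_bytes, dtype=tf.uint16)
        return {"image": image}

model = DicomModel()
tf.saved_model.save(
    model, "/tmp/dicom_model",  # illustrative export path
    signatures={"serving_default": model.serve})
```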

Ouwen avatar Aug 08 '19 21:08 Ouwen

I see - so in order to officially support tf.io ops, we need to get a better understanding of their policies (I've asked a question in the issue you opened in their repo).

To parallelize the work, though, I can look into whether we would run into any technical challenges if we were to do this (for example, integrating tf.text ops into Serving has proven fairly challenging because of issues we ran into when linking all their dependencies statically). Could you provide me with a test model that uses tf.io ops, along with example requests?

misterpeddy avatar Aug 09 '19 22:08 misterpeddy

@unclepeddy yes, I can send you a small test model - thanks for the help. I'll send it to your email [email protected] unless there is another address you'd like me to use.

Ouwen avatar Aug 11 '19 16:08 Ouwen

That works. Please make sure to include example requests as well. Thanks, @Ouwen!
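For shape, an example request against a model like the sketch above might look like this (the model name, port, and file are hypothetical, and it assumes the exported signature takes raw file bytes):

```python
import base64
import json
import requests  # assumed available in the client environment

with open("scan.dcm", "rb") as f:  # hypothetical DICOM file
    dicom_b64 = base64.b64encode(f.read()).decode("utf-8")

# TF Serving's REST API encodes binary inputs as {"b64": ...}.
payload = {"instances": [{"b64": dicom_b64}]}

resp = requests.post(
    "http://localhost:8501/v1/models/dicom_model:predict",
    data=json.dumps(payload))
print(resp.json())
```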

misterpeddy avatar Aug 12 '19 14:08 misterpeddy

@marcelolerendegui

Ouwen avatar Aug 13 '19 16:08 Ouwen

> It's a reasonable request. In the short term, you can build the ops into ModelServer using this guide.

Hi @unclepeddy. Thanks for the documentation referral. Unfortunately, despite the guidance, I have not been able to successfully include tf-io in a custom model server build. I think the problems I've had are related to the particular way tf-io is built (as described in their README). If you have some time, could you look into how this can be accomplished and report back? I'm sure it would be much appreciated by all.

Cheers.

tinder-michaelallman avatar Aug 16 '19 17:08 tinder-michaelallman

Is there any update on this? Has anyone managed to create a tensorflow-serving Dockerfile with tfio support?

lminer avatar Jan 15 '21 19:01 lminer

@Ouwen,

Are you still looking for a resolution? We are planning to prioritise issues based on community interest. As a workaround, you can follow the TF Serving custom ops guide to create a custom TF Serving build with tensorflow-io support. Thanks.
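Roughly, that workaround amounts to rebuilding ModelServer from source with the extra ops linked in and repackaging the binary; a minimal Dockerfile sketch, assuming a local `serving/` checkout already patched per the guide (image tags and paths may differ):

```dockerfile
# Sketch: build a patched ModelServer, then copy it into a release image.
FROM tensorflow/serving:latest-devel AS build
# Assumes ./serving is a tensorflow/serving checkout with the
# tensorflow-io op deps already added to the model server's BUILD file.
COPY serving /serving
WORKDIR /serving
RUN bazel build -c opt tensorflow_serving/model_servers:tensorflow_model_server

FROM tensorflow/serving:latest
COPY --from=build \
    /serving/bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server \
    /usr/bin/tensorflow_model_server
```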

singhniraj08 avatar May 18 '23 10:05 singhniraj08

This issue has been marked stale because it has had no activity for 7 days. It will be closed if no further activity occurs. Thank you.

github-actions[bot] avatar May 26 '23 01:05 github-actions[bot]

Why would you "modularize" components but then not support their ops in tensorflow-serving?

gbildson avatar May 26 '23 19:05 gbildson