About the size of the compiled Docker image
Please go to Stack Overflow for help and support:
https://stackoverflow.com/questions/tagged/tensorflow-serving
If you open a GitHub issue, here is our policy:
- It must be a bug, a feature request, or a significant problem with documentation (for small docs fixes please send a PR instead).
- The form below must be filled out.
Here's why we have that policy: TensorFlow developers respond to issues. We want to focus on work that benefits the whole community, e.g., fixing bugs and adding features. Support only helps individuals. GitHub also notifies thousands of people when issues are filed. We want them to see you communicating an interesting problem, rather than being redirected to Stack Overflow.
Feature Request
If this is a feature request, please fill out the following form in full:
Describe the problem the feature is intended to solve
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
Describe the solution
A clear and concise description of what you want to happen.
Describe alternatives you've considered
A clear and concise description of any alternative solutions or features you've considered.
Additional context
Add any other context or screenshots about the feature request here.
Bug Report
If this is a bug report, please fill out the following form in full:
System information
- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Ubuntu 18.04
- TensorFlow Serving installed from (source or binary): source
- TensorFlow Serving version: 2.4.1
Describe the problem
Because we modified the source code, we recompiled it and packaged it as a Docker image. Here are my steps (a rough command sketch follows them):
-
Build a devel image: copy the source code into the container, compile it there, and then use docker commit to package the container as an image. The size of this image is 8 GB.
-
Use the Dockerfile.gpu file under tensorflow_serving/tools/docker and change the base image to the devel image just generated:
change:
ARG TF_SERVING_BUILD_IMAGE=tensorflow/serving:${TF_SERVING_VERSION}-devel-gpu
to:
ARG TF_SERVING_BUILD_IMAGE=tensorflow/serving:my_new_devel
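For reference, the two steps could look roughly like the following. This is only a sketch: the container name (my_build_container), the devel tag (my_new_devel), and the output tag are placeholders, and passing TF_SERVING_BUILD_IMAGE via --build-arg is simply an alternative to editing the ARG line in Dockerfile.gpu.
# Step 1: after compiling the modified source inside a devel container,
# snapshot that container as a new devel image (~8 GB in this report).
docker commit my_build_container tensorflow/serving:my_new_devel
# Step 2: build the runtime image from Dockerfile.gpu against that devel image.
docker build \
    -f tensorflow_serving/tools/docker/Dockerfile.gpu \
    --build-arg TF_SERVING_BUILD_IMAGE=tensorflow/serving:my_new_devel \
    -t my_org/serving:2.4.1-gpu .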
But the resulting image is 5.35 GB, compared with only about 200 MB for the image pulled with:
docker pull tensorflow/serving
So what's the problem?
Exact Steps to Reproduce
Please include all steps necessary for someone to reproduce this issue on their own machine. If not, skip this section.
Source code / logs
Include any logs or source code that would be helpful to diagnose the problem. If including tracebacks, please include the full traceback. Large logs and files should be attached. Try to provide a reproducible test case that is the bare minimum necessary to generate the problem.
@arghyaganguly @lilao @ewilderj @kchodorow Any suggestions?
This is the command I ran:
sudo docker build -f /home/WorkSpace/gProject/tfserving/tensorflow_serving/tensorflow_serving/tools/docker/Dockerfile.gpu -t tfserving_runntime_2.4.1_official .
No changes were made to Dockerfile.gpu, but the resulting image is 5.13 GB.
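To see which layers account for that size, the per-layer breakdown of the built image can be listed (a sketch; the tag matches the build command above):
docker history tfserving_runntime_2.4.1_official
# prints one row per layer with its size; in a GPU image, the layers that
# install the NVIDIA runtime libraries via apt-get are typically the largest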
@JCCVW
Could you please confirm if this is still an issue. Thanks!
@JCCVW,
The devel GPU image for TF Serving is about 5 GB, of which ~1 GB is TF Serving and its build artifacts, with the rest being dependencies pulled in from apt-get. The :latest-devel-gpu image
includes all source dependencies and the toolchain (cuda9/cudnn7) needed for development, along with a compiled binary that works on NVIDIA GPUs, which leads to the larger image size.
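As a quick sanity check, the per-layer sizes of that devel image can be inspected directly (a sketch; docker history works on any locally available tag):
docker pull tensorflow/serving:latest-devel-gpu
docker history tensorflow/serving:latest-devel-gpu
# the apt-get layers that install the toolchain and CUDA/cuDNN dependencies
# should account for most of the ~5 GB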
Thank you!
This issue was closed due to lack of activity after being marked stale for past 14 days.