Building TensorFlow Serving with TCMalloc
Feature Request
Describe the problem the feature is intended to solve
Recently I found that building TensorFlow Serving with tcmalloc and setting a soft limit can mitigate these kinds of memory issues: https://github.com/tensorflow/serving/issues/2142 , https://github.com/tensorflow/serving/issues/1664. It also gave me slightly better performance.
Here's what I did.
- Build the `tensorflow_model_server` cc_binary with tcmalloc using the `malloc` argument.
- Launch a background thread before the server starts using `tcmalloc::MallocExtension::ProcessBackgroundActions` (reference); see the sketch after this list.
- Set a soft limit, and add a `tcmalloc_soft_limit` argument for it.
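For context, here is a minimal sketch of what the last two steps could look like with the google/tcmalloc `MallocExtension` API. The helper names are hypothetical, the `--tcmalloc_soft_limit` flag is the one proposed above (not an existing TF Serving flag), and the exact `SetMemoryLimit` signature varies between tcmalloc versions (this assumes the `MemoryLimit`-struct variant), so treat it as an illustration rather than TF Serving code.

```cpp
// Illustrative only: helper names are hypothetical, and SetMemoryLimit's
// signature differs across tcmalloc versions.
#include <cstddef>
#include <thread>

#include "tcmalloc/malloc_extension.h"

// Run tcmalloc's periodic background work (e.g. returning free memory to the
// OS). ProcessBackgroundActions() never returns, so it is launched on a
// detached thread before the server starts serving.
void StartTcmallocBackgroundThread() {
  std::thread([] { tcmalloc::MallocExtension::ProcessBackgroundActions(); })
      .detach();
}

// Apply a soft memory limit. Unlike a hard limit, a soft limit makes tcmalloc
// release memory more aggressively once usage crosses the threshold instead of
// aborting the process. The value would come from the proposed
// --tcmalloc_soft_limit argument.
void SetTcmallocSoftLimit(size_t bytes) {
  tcmalloc::MallocExtension::MemoryLimit limit;
  limit.limit = bytes;
  limit.hard = false;  // soft limit
  tcmalloc::MallocExtension::SetMemoryLimit(limit);
}
```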
Describe the solution
How about providing TensorFlow Serving compiled with tcmalloc? I know I could use jemalloc instead, but jemalloc has many configuration options, while tcmalloc doesn't. It is easy to use and has great performance.
@jeongukjae,
Thank you for filing this feature request. We will discuss this feature implementation internally and update this thread. Thanks.