How to get prediction request and response logs in TensorFlow Serving?
I'm just wondering how I can get the request and response log output. I run TF Serving in a Docker environment.
@hurun, we have a similar feature request #2069 filed. Requesting you to close this issue, and to follow and +1 the similar issue for updates. Thank you!
I think https://github.com/tensorflow/serving/issues/2069 is not similar to what I meant.
Generally we can use `docker logs <container name>` to check a container's logs. Unfortunately, `tensorflow_model_server --port=8500 --rest_api_port=8501 --model_name=${MODEL_NAME} --model_base_path=${MODEL_BASE_PATH}/${MODEL_NAME}` does not emit request and response logs.
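Since the model server itself doesn't print per-request logs, one common workaround is to put a small logging reverse proxy in front of the REST port and point clients at the proxy instead. Below is a minimal sketch using only the Python standard library; the upstream address (`localhost:8501`), the listen port (`9000`), and the assumption that clients send JSON POSTs to the `:predict` endpoint are illustrative choices, not anything TF Serving provides.

```python
"""Minimal logging reverse proxy for the TF Serving REST API (a sketch)."""
import logging
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

UPSTREAM = "http://localhost:8501"  # assumed TF Serving REST endpoint
LISTEN_PORT = 9000                  # assumed port that clients call instead

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")


class LoggingProxyHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read and log the incoming request body.
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        logging.info("request  %s %s", self.path, body.decode("utf-8", "replace"))

        # Forward the request unchanged to TF Serving.
        req = urllib.request.Request(
            UPSTREAM + self.path,
            data=body,
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        try:
            with urllib.request.urlopen(req) as resp:
                status, resp_body = resp.status, resp.read()
        except urllib.error.HTTPError as err:
            status, resp_body = err.code, err.read()

        # Log the response and relay it back to the client.
        logging.info("response %s %s", status, resp_body.decode("utf-8", "replace"))
        self.send_response(status)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(resp_body)))
        self.end_headers()
        self.wfile.write(resp_body)


if __name__ == "__main__":
    HTTPServer(("0.0.0.0", LISTEN_PORT), LoggingProxyHandler).serve_forever()
```

With this running, a client would POST to e.g. `http://localhost:9000/v1/models/${MODEL_NAME}:predict` instead of port 8501, and each request/response body shows up in the proxy's stdout (and hence in `docker logs` if the proxy runs in a container). Note this only covers the REST API; logging gRPC traffic on port 8500 would need a gRPC interceptor instead.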
This issue has been marked stale because it has had no recent activity in the past 7 days. It will be closed if no further activity occurs. Thank you.
This issue was closed due to lack of activity after being marked stale for the past 7 days.