
A flexible, high-performance serving system for machine learning models

Results: 184 serving issues (sorted by recently updated)

### Describe the problem the feature is intended to solve TensorFlow's pluggable device architecture offers a plugin mechanism for registering devices with TensorFlow without the need to make changes in...

type:feature
stat:awaiting tensorflower

## Bug Report If this is a bug report, please fill out the following form in full: ### System information - **OS Platform and Distribution (e.g., Linux Ubuntu 16.04)**: -...

stat:awaiting tensorflower
type:bug

I've been having a hell of a time trying to figure out how to serve multiple models using a yaml configuration file for K8s. I can run directly in Bash using...

stat:awaiting tensorflower
type:support
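For the multi-model case above, the usual approach is a ModelServerConfig text file passed via --model_config_file; in a Kubernetes spec the same flags go into the container's args. A minimal sketch, with hypothetical model names and paths:

```
# models.config -- ModelServerConfig in protobuf text format (names and paths are placeholders)
model_config_list {
  config {
    name: "model_a"
    base_path: "/models/model_a"
    model_platform: "tensorflow"
  }
  config {
    name: "model_b"
    base_path: "/models/model_b"
    model_platform: "tensorflow"
  }
}
```

```
# Start the server against that file (in K8s, put these flags in the container's args)
tensorflow_model_server \
  --port=8500 \
  --rest_api_port=8501 \
  --model_config_file=/models/models.config
```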

### Describe the problem the feature is intended to solve When building tfx r2.8-rc0 with mkl support, I see the following issue: ``` ERROR: /root/.cache/bazel/_bazel_root/c206fe4b7a49887ed31d86472abc6776/external/org_tensorflow/tensorflow/core/common_runtime/BUILD:1739:11: Couldn't build file external/org_tensorflow/tensorflow/core/common_runtime/_objs/threadpool_device/threadpool_device.o: C++...

type:build/install
stat:awaiting response

Can we reload the batch config after the server starts? I see the model config has such a function. _Originally posted by @mikezhang95 in https://github.com/tensorflow/serving/issues/344#issuecomment-631395316_

type:feature
stat:awaiting tensorflower
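As far as I know, batching parameters are only read once at startup from --batching_parameters_file, whereas the model config file can be re-polled at runtime. A sketch of the relevant startup flags, with placeholder paths:

```
# Model config can be re-polled periodically; batching parameters are read once at startup.
tensorflow_model_server \
  --port=8500 \
  --rest_api_port=8501 \
  --model_config_file=/config/models.config \
  --model_config_file_poll_wait_seconds=60 \
  --enable_batching=true \
  --batching_parameters_file=/config/batching.config
```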

Steps: 1) Download [st5-large](https://tfhub.dev/google/sentence-t5/st5-large/1) 2) Load it from a tensorflow-serving docker container, latest-gpu or nightly-gpu 3) Send an inference request to the model 4) Observe error log: ``` [evhttp_server.cc :...

stat:awaiting tensorflower
type:bug
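A reproduction along the lines described above; the local path and the request payload are assumptions (the exact input signature can be checked with saved_model_cli show):

```
# Serve the downloaded SavedModel from a GPU image; the model must live under a
# numeric version directory, e.g. /tmp/st5-large/1/saved_model.pb (path is a placeholder)
docker run --rm --gpus all -p 8501:8501 \
  -v /tmp/st5-large:/models/st5 \
  -e MODEL_NAME=st5 \
  tensorflow/serving:latest-gpu

# Send a REST predict request (payload is an assumption for a sentence-encoder input)
curl -X POST http://localhost:8501/v1/models/st5:predict \
  -d '{"inputs": ["An example sentence to embed."]}'
```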

Please go to Stack Overflow for help and support: https://stackoverflow.com/questions/tagged/tensorflow-serving If you open a GitHub issue, here is our policy: 1. It must be a bug, a feature request, or...

stat:awaiting tensorflower
type:bug

## Feature Request If this is a feature request, please fill out the following form in full: ### Describe the problem the feature is intended to solve I'm always frustrated...

type:feature
stat:awaiting tensorflower

This PR depends on https://github.com/tensorflow/serving/pull/1953

cla: yes
ready to pull

When I use TF-Serving with batching options and variable-length inputs, I can get errors like `Tensors with name 'example_feature:0' from different tasks have different shapes and padding is turned...

type:docs
stat:awaiting tensorflower
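The error above typically means padding is disabled in the batch scheduler. A minimal sketch of a batching parameters file that turns it on, assuming the pad_variable_length_inputs field of BatchingParameters and placeholder sizes:

```
# batching.config -- BatchingParameters in protobuf text format (values are placeholders)
max_batch_size { value: 32 }
batch_timeout_micros { value: 1000 }
num_batch_threads { value: 8 }
max_enqueued_batches { value: 100 }
pad_variable_length_inputs: true
```

The file is passed to the server with --enable_batching=true --batching_parameters_file=/config/batching.config.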