serve
Serve, optimize and scale PyTorch models in production
### 📚 The doc issue I didn't see anything relating to this in the docs, so I figured I'd post the question here. I am wondering where data is stored...
### 📚 The doc issue To maximize throughput, I am looking to have the client batch multiple frames into a single request before sending to TS. However, I am struggling...
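One way to do this on the client side is to pack all frames into a single payload; a rough sketch follows (not TorchServe's own API: the `frames` field, model name, and JPEG encoding are assumptions that a matching custom handler would have to decode).

```python
import base64

import cv2
import requests


def send_frames(frames, url="http://localhost:8080/predictions/my_model"):
    # Pack every frame into one JSON body; the server-side handler must know
    # how to unpack the hypothetical "frames" field.
    payload = {
        "frames": [
            base64.b64encode(cv2.imencode(".jpg", f)[1].tobytes()).decode("utf-8")
            for f in frames
        ]
    }
    resp = requests.post(url, json=payload)
    resp.raise_for_status()
    return resp.json()
```

Server-side batching (the `batch_size` and `max_batch_delay` options when registering a model) is worth comparing, since it batches across concurrent requests without changing the request format.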
### 🚀 The feature Upgrade to PT 1.12. ### Motivation, pitch PT 1.12 was released; TS needs to upgrade to PT 1.12. ### Alternatives _No response_ ### Additional context _No response_
### 🚀 The feature TorchServe automatically loads and unloads models based on incoming requests. Suppose I have registered 3 models in TorchServe; if one of the models...
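Until something like this exists, the closest manual equivalent is driving the management API yourself; a rough sketch, assuming the default management port 8081:

```python
import requests

MANAGEMENT = "http://localhost:8081"


def load_model(mar_url, name):
    # Register the model and start one worker for it.
    r = requests.post(
        f"{MANAGEMENT}/models",
        params={"url": mar_url, "model_name": name, "initial_workers": 1},
    )
    r.raise_for_status()


def unload_model(name):
    # Unregister the model to release its workers and memory.
    r = requests.delete(f"{MANAGEMENT}/models/{name}")
    r.raise_for_status()
```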
### 🐛 Describe the bug I did some image composition, i.e. overlaying one image on top of another using cv2. After running this command, I get different results on local...
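The exact command is not shown above; a minimal composition along these lines (filenames and blend weights are placeholders) illustrates the kind of overlay described:

```python
import cv2

base = cv2.imread("base.jpg")
overlay = cv2.imread("overlay.jpg")
# Resize the overlay to match the base image, then alpha-blend the two.
overlay = cv2.resize(overlay, (base.shape[1], base.shape[0]))
blended = cv2.addWeighted(base, 0.7, overlay, 0.3, 0)
cv2.imwrite("blended.jpg", blended)
```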
### 🐛 Describe the bug I am using 2 GPUs. TorchServe inference returns correct values only for predictions run on cuda:0. ### Error logs `x = "text to embed"`...
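One common cause of this pattern is a handler that hard-codes `cuda:0` instead of using the GPU TorchServe assigns to each worker. A minimal custom-handler sketch (class name, file name, and input format are hypothetical) that derives the device from the worker's `gpu_id`:

```python
import os

import torch


class EmbeddingHandler:
    def initialize(self, context):
        props = context.system_properties
        gpu_id = props.get("gpu_id")
        # Use the GPU assigned to this worker, not a hard-coded cuda:0.
        self.device = torch.device(
            f"cuda:{gpu_id}" if torch.cuda.is_available() and gpu_id is not None else "cpu"
        )
        model_path = os.path.join(props.get("model_dir"), "model.pt")
        self.model = torch.jit.load(model_path, map_location=self.device).eval()

    def handle(self, data, context):
        # Assumes each request carries a numeric feature vector under "data"/"body".
        batch = torch.tensor([row.get("data") or row.get("body") for row in data])
        with torch.no_grad():
            out = self.model(batch.to(self.device))  # keep inputs on the worker's device
        return out.cpu().tolist()
```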
### 🐛 Describe the bug I followed the [HuggingFace example](https://github.com/pytorch/serve/tree/master/examples/Huggingface_Transformers) to deploy the GPT2 model, but a CORS error occurred when the model was called from the front end. I configured it in `config.properties`...
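For reference, TorchServe's CORS behaviour is controlled by three keys in `config.properties`; a sketch with placeholder values (CORS stays disabled unless `cors_allowed_origin` is set):

```properties
cors_allowed_origin=https://my-frontend.example.com
cors_allowed_methods=GET, POST, PUT, OPTIONS
cors_allowed_headers=X-Custom-Header
```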
### 🐛 Describe the bug Our latest doc build is crashing (https://github.com/pytorch/serve/runs/7361642275?check_suite_focus=true) with this error: `ModuleNotFoundError: No module named 'ts_scripts'`. @svekars, does this look familiar? ### Error logs `Run...`
### 🚀 The feature ## Motivation To install torchserve today, users have to run `python ts_scripts/dependencies.py` with the optional CUDA argument, e.g. `--cuda=cu102`, and then install the torchserve binaries on top....
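For context, the current two-step flow looks roughly like this (assuming the dependency script referred to is `ts_scripts/install_dependencies.py` in the serve repo):

```bash
# Step 1: install torch and the matching CUDA build of the dependencies.
python ts_scripts/install_dependencies.py --cuda=cu102
# Step 2: install the TorchServe binaries on top.
pip install torchserve torch-model-archiver torch-workflow-archiver
```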
## Description In the current implementation, at the time of `docker build`, a Docker image is created with the latest version of `torchserve` installed, regardless of the version of the...
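A sketch of what a version-aware build could look like, using a build argument to pin the release (the base image, arg name, and default value are placeholders, not the repo's actual Dockerfile):

```dockerfile
ARG TS_VERSION=0.6.0

FROM python:3.8-slim
ARG TS_VERSION
# A production image would also need a JDK and the other runtime dependencies;
# this only illustrates pinning the torchserve version at build time.
RUN pip install --no-cache-dir torchserve==${TS_VERSION} torch-model-archiver
```

The version would then be chosen per build, e.g. `docker build --build-arg TS_VERSION=0.5.3 .`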