TensorRT 6 for detection + classification
Using gpu-rest-engine, I created a service that can detect, classify, or detect and then classify. The first step is detection with SSD using a single class (other than background); I then take crops from the bounding boxes and run them through a GoogLeNet classifier.
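Roughly, the per-request flow looks like this (a minimal sketch; `detector` and `classifier` are hypothetical stand-ins for the TensorRT engines, not the actual gpu-rest-engine API):

```python
def detect_then_classify(image, detector, classifier, score_threshold=0.5):
    """image: HxWxC array; detector yields (score, x1, y1, x2, y2) tuples
    with normalized coordinates; classifier returns class probabilities."""
    results = []
    for score, x1, y1, x2, y2 in detector(image):
        if score < score_threshold:
            continue  # drop low-confidence detections
        h, w = image.shape[:2]
        # Convert normalized box corners to pixel indices and crop.
        crop = image[int(y1 * h):int(y2 * h), int(x1 * w):int(x2 * w)]
        # Run the GoogLeNet classifier on the crop.
        results.append((score, (x1, y1, x2, y2), classifier(crop)))
    return results
```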
Would I be able to do the same with the TRT Inference Server, or would I need to use https://github.com/NVIDIA/tensorrt-laboratory?
Thanks!! gpu-rest-engine has worked very well, but I want to upgrade to TensorRT 6 and am running into issues.
Hello @mkh-github. You should ask the TRTIS team directly on their GitHub; TRTIS is open source too: https://github.com/NVIDIA/tensorrt-inference-server
They will be in a better position to answer you.
TRTIS has an ensemble API which allows you to chain models together.
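For your detect-then-classify case, the ensemble config would look roughly like the sketch below. Model and tensor names here are hypothetical, the crop step in the middle would need something like a custom backend, and you should check the TRTIS docs for the exact schema:

```
name: "detect_then_classify"
platform: "ensemble"
max_batch_size: 1
input [
  { name: "IMAGE"  data_type: TYPE_FP32  dims: [ 3, 300, 300 ] }
]
output [
  { name: "CLASS_PROBS"  data_type: TYPE_FP32  dims: [ 1000 ] }
]
ensemble_scheduling {
  step [
    {
      # Stage 1: single-class SSD detector.
      model_name: "ssd_detector"
      model_version: -1
      input_map { key: "data" value: "IMAGE" }
      output_map { key: "detection_out" value: "BBOXES" }
    },
    {
      # Stage 2: crops the detected boxes out of the original image
      # (would need a custom backend or similar).
      model_name: "crop_boxes"
      model_version: -1
      input_map { key: "image" value: "IMAGE" }
      input_map { key: "boxes" value: "BBOXES" }
      output_map { key: "crops" value: "CROPS" }
    },
    {
      # Stage 3: GoogLeNet classifier on the crops.
      model_name: "googlenet_classifier"
      model_version: -1
      input_map { key: "data" value: "CROPS" }
      output_map { key: "prob" value: "CLASS_PROBS" }
    }
  ]
}
```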