tensorrt-inference topic

Repositories tagged with the tensorrt-inference topic:

TensorRT-YOLOv8-ByteTrack (127 stars, 15 forks)
An object tracking project with YOLOv8 and ByteTrack, accelerated with C++ and TensorRT.

nanosam-cpp (38 stars, 3 forks)
C++ TensorRT implementation of NanoSAM.

tensorrt_smoke (30 stars, 5 forks)
C++ inference code for the SMOKE 3D object detection model.

bevdet-tensorrt-cpp (229 stars, 36 forks)
BEVDet implemented in C++ with TensorRT; achieves real-time performance on NVIDIA Orin.

yolo8-segmentation-deploy (36 stars, 11 forks)
Production-ready YOLOv8 segmentation deployment with TensorRT and ONNX support for CPU/GPU, including AI model integration guidance for Unitlab Annotate.

ComfyUI-Depth-Anything-Tensorrt (77 stars, 6 forks)
ComfyUI Depth Anything (v1/v2) TensorRT custom node (up to 14x faster).

Yolo-TensorRT (15 stars, 6 forks)
Converts YOLO models to ONNX and TensorRT, adding batched NMS (NMSBatched).

tensorrt-experiment (16 stars, 1 fork)
Based on TensorRT 8.2.4; compares inference speed across different TensorRT APIs.