23 repositories owned by Triton Inference Server

server (7.5k stars, 1.4k forks, 113 watchers)

The Triton Inference Server provides an optimized cloud and edge inferencing solution.

client (455 stars, 218 forks)

Triton Python, C++, and Java client libraries, and gRPC-generated client examples for Go, Java, and Scala.
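As a hedged sketch of how the Python HTTP client from this repository is typically used (the model name `example_model` and the tensor names `INPUT0`/`OUTPUT0` are placeholders assumed for illustration, not taken from this listing):

```python
import numpy as np


def make_dummy_input(shape=(1, 16)):
    """Build a float32 batch of zeros shaped for the hypothetical model."""
    return np.zeros(shape, dtype=np.float32)


if __name__ == "__main__":
    # tritonclient is installed with `pip install tritonclient[http]`;
    # imported here so the helper above stays usable without a server.
    import tritonclient.http as httpclient

    # Connect to a Triton server listening on the default HTTP port.
    client = httpclient.InferenceServerClient(url="localhost:8000")

    data = make_dummy_input()
    inp = httpclient.InferInput("INPUT0", list(data.shape), "FP32")
    inp.set_data_from_numpy(data)

    # Run inference and read back the named output as a numpy array.
    result = client.infer("example_model", inputs=[inp])
    print(result.as_numpy("OUTPUT0"))
```

The gRPC client (`tritonclient.grpc`) follows the same input/output pattern against port 8001.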

model_analyzer (386 stars, 73 forks)

Triton Model Analyzer is a CLI tool that helps you understand the compute and memory requirements of Triton Inference Server models.

python_backend (446 stars, 185 forks)

Triton backend that enables pre-processing, post-processing, and other logic to be implemented in Python.
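A Python backend model is a `model.py` file exposing a class named `TritonPythonModel`. The sketch below shows that expected shape; the `pb_utils` request/response helpers exist only inside the Triton process, so the pre-processing step here is plain Python for illustration (the pixel normalization is an assumed example, not part of the repository):

```python
def preprocess(pixels):
    """Assumed example pre-processing step: scale 0-255 pixel values to [0, 1]."""
    return [p / 255.0 for p in pixels]


class TritonPythonModel:
    """Required class name; Triton's Python backend loads it from model.py."""

    def initialize(self, args):
        # args is a dict carrying model configuration and instance info.
        self.model_name = args.get("model_name", "unknown")

    def execute(self, requests):
        # Inside Triton, each request's tensors are read with
        # pb_utils.get_input_tensor_by_name(request, "INPUT0") and
        # responses are built with pb_utils.InferenceResponse(...).
        responses = []
        for request in requests:
            responses.append(request)  # placeholder pass-through
        return responses

    def finalize(self):
        # Called once when the model is unloaded.
        pass
```

Triton calls `initialize` at load, `execute` for every inference batch, and `finalize` at unload.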

dali_backend (117 stars, 27 forks)

Triton backend for running GPU-accelerated data pre-processing pipelines implemented with DALI's Python API.

model_navigator (160 stars, 24 forks)

Triton Model Navigator is an inference toolkit designed for optimizing and deploying deep learning models, with a focus on NVIDIA GPUs.

backend (238 stars, 76 forks)

Common source, scripts, and utilities for creating Triton backends.

common (54 stars, 72 forks)

Common source, scripts, and utilities shared across all Triton repositories.

core (79 stars, 85 forks)

The core library and APIs implementing the Triton Inference Server.