inference-server topic

Repositories tagged with the inference-server topic:

BMW-YOLOv4-Inference-API-GPU

280 stars · 73 forks

A repository for a no-code object detection inference API using YOLOv3 and YOLOv4 with the Darknet framework.

BMW-YOLOv4-Inference-API-CPU

219 stars · 64 forks

A repository for a no-code object detection inference API using YOLOv4 and YOLOv3 with OpenCV.

BMW-TensorFlow-Inference-API-CPU

186 stars · 49 forks

A repository for an object detection inference API using the TensorFlow framework.

pinferencia

558 stars · 85 forks

Python + inference: a model deployment library in Python, billed as the simplest model inference server ever.

ai-serving

145 stars · 31 forks

Serves AI/ML models in the open standard formats PMML and ONNX with both HTTP (REST API) and gRPC endpoints.

k3ai

101 stars · 10 forks

K3ai is a lightweight, fully automated, AI-infrastructure-in-a-box solution that lets anyone experiment quickly with Kubeflow pipelines. K3ai suits anything from edge devices to laptops.

gpu-rest-engine

421 stars · 94 forks

A REST API for Caffe using Docker and Go.

orkhon

145 stars · 5 forks

Orkhon: ML Inference Framework and Server Runtime

fastDeploy

93 stars · 18 forks

Deploy DL/ML inference pipelines with minimal extra code.

Server

61 stars · 12 forks

A standalone inference server for trained Rubix ML estimators.
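Despite their different frameworks, the servers above share one basic contract: accept serialized inputs over HTTP, run a model's predict function, and return the outputs as JSON. A minimal sketch of that contract using only the Python standard library — the `predict` placeholder and the `/predict` route are illustrative assumptions, not taken from any repository listed here:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

def predict(inputs):
    """Placeholder model: double each input value."""
    return [2 * x for x in inputs]

class InferenceHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Only one route is exposed, mirroring the single-model servers above.
        if self.path != "/predict":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        body = json.dumps({"outputs": predict(payload["inputs"])}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging in this example

def serve_in_background():
    """Start the server on a free port; return (server, port)."""
    server = HTTPServer(("127.0.0.1", 0), InferenceHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server, server.server_address[1]

if __name__ == "__main__":
    server, port = serve_in_background()
    req = urllib.request.Request(
        f"http://127.0.0.1:{port}/predict",
        data=json.dumps({"inputs": [1, 2, 3]}).encode(),
        headers={"Content-Type": "application/json"},
    )
    print(json.loads(urllib.request.urlopen(req).read()))
    server.shutdown()
```

The real projects add what this sketch omits: model loading from standard formats, batching, GPU scheduling, and (in ai-serving's case) a parallel gRPC endpoint alongside the REST one.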