openvino-inference-engine topic
openvino-inference-engine repositories
yolov5_export_cpu (35 stars, 9 forks)
Exporting YOLOv5 for CPU inference with ONNX and OpenVINO
embeddedllm (43 stars, 1 fork, 43 watchers)
EmbeddedLLM: API server for Embedded Device Deployment. Currently supports CUDA/OpenVINO/IpexLLM/DirectML/CPU.