
InsightFace REST API for easy deployment of face recognition services with TensorRT in Docker.

Results: 43 InsightFace-REST issues, sorted by recently updated

Hi @SthPhoenix, do you have any benchmark results for demo_client on an RTX 2080 or similar? What hardware config would it take to reach 1000 photos per second? Best
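For sizing questions like this, a rough back-of-envelope helps: divide the target throughput by an assumed per-GPU rate with some headroom. The per-GPU number below is a placeholder assumption, not a measured InsightFace-REST benchmark:

```python
# Rough hardware-sizing sketch for a face-recognition pipeline.
# per_gpu_fps is a hypothetical figure, NOT a measured RTX 2080 result.
import math

def gpus_needed(target_fps: float, per_gpu_fps: float, headroom: float = 0.7) -> int:
    """Number of GPUs to sustain target_fps, keeping each GPU
    at `headroom` fraction of its assumed peak throughput."""
    usable = per_gpu_fps * headroom
    return math.ceil(target_fps / usable)

# e.g. if one GPU were assumed to handle ~400 photos/s end to end:
print(gpus_needed(1000, 400))  # -> 4
```

Real sizing would of course start from a measured demo_client benchmark on the actual card, since detection resolution and batch size dominate throughput.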

What are the best detection (for both accuracy and landmark quality) and recognition models if speed is no concern? They could run slower but produce the highest quality...

I used the master branch for testing; neither the TRT nor the CPU version works. After the Docker image boots up, I cannot access http://localhost:18081. This is the test environment: * Ubuntu 20.04 * NVIDIA-SMI...

Hi, thanks for sharing this project. I downloaded the SCRFD model from [link](https://drive.google.com/file/d/1v9nhtPWMLSedueeL6c3nJEoIFlSNSCvh/view) and tried to convert it to a TensorRT model with the /src/converters/modules/converters/onnx_to_trt.py script. I use a custom convert.py script for...

Hi, I can't seem to get ONNX SCRFD inference working with batch sizes greater than 1. It fails during the prepare phase. Here is the code I'm running: ```model = onnx.load(onnx_path)...
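When an exported ONNX model only accepts the batch size it was exported with, one common workaround is to chunk larger batches down to that size and concatenate the outputs. A minimal sketch (the `fake_model` callable and the input name `input.1` are stand-ins for a real onnxruntime session on the SCRFD model):

```python
# Sketch: run a fixed-batch-size ONNX session over an arbitrary batch
# by splitting along axis 0. The model callable here is a toy stand-in.
import numpy as np

def run_in_chunks(run_fn, batch: np.ndarray, chunk: int = 1) -> np.ndarray:
    """Split `batch` along axis 0 into `chunk`-sized pieces,
    run each through `run_fn`, and concatenate the results."""
    outs = [run_fn(batch[i:i + chunk]) for i in range(0, len(batch), chunk)]
    return np.concatenate(outs, axis=0)

# Toy stand-in for session.run(); with onnxruntime this would be
# something like: lambda x: session.run(None, {"input.1": x})[0]
fake_model = lambda x: x.mean(axis=(2, 3))          # (n, 3, 640, 640) -> (n, 3)
imgs = np.zeros((5, 3, 640, 640), dtype=np.float32)
print(run_in_chunks(fake_model, imgs).shape)        # (5, 3)
```

The cleaner fix is usually to re-export the model with `dynamic_axes` in `torch.onnx.export` so the batch dimension becomes symbolic, but chunking works against an already-exported model.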

ERROR: Could not build wheels for onnx, which is required to install pyproject.toml-based projects. Can you please help me fix this issue?

Were you able to run mxnet models with Triton Inference Server?

Hi, I am trying to convert SCRFD from PyTorch to TensorRT and run inference, using this repo as a reference. I am looking at https://github.com/SthPhoenix/InsightFace-REST/issues/37 as a similar,...