
Collection of Inference Latency Related Issues


Purpose

This issue is pinned to collect questions related to model inference latency. Please follow the template below when posting your question here.

Pre-info

MMDeploy supports measuring the inference latency of backend models; please refer to how_to_measure_performance_of_models. If you want to measure latency in your own code, exclude pre- and post-processing and skip the first N inference runs (warm-up) when computing latency/FPS.
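For illustration, here is a minimal timing sketch along those lines. It is not MMDeploy's API: `model` is a hypothetical callable wrapping the backend model, `inputs` is an already-preprocessed batch, and the warm-up and iteration counts are arbitrary defaults.

```python
import time

def measure_latency(model, inputs, warmup=10, iterations=100):
    """Time only the forward pass: pre/post-processing stays outside the loop."""
    # Warm-up runs are discarded so one-time costs (lazy initialization,
    # CUDA context creation, kernel autotuning) do not distort the result.
    for _ in range(warmup):
        model(inputs)

    # For asynchronous GPU backends, synchronize the device before reading
    # the timer (e.g. torch.cuda.synchronize() when using PyTorch on CUDA).
    start = time.perf_counter()
    for _ in range(iterations):
        model(inputs)
    elapsed = time.perf_counter() - start

    latency_ms = elapsed / iterations * 1000.0
    fps = iterations / elapsed
    return latency_ms, fps
```

Only the forward pass sits inside the timed loop; when comparing numbers, keep the device, batch size, and input resolution fixed.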

Template

Issue Description

Please describe the problem here.

Environments

Please run python tools/check_env.py and post the environment info here.

Placeholder

Run Scripts

Please post the script or command you ran to generate the backend model.

Placeholder

Steps to Reproduce

Please describe exactly how you measured the inference latency.

Placeholder

RunningLeon, Jun 22 '22