
[Performance]: inference takes too long on simple tasks

xueyingxin opened this issue 1 year ago

OpenVINO Version

2021.2.1.0

Operating System

Windows System

Device used for inference

CPU

OpenVINO installation

Build from source

Programming Language

C++

Hardware Architecture

x86 (64 bits)

Model used

ssd

Model quantization

Yes

Target Platform

No response

Performance issue description

Normally a face detection task takes about 1 ms on GPU and <5 ms on CPU. But sometimes the inference time increases to >100 ms on CPU. The measured time includes only the API call InferRequest.infer(). Is this a known issue?
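For reference, a minimal timing sketch using the OpenVINO 2.0 C++ API (2022.1 and later). The model path, device, and iteration count are placeholders, not taken from this report; it measures only the infer() call, with a warm-up run so one-time costs are excluded:

```cpp
#include <chrono>
#include <iostream>
#include <openvino/openvino.hpp>

int main() {
    ov::Core core;
    // Placeholder model path for the quantized SSD face-detection model.
    ov::CompiledModel compiled = core.compile_model("face-detection.xml", "CPU");
    ov::InferRequest request = compiled.create_infer_request();

    // Warm-up run so one-time costs (weight loading, kernel compilation,
    // cache warm-up) are not counted against steady-state latency.
    request.infer();

    for (int i = 0; i < 100; ++i) {
        auto start = std::chrono::steady_clock::now();
        request.infer();  // only the infer() call is timed, as in the report
        auto end = std::chrono::steady_clock::now();
        double ms = std::chrono::duration<double, std::milli>(end - start).count();
        std::cout << "iteration " << i << ": " << ms << " ms\n";
    }
    return 0;
}
```

Logging per-iteration latency like this makes it easier to see whether the >100 ms spikes are isolated outliers (e.g. OS scheduling or thermal throttling) or a sustained slowdown.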

Step-by-step reproduction

No response

Issue submission checklist

  • [X] I'm reporting a performance issue. It's not a question.
  • [X] I checked the problem with the documentation, FAQ, open issues, Stack Overflow, etc., and have not found a solution.
  • [X] There is reproducer code and related data files such as images, videos, models, etc.

xueyingxin · Aug 28 '24 07:08

Hi, thanks for reaching out. Do you see the same issue in a recent version of OpenVINO, such as 2024.3? If so, please share some key details about the model used, the OS, and the CPU platform. Thanks!

wenjiew · Aug 30 '24 07:08

This issue will be closed in a week because it has had no activity for 9 months.

github-actions[bot] · Jun 04 '25 00:06

This issue was closed because it has been stalled for 9 months with no activity.

github-actions[bot] · Jun 12 '25 00:06