
Gaze Estimation Demo Latency?

Open raymondlo84 opened this issue 3 years ago • 10 comments

Is there a reason why I'm getting a ~5 second delay/latency on the video feed with that demo? Is that a macOS-specific issue, or is it the pipeline (did it get too deep)? Thank you.

raymondlo84 avatar Jan 25 '21 19:01 raymondlo84

Hi Raymond, it's not OS-specific. The current implementation lacks NN result synchronization; we'll be working more on it so that gets implemented.

VanDavv avatar Jan 25 '21 20:01 VanDavv

Is that a problem across the entire Gen2 pipeline? I mean, do all the demos have the same latency-related issues?

raymondlo84 avatar Jan 25 '21 21:01 raymondlo84

No, there are ways to sync the NN results; here is an example - https://github.com/luxonis/depthai-experiments/tree/master/gen2-nn-sync
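
For reference, a minimal host-side sketch of one of the approaches shown in that example - matching frames to NN results by sequence number. The queue names and the surrounding pipeline are assumptions for illustration, not copied from the demo:

```python
import depthai as dai

# `pipeline` is assumed to be a Gen2 pipeline that streams camera frames
# on an XLinkOut named "rgb" and NN results on an XLinkOut named "nn".
with dai.Device(pipeline) as device:
    q_rgb = device.getOutputQueue("rgb", maxSize=4, blocking=False)
    q_nn = device.getOutputQueue("nn", maxSize=4, blocking=False)

    frames = {}   # sequence number -> ImgFrame
    results = {}  # sequence number -> NNData

    while True:
        for msg in q_rgb.tryGetAll():
            frames[msg.getSequenceNum()] = msg
        for msg in q_nn.tryGetAll():
            results[msg.getSequenceNum()] = msg

        # Only display frame/result pairs whose sequence numbers match
        for seq in sorted(set(frames) & set(results)):
            frame, result = frames.pop(seq), results.pop(seq)
            # ... decode `result` and draw it on frame.getCvFrame() ...
```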

VanDavv avatar Jan 25 '21 21:01 VanDavv

We're also discussing this latency with the team.

Luxonis-Brandon avatar Jan 26 '21 04:01 Luxonis-Brandon

The latency is great with the nn-sync demo! Let me learn from it and see if I can apply that to the others.

raymondlo84 avatar Jan 28 '21 20:01 raymondlo84

Thanks @raymondlo84 for taking this and applying it to the others! Much appreciated. (As background, we implemented the demos as we were refining the Gen2 system, so you are seeing it exactly right - we need to go through and update the others to use the improvements and techniques learned from the initial implementations, whether those are improvements to the API, to how it is used, or both.)

Luxonis-Brandon avatar Jan 28 '21 22:01 Luxonis-Brandon

https://github.com/luxonis/depthai-experiments/pull/51 addresses this issue. I will close it once the PR is merged.

raymondlo84 avatar Jan 29 '21 20:01 raymondlo84

> Thanks @raymondlo84 for taking this and applying it to the others! Much appreciated. (As background, we implemented the demos as we were refining the Gen2 system, so you are seeing it exactly right - we need to go through and update the others to use the improvements and techniques learned from the initial implementations, whether those are improvements to the API, to how it is used, or both.)

Glad to be able to help.

> No, there are ways to sync the NN results; here is an example - https://github.com/luxonis/depthai-experiments/tree/master/gen2-nn-sync

What are the advantages and disadvantages of the two methods proposed in that demo?

raymondlo84 avatar Jan 29 '21 20:01 raymondlo84

I'm actually not sure. @themarpe - could you comment?

Luxonis-Brandon avatar Jan 30 '21 00:01 Luxonis-Brandon

@VanDavv can describe the two implementations used there in more detail. Syncing is usually required when we want to display NN results on a continuous stream that runs faster than the inference. The added latency to match the two up is at best the inference time of one frame. In Gen2 one can "desync" a slow consumer by setting its input to be non-blocking. In that case the NeuralNetwork.passthrough output can be used to determine which input frame the inference was made on.
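
For illustration, a rough sketch of that non-blocking + passthrough setup in the Gen2 Python API (node names and the blob path are placeholders, not taken from the gaze demo):

```python
import depthai as dai

pipeline = dai.Pipeline()

cam = pipeline.createColorCamera()
cam.setPreviewSize(300, 300)
cam.setInterleaved(False)

nn = pipeline.createNeuralNetwork()
nn.setBlobPath("model.blob")  # placeholder path
# "Desync" the slow consumer: a non-blocking input with queue size 1
# means the NN always grabs the newest frame instead of queueing old ones
nn.input.setBlocking(False)
nn.input.setQueueSize(1)
cam.preview.link(nn.input)

# passthrough re-emits the exact frame the inference ran on,
# so the host can pair it with the corresponding NN result
xout_pass = pipeline.createXLinkOut()
xout_pass.setStreamName("passthrough")
nn.passthrough.link(xout_pass.input)

xout_nn = pipeline.createXLinkOut()
xout_nn.setStreamName("nn")
nn.out.link(xout_nn.input)
```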

The latency issue can also happen if the ColorCamera is connected to a slow consumer in a blocking manner. That is something we have to investigate further, so that the most recent frames are sent out instead of old ones that got queued up.
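
In the meantime, one hedged workaround sketch (continuing the pipeline above; the "rgb" stream name is illustrative) is to make both the consumer's input and the host-side queue non-blocking with a size of 1, so only the newest frame is kept:

```python
# Keep only the newest frame on the device side...
xout_rgb = pipeline.createXLinkOut()
xout_rgb.setStreamName("rgb")
xout_rgb.input.setBlocking(False)
xout_rgb.input.setQueueSize(1)
cam.preview.link(xout_rgb.input)

# ...and on the host side, so stale frames are dropped instead of queued up
with dai.Device(pipeline) as device:
    q_rgb = device.getOutputQueue(name="rgb", maxSize=1, blocking=False)
    frame = q_rgb.get()  # the most recent frame available
```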

themarpe avatar Feb 01 '21 13:02 themarpe