
Low latency with Mono Camera

Open fboris opened this issue 2 years ago • 5 comments

Referring to this document (https://docs.luxonis.com/projects/api/en/latest/tutorials/low-latency/), a mono camera is supposed to deliver images at up to 120 FPS with only a 24.5 ms delay. But I tested with different combinations: 1280x800@60FPS, 1280x800@30FPS, 640x400@60FPS, 640x400@120FPS, and the delay is around 100~130 ms, which is quite large. The method I used was to point the camera at a "real" chronometer (which differs from the method in the document). Are there any possible ways to reduce it?

fboris avatar May 07 '22 10:05 fboris

Hi @fboris, I'll let @Erol444 comment on these. @Erol444, please also update the docs with the scripts used to achieve these results.

themarpe avatar May 07 '22 12:05 themarpe

How do I test latency with the C++ SDK? Is the following snippet correct?

auto mono = device.getOutputQueue(ev)->get<dai::ImgFrame>();
int64_t time_diff = std::chrono::duration_cast<std::chrono::microseconds>(
    mono->getTimestamp() - std::chrono::steady_clock::now()).count();

fboris avatar May 09 '22 06:05 fboris

@fboris

Looks good, with the exception of swapping getTimestamp and steady_clock::now:

auto mono = device.getOutputQueue(ev)->get<dai::ImgFrame>();
int64_t time_diff = std::chrono::duration_cast<std::chrono::microseconds>(
    std::chrono::steady_clock::now() - mono->getTimestamp()).count();

The above will give you the latency as a positive integer in microseconds (as precise as the time sync between host and device; USB devices sync better than PoE devices in this regard).
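
For a complete, runnable example, here is a minimal C++ sketch of that measurement loop against a MonoCamera. The stream name "mono", the RIGHT socket, 400P@60FPS, and the queue settings are illustrative choices, not required values:

#include <chrono>
#include <iostream>
#include <depthai/depthai.hpp>

int main() {
    dai::Pipeline pipeline;

    // Mono camera feeding an XLinkOut stream back to the host
    auto mono = pipeline.create<dai::node::MonoCamera>();
    mono->setBoardSocket(dai::CameraBoardSocket::RIGHT);
    mono->setResolution(dai::MonoCameraProperties::SensorResolution::THE_400_P);
    mono->setFps(60);

    auto xout = pipeline.create<dai::node::XLinkOut>();
    xout->setStreamName("mono");
    mono->out.link(xout->input);

    dai::Device device(pipeline);
    auto queue = device.getOutputQueue("mono", 4, false);

    while(true) {
        auto frame = queue->get<dai::ImgFrame>();
        // Frame timestamps are synced to the host clock, so subtracting
        // them from steady_clock::now() approximates end-to-end latency
        auto latencyUs = std::chrono::duration_cast<std::chrono::microseconds>(
            std::chrono::steady_clock::now() - frame->getTimestamp()).count();
        std::cout << "Latency: " << latencyUs / 1000.0 << " ms" << std::endl;
    }
}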

themarpe avatar May 09 '22 06:05 themarpe

I can confirm the latency is 30 ms at 1280x800@30fps (two OV9282).

fboris avatar May 10 '22 02:05 fboris

Hi, I have used code similar to the snippet below to estimate the average latency. Let me know if you have any questions.

import depthai as dai
from depthai_sdk import FPSHandler as FPS

# Create pipeline
pipeline = dai.Pipeline()
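# Chunk size 0 disables XLink chunking, which reduces transfer latency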
pipeline.setXLinkChunkSize(0)

# Define source and output
camRgb = pipeline.create(dai.node.ColorCamera)
camRgb.setFps(30)
camRgb.setResolution(dai.ColorCameraProperties.SensorResolution.THE_1080_P)
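# Scale the 1080P ISP output by 2/3, i.e. to 1280x720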
camRgb.setIspScale(2,3)

# mono = pipeline.create(dai.node.MonoCamera)
# mono.setFps(60)
# mono.setBoardSocket(dai.CameraBoardSocket.RIGHT)
# mono.setResolution(dai.MonoCameraProperties.SensorResolution.THE_800_P)

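# Encode frames to MJPEG on-device; smaller frames transfer faster over XLink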
videoEnc = pipeline.create(dai.node.VideoEncoder)
videoEnc.setDefaultProfilePreset(30, dai.VideoEncoderProperties.Profile.MJPEG)
camRgb.video.link(videoEnc.input)

xoutVideo = pipeline.create(dai.node.XLinkOut)
xoutVideo.setStreamName("video")
xoutVideo.input.setBlocking(False)
xoutVideo.input.setQueueSize(10)
# camRgb.isp.link(xoutVideo.input)
videoEnc.bitstream.link(xoutVideo.input)

# Connect to device and start pipeline
with dai.Device(pipeline) as device:
    print(device.getUsbSpeed())
    fps = FPS()
    video = device.getOutputQueue(name="video", maxSize=10, blocking=False)

    while True:
        videoIn = video.get()
        diff = dai.Clock.now() - videoIn.getTimestamp()
        print('diff', diff)
        fps.nextIter()
        print(fps.fps())

Thanks, Erik

Erol444 avatar May 12 '22 19:05 Erol444