depthai-experiments
Unstable RTSP streaming
The gen2-rtsp-streaming demo produces a lot of garbage in the result stream, even at the lowest supported resolution (1080p) and a 1 fps rate (see attached screenshot).
Does this mean we can't use the raw camera stream when needed? I fully agree that it is much better to process the stream on-device and deliver only the final results to the network, but there are real-life use cases where access to the raw video is required, and I'm wondering if it is doable.
I recorded a 10-second fragment of the stream by running the following command on the same host where the demo (with the default settings: 1080p @ 10 fps) was running:
ffmpeg -i rtsp://localhost:8554/preview -acodec copy -vcodec copy 10fps.mp4 2> 10fps_stderr.txt
The stderr from ffmpeg along with the result mp4 are also attached.
Sorry about the trouble @IgorPrilepov . Asking the team here. I don't think I immediately understand your question/need/use case at first, but it may just be my ignorance - so I'm asking the team, as there is a high probability the team will understand even if I don't.
Really appreciate it! The goal is to get the stream at 4K @ 30 fps over an IP link, and I hoped to use the existing RTSP demo for that.
Thanks. Are you using a PoE or USB model @IgorPrilepov ?
And as a heads up most of the team on this are in Europe - so most likely the answers here will come tomorrow.
The logs and the screenshot are from the PoE model, but I started with the USB model and observed the same behavior. Thanks.
Got it. So one thing I don't know is if encoding is being done on the camera in these examples or on the host. As the camera is capable of encoding, but in some cases it is desired by the end-customer for this not to be done.
And if the frames are not being encoded on-camera, then there is no way for the unencoded frames to make it over Ethernet at any reasonable framerate, as unencoded 4K is about 373 MB/s, or about 4x what Ethernet could theoretically handle (at least).
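A quick back-of-the-envelope check of these numbers (a sketch; it assumes NV12/YUV420 frames at 12 bits per pixel, a common raw format for color cameras):

```python
# Estimate raw (unencoded) 4K video bandwidth, assuming NV12 (YUV420)
# frames at 12 bits (1.5 bytes) per pixel.
width, height, fps = 3840, 2160, 30
bytes_per_frame = width * height * 1.5   # 12 bpp for YUV420
bytes_per_sec = bytes_per_frame * fps    # bytes per second of raw video
bits_per_sec = bytes_per_sec * 8

print(f"{bytes_per_sec / 1e6:.0f} MB/s")   # 373 MB/s
print(f"{bits_per_sec / 1e9:.1f} Gbps")    # 3.0 Gbps
# For comparison: gigabit Ethernet is 1 Gbps, USB2 is 0.48 Gbps,
# and a 100 Mbps PoE link is 0.1 Gbps -- all far too slow for raw 4K30.
```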
And similarly for USB: if the camera is not doing on-device encoding, it may be worth checking that a USB3-capable (5 Gbps or higher) cable is being used and that the device shows up as USB3 when running. USB2 is actually even slower than Ethernet, so it would give an even lower framerate - a framerate which ffmpeg etc. might not handle by default (I'm not sure), or which is variable enough that it is throwing ffmpeg off (also not sure - just guessing).
@Erol444 - did we get live documentation on how to physically tell if a USB cable has USB3 capability in the A-side of the port? That would be good to get live along with instructions to see programmatically (or from terminal prints) at what USB speed the DepthAI device is connected. Either way, please share here when you're online.
I see, so this Ethernet is only 100 Mbps. That definitely explains the issue. Will try to play with a USB-C cable tomorrow. Thank you very much for the quick response!
@Luxonis-Brandon, I'm not in the office today but will test the USB3 topic on Thursday. Do you have any recommendation on how to enable encoding on the camera side?
Thanks and yes see here @IgorPrilepov for on-device encoding: https://docs.luxonis.com/projects/api/en/latest/samples/VideoEncoder/rgb_encoding/
I don't know enough about ffmpeg/GStreamer/etc. to say how to re-stream these from the host. But this shows how to do encoding on the camera and then just save to disk on the host.
We can probably write an example to re-stream this over the network/etc. if that's of interest.
We also have this example app which streams MJPEG directly from the camera to any web browser: https://photos.app.goo.gl/s5tJh1V56qZGq8vNA https://github.com/luxonis/depthai-poe-webapp
Thoughts?
Thanks, Brandon
It looks pretty promising. Let me look into all these examples and get back to you. Thanks, Igor
Thanks and sounds good! We can also implement an RTSP node directly in the PoE cameras as well. Shoot me an email if that becomes of interest.
Here is an update: I was able to record video using the H.265 encoder (the first link in your message) and then play it back without any defects, which confirms that an encoded 4K/30fps stream should fit into the 100 Mbps link. However, looking into the RTSP demo, I found that it also uses the same encoder settings :(
It looks like the open-source RTSP server itself is the bottleneck, so if you could add it into the PoE camera, that should work. But let me discuss this and other options internally and get back to you. Thanks, Igor
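For reference, the arithmetic behind "encoded 4K/30fps should fit into the 100 Mbps link" (a sketch; the 8.5 Mbps figure is an assumed illustrative H.265 bitrate, not a measured or default value - the actual rate depends on the VideoEncoder bitrate/quality settings):

```python
# Compare an illustrative encoded H.265 bitrate against the link capacity.
encoded_mbps = 8.5   # assumed example bitrate for H.265 4K30, not a default
link_mbps = 100      # the 100 Mbps PoE link from this thread

headroom = link_mbps / encoded_mbps
print(encoded_mbps < link_mbps)   # True -- fits comfortably
print(f"~{headroom:.0f}x headroom on the link")
# This is why the encoded recording played back cleanly while the
# raw stream (hundreds of MB/s, see above) could not survive the link.
```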
Thanks @IgorPrilepov ! Sounds good and thanks for the details here.
Are there any updates on this issue? I am facing the same problem when trying to stream using RTSP with the OAK-D PoE.
Any updates on this issue? I'm just trying to find a minimal way to stream H.265 video from my OAK-D PoE, and I haven't found anything that works out of the box.
Are there any updates on this? I'm facing similar issues. Thanks!
Hi @ddreise , some frames are likely being dropped, either on the device side, on the host side, or during transfer from device to host (e.g. a network issue if using a PoE device). It's best to first find out where these frames are being dropped. You could use pipeline_graph to check the FPS of the pipeline, and on the host side you could check the sequence numbers of frames: if they always increment by 1, that's OK; if sequence numbers are sometimes missing, that means some frames are dropping, producing these strange visual artifacts. Thanks, Erik
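The host-side sequence-number check above can be sketched like this (a minimal sketch; `find_gaps` is a hypothetical helper name, and in a real pipeline the numbers would come from each frame's `getSequenceNum()`):

```python
def find_gaps(seq_nums):
    """Return (expected, got) pairs wherever consecutive sequence
    numbers do not increment by exactly 1, i.e. dropped frames."""
    gaps = []
    for prev, cur in zip(seq_nums, seq_nums[1:]):
        if cur != prev + 1:
            gaps.append((prev + 1, cur))
    return gaps

# No drops: sequence numbers increment by 1 on every frame
print(find_gaps([10, 11, 12, 13]))   # []
# Frames 13 and 14 were dropped somewhere between device and host
print(find_gaps([11, 12, 15, 16]))   # [(13, 15)]
```

An empty result means the stream reached the host intact, so any remaining artifacts would come from the RTSP/decoding side rather than from dropped frames.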
Hi @Luxonis-Brandon, I am facing a similar problem and would appreciate your help.