In some cases it may be desirable to feed out RTSP (Real Time Streaming Protocol, i.e. the protocol used by IP cameras) over Ethernet for compatibility with existing software stacks. This provides near-instant integration, particularly with closed-source (or hard-to-modify) tools which already accept RTSP-compatible inputs.
This becomes even more powerful when the RTSP output is a node in the pipeline: instead of only the whole video feed being output as an RTSP stream, any video stream in the pipeline can be output. Take for example using an object detector to guide digital PTZ (https://github.com/luxonis/depthai/issues/135), and then outputting that pan/tilt/zoomed stream directly over RTSP.
This way, the RTSP output would appear to a computer, smartphone, YouTube, etc. as if someone were actually moving a camera around.
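Concretely, the digital-PTZ step above boils down to picking a crop window around a detection and clamping it to the frame. A minimal sketch (the function name, arguments, and clamping behavior are this example's assumptions, not part of the depthai API):

```python
def ptz_crop(frame_w, frame_h, det_cx, det_cy, zoom):
    """Return (x, y, w, h) of a crop window centered on a detection.

    zoom=1.0 keeps the full frame; zoom=2.0 crops to half width/height.
    The window is clamped so it never leaves the frame bounds.
    """
    w = int(frame_w / zoom)
    h = int(frame_h / zoom)
    x = min(max(int(det_cx - w / 2), 0), frame_w - w)
    y = min(max(int(det_cy - h / 2), 0), frame_h - h)
    return x, y, w, h

# Centered detection on a 1080p frame at 2x zoom:
print(ptz_crop(1920, 1080, 960, 540, 2.0))  # → (480, 270, 960, 540)
```

The cropped window would then be scaled back up to the output resolution before being handed to the encoder, so the viewer just sees a "camera" panning and zooming.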
Move to the how:
Leverage live555 and the Gen2 Pipeline Builder (#136) to implement an RTSP output node over Ethernet to work with POE-capable DepthAI devices like the BW2098POE and other future DepthAI hardware designs (such as OAK-1-POE and OAK-D-POE) based on the Ethernet-capable BW2099 module.
Move to the what:
Implement an RTSP output node in the Gen2 Pipeline Builder (https://github.com/luxonis/depthai/issues/136). This will allow any node which produces video to be output over Ethernet.
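As a rough sketch of how such a node might slot into a Gen2 pipeline (the `createRtspStream` node and its methods are hypothetical and do not exist yet; only the camera and encoder nodes below are real Gen2 API):

```python
import depthai as dai

pipeline = dai.Pipeline()

# Real Gen2 nodes: color camera -> on-device H.264 encoder
cam = pipeline.createColorCamera()
enc = pipeline.createVideoEncoder()
cam.video.link(enc.input)

# Hypothetical RTSP output node, shown only to illustrate the proposal.
# Any node producing an encoded video stream could be linked here instead.
rtsp = pipeline.createRtspStream()               # hypothetical
rtsp.setEndpoint("rtsp://0.0.0.0:8554/preview")  # hypothetical
enc.bitstream.link(rtsp.input)
```

The point of making it a node (rather than streaming the device's main output) is that the link above could just as easily come from a script node, a digital-PTZ crop, or any other stage of the pipeline.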
The ZED RTSP examples might be worth reviewing, even though they work in fundamentally different ways. They set up a meta channel for sending tracking information along with other annotations (like human pose, etc.). Having this in a node, with configurable pass-through for other things happening on the module, would be excellent.
I like the idea. Before this is ready, can you give some guidance on how to feed the VideoEncoder result into an RTSP stream?
I tried to do it with a GStreamer pipeline.
Looking. I think we have an example (https://github.com/luxonis/depthai-experiments/tree/master/rtsp-streaming) but I don't think it's been updated to work with POE yet. So I'm giving it a shot first and will update if it can work with POE.
I confirmed that that example is Gen1, so it would need to be updated. Looking at the WebRTC example (https://github.com/luxonis/depthai-experiments/tree/master/gen2-webrtc-streaming) now.
@Luxonis-Brandon Thanks for the quick actions! I looked at https://github.com/luxonis/depthai-experiments/tree/master/rtsp-streaming before posting here. IIUC, that example gets frames from depthai and uses x264 to encode the video.
I am trying to see whether there is a way to use depthai.VideoEncoder's result, streaming it to RTSP directly, to avoid re-encoding the frames on the host CPU.