RaspberryPi_WebRTC
Achieve lower latency
This is more of a personal requirement. Your work on this is already great, and I'd be even happier if you could help me with this:
I'm planning on using this for a robotics competition, where it serves as a first-person-perspective view for the robot driver. As you can tell, extremely low latency is crucial.
I have this quite basic setup:
Raspberry Pi -> Router via Wi-Fi -> Windows Machine connected to Router via Ethernet.
It also uses WHEP, since the router won't have an internet connection.
However, the current setup introduces noticeable delay, which severely impacts driving ability.
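For context, WHEP signaling is just a local HTTP exchange, so it doesn't strictly need internet access; roughly something like the sketch below, where the endpoint path and port are hypothetical placeholders rather than pi-webrtc's actual URL:

```
# Hypothetical WHEP endpoint -- the path and port are placeholders, not pi-webrtc's real URL.
# The client POSTs its SDP offer and gets the SDP answer back in the response body,
# all over plain HTTP on the LAN, so no internet/STUN is strictly required here.
curl -i -X POST http://192.168.0.100:8080/whep \
     -H "Content-Type: application/sdp" \
     --data-binary @offer.sdp
# The Location header of the 201 response identifies the session (DELETE it to hang up).
```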
I have seen a demo from another project (the video's description has setup details) that uses H264 on the RPi 2/3/Zero; I'm using a Zero 2. The latency in that project is impressive. I'm not sure whether that's because it runs at 640x480 or because of their program optimization, but if the latency could be reduced to that same level, it would be great.
Also, if possible, I'd like to discuss this further with you if you can leave me an email. Thanks a lot.
My email is listed on my GitHub profile if you'd like to discuss in private.
Latency is also something I'm very concerned about. Resolution does matter, but every node in the WebRTC video pipeline can add latency in different ways. The quality of the router can have a significant impact. For example, my WebRTC stream from a Pi 5 using an eero router as a Wi-Fi AP gives me 250ms latency, but it increased to 300~350ms while I was watching a YouTube video in Chrome. I also tested connecting the Pi 5 to my phone via tethering and watched the stream directly on the same phone. Ideally, since the packets don't go through any other routers, I expected the best latency, but instead I got the worst result, ~500ms.
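If you want to see how much the Wi-Fi hop alone contributes, a quick sanity check (nothing project-specific, just plain ping from the viewing machine to the Pi, with the Pi's address as a placeholder below) would be:

```
# Ping the Pi from the client machine over the same path the video takes.
# avg shows the raw round-trip time the link adds; a high mdev means a noisy or
# congested Wi-Fi link, which will also show up as latency spikes in the stream.
ping -c 100 192.168.0.100   # replace with the Pi's actual address
```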
Another key factor is performance on the client side, i.e. whether it supports hardware decoding. I've watched the same video source on both my phone and PC at the same time and still noticed different latencies. As far as I know, latency also differs across browser implementations; for example, Safari is faster than Chrome because it uses a smaller decoder buffer. Also, higher resolutions take the decoder longer to process; I haven't measured how long hardware decoding takes at different resolutions on the Pi Zero 2 W. Anyway, the Pi Zero 2 W I tested here is ~250ms at 720p, for your reference.
In this project, I'm using just two buffers for capturing, which I believe is already the minimum, since using only one buffer can sometimes be unstable. This also means the inherent capture latency at 30 fps is about 66.6 ms (two frames at ~33.3 ms each). Furthermore, the pipeline is designed with DMA zero-copy into the hardware encoder. When you're already minimizing capture buffers and network interference, the next most effective path might be improving hardware performance. So for cases like robotics control, it might be more efficient to focus on streamlining the entire hardware chain than on squeezing tiny gains from encoding settings alone.
Btw, ~130ms with a 240p/60fps camera and the h264 hardware encoder is the lowest latency I've achieved on a Pi 3.
Regarding hardware performance, I haven't managed to evaluate the RPi Zero 2 W's performance when streaming at 720p60, which is what I'm currently using. I'm assuming the hardware encoder would increase efficiency and reduce overheating. But in case the RPi Zero 2 W cannot do what I'm looking for, is this program usable on other SBCs? AFAIK this project uses libcamera 0.5 now, but it seems the libcamera used in RPi OS is a custom build by the Raspberry Pi Foundation.
Also, just asking for your opinion: does using GStreamer improve latency? I see it has tons of parameters I can tweak.
What's the lowest latency you're aiming for?
I just released the latest pi-webrtc v1.1.2 which reduces latency by 10ms and lowers CPU usage by 3% on my pi zero 2w.
The hardware encoder mainly reduces CPU usage and encoding time. The heat is just transferred to the encoder block; it doesn't disappear. I've heard some users build this project themselves to run it on Rockchip boards in software mode. Ubuntu can install libcamera as well, but I guess they access the camera using v4l2.
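For a non-Pi board, a rough fallback sketch would be to capture through v4l2src and encode in software; the device path, bitrate, and destination address below are placeholders you'd need to adapt:

```
# Generic non-Pi sketch: V4L2 capture + software x264 tuned for low latency.
# /dev/video0, the bitrate, and the receiver address are placeholders.
gst-launch-1.0 v4l2src device=/dev/video0 \
  ! video/x-raw,width=1280,height=720,framerate=30/1 \
  ! videoconvert \
  ! x264enc tune=zerolatency speed-preset=ultrafast bitrate=2000 key-int-max=60 \
  ! rtph264pay config-interval=1 pt=96 \
  ! udpsink host=192.168.0.101 port=5000
```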
GStreamer RTP over UDP is definitely faster than WebRTC. If you'd like to watch the stream in a browser, you'll still need to use the GStreamer WebRTC plugin; if not, it's not a big deal.
I can't give an exact number, but it should be as low as game streaming like Parsec or Moonlight + Sunshine can do, or GFN and other game-streaming services.
For me, GStreamer is something I've known about before but never had a chance to use. I'll try to implement it next week.
I'm trying to get GStreamer up, ~~but I'm running into numerous errors, and~~ it would be great if you could share your working setup if you have one. On RPi OS:
```
b4iterdev@s4v-cam1:~ $ gst-launch-1.0 libcamerasrc ! capsfilter caps=video/x-raw,width=1280,height=720,format=YUY2,colorimetry=bt709,interlace-mode=progressive ! v4l2h264enc extra-controls="controls,repeat_sequence_header=1" ! 'video/x-h264,level=(string)4' ! h264parse ! rtph264pay ! udpsink host=192.168.0.101 port=5000
```
On macOS:
```
gst-launch-1.0 -v udpsrc port=5000 ! application/x-rtp, media=video, clock-rate=90000, encoding-name=H264, payload=96 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! osxvideosink sync=false
```
~~The biggest problem is that NOTHING is produced on the output screen, just a green screen and nothing else. I first suspected a firewall problem or something and tried disabling the firewall, but no luck with that.~~ However, the latency is just the same as WebRTC, surprisingly. Also, with a near-perfect setup (a dedicated Wi-Fi router and not too many Wi-Fi APs in the environment), the latency I achieved is as low as 200ms at 720p60, as seen in this blurry yet worthless image:
Moreover, I've seen that GStreamer has sync=false, which AFAIK stops syncing image data to the clock and reduces latency at the cost of image tearing. Does this happen with WebRTC, or is it handled by the browser?
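In case it helps narrow things down, I might also try GStreamer's built-in latency tracer on the sender side (same pipeline as above, just with the tracing environment variables added; the log path is arbitrary):

```
# GStreamer's latency tracer logs how long buffers take from libcamerasrc to udpsink,
# i.e. capture + encode + payloading on the Pi. Network transit and receiver-side
# decoding are NOT included in these numbers.
GST_TRACERS=latency GST_DEBUG="GST_TRACER:7" GST_DEBUG_FILE=/tmp/latency.log \
gst-launch-1.0 libcamerasrc \
  ! capsfilter caps=video/x-raw,width=1280,height=720,format=YUY2,colorimetry=bt709,interlace-mode=progressive \
  ! v4l2h264enc extra-controls="controls,repeat_sequence_header=1" \
  ! 'video/x-h264,level=(string)4' ! h264parse ! rtph264pay \
  ! udpsink host=192.168.0.101 port=5000
# Then inspect the reported times: grep GST_TRACER /tmp/latency.log
```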
WebRTC has an FEC mechanism to recover missing packets, so you usually won't see torn frames in the stream. But it also means the latency can fluctuate: WebRTC might lower the frame rate to compensate, and low fps leads to high latency, so it's a trade-off.
If low latency is crucial for you, you could also check out analog FPV modules from the drone world. They're used in drones and have really low latency, around 50 to 100 ms, by transmitting an uncompressed image. In comparison, DJI's OcuSync digital module delivers encoded streams with latency around 100 to 150 ms. But these all work over RF, not IP networks.
I just recently found OpenIPC and wfb-ng. They look really promising since they transmit over raw Wi-Fi only.
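If I end up going that route, my understanding is that the transmit side just needs a Wi-Fi adapter that supports monitor mode; a rough sketch of that prerequisite (the interface name and channel are placeholders, and the adapter has to support monitor mode and injection):

```
# Rough sketch of the raw-Wi-Fi prerequisite wfb-ng relies on: the adapter is put
# into monitor mode so frames are injected/captured without associating to an AP.
# wlan1 and the channel are placeholders.
sudo ip link set wlan1 down
sudo iw dev wlan1 set type monitor
sudo ip link set wlan1 up
sudo iw dev wlan1 set channel 149
```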