
Possible ffmpeg optimizations

Open shnhrrsn opened this issue 6 years ago • 13 comments

The way the ffmpeg command is currently set up, it's transcoding the video due to the vcodec/vf params:

let ffmpegCommand = this.ffmpegSource + ' -threads 0 -vcodec '+vcodec+' -an -pix_fmt yuv420p -r '+
fps +' -f rawvideo -tune zerolatency -vf scale='+ width +':'+ height +' -b:v '+ bitrate +'k -bufsize '+
 bitrate +'k -payload_type 99 -ssrc '+ videoSsrc +' -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params '+
 videoKey.toString('base64')+' srtp://'+targetAddress+':'+targetVideoPort+'?rtcpport='+targetVideoPort+
 '&localrtcpport='+targetVideoPort+'&pkt_size=1378';

Since the UniFi cams already broadcast in h264/aac, the only real reason to transcode is if there's a need for the scaling/bitrate adjustments, and I'm not sure there is. I've dropped the extra params as follows and haven't experienced any negative side effects:

let ffmpegCommand = this.ffmpegSource + ' -y -threads 0 -vcodec copy -an -f rawvideo -tune zerolatency -bufsize '+
 bitrate +'k -payload_type 99 -ssrc '+ videoSsrc +' -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params '+
 videoKey.toString('base64')+' srtp://'+targetAddress+':'+targetVideoPort+'?rtcpport='+targetVideoPort+
 '&localrtcpport='+targetVideoPort+'&pkt_size=1378';

Using the above args, I've been able to reduce connect time/latency to a little less than 5s (previously on my hardware it was anywhere from 20-30s), and obviously the CPU usage dropped substantially since it's essentially just remuxing now (copying the streams into a new container instead of re-encoding them).
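In other words, transcoding is only justified when the requested parameters can't be satisfied by the native stream. Something like this (hypothetical sketch, not the plugin's actual logic):

```javascript
// Hypothetical decision helper: stream-copy whenever the camera's native
// stream already satisfies what HomeKit is asking for, and only fall back
// to transcoding when it doesn't. Field names here are illustrative.
function shouldTranscode(native, requested) {
  return requested.width < native.width
      || requested.height < native.height
      || requested.bitrate < native.bitrate;
}
```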

I have a fair amount of experience with ffmpeg (unfortunately not when it comes to streams, so a bit of research for me to do) and I’m working to get latency down to ~1s so it’s on par with the UniFi Video mobile app.

Before I go too far down this rabbit hole, are there any unintended drawbacks of doing this that I've overlooked? If not, I'll open a PR and share whatever progress I make there.

shnhrrsn avatar Jan 14 '19 01:01 shnhrrsn

I've been testing this for a while, and it works pretty well with my UVC and G3 cameras and all of my current iOS devices. I don't have any older or newer cameras to test, or any iOS devices that for some reason might not support the non-transcoded video.

Still smooth sailing on your end as well? Has anyone else tried this?

On my system it cuts video start delay from about 10 seconds to about 5. Seems worth it.

jmbwell avatar Feb 20 '19 16:02 jmbwell

Yeah I haven’t run into any issues, testing on an iPhone XS, iPhone 8 and macOS Mojave.

I've been particularly interested in testing remote streaming, since I figure that's where the bitrate stuff comes into play. My HomePod is my hub for relaying, and I haven't run into quality issues except when cell service is very weak, where I don't think the bitrate would matter anyway.

shnhrrsn avatar Feb 20 '19 21:02 shnhrrsn

It's working really well here too.

I'm on the opposite side of @shnhrrsn though, as I'm trying to maximize quality, which simply remuxing the video helps achieve.

It also saves about 30 W every time I stream over HomeKit.

Botts85 avatar Feb 25 '19 21:02 Botts85

@shnhrrsn I've been testing this out and it works great. I just run the Homebridge server on my iMac and streams generally start within 3-5 seconds. < 1s would be incredible, if possible!

rdougan avatar Mar 13 '19 06:03 rdougan

I should add that I have 3 G3 Flex and 1 G3 Micro cameras.

rdougan avatar Mar 13 '19 06:03 rdougan

Glad to hear this has been working out — I've opened a draft PR in #37 to start tracking this work.

Going to try to find some time this weekend to see if I can get it any quicker.

shnhrrsn avatar Mar 13 '19 23:03 shnhrrsn

I have very little experience with FFMpeg but one thought...

I've never been happy with a) how long it takes to open the Unifi app and view a camera and b) how slow the FFMpeg conversion is with HomeKit (although this helps significantly). I ended up making iOS, macOS and tvOS apps that use each camera's /snapshot.jpeg endpoint and simply reload it once a second.

Would it be possible to use FFMpeg to make a live stream from the snapshots, rather than using the RTSP live stream? Obviously, there would be no audio - but it would, in theory, be instant.
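A rough sketch of what I mean (hypothetical, not tested against a real camera): poll the snapshot endpoint once a second and hand each JPEG to a sink, which could be ffmpeg's stdin. fetchSnapshot, the sink, and the interval are placeholders for whatever the plugin would wire up.

```javascript
// Hypothetical snapshot poller: fetch a JPEG every intervalMs and push it to
// a sink (e.g. an ffmpeg process reading piped images). Returns a function
// that stops the polling loop.
function pollSnapshots(fetchSnapshot, sink, intervalMs = 1000) {
  let stopped = false;
  (async function loop() {
    while (!stopped) {
      sink(await fetchSnapshot()); // one JPEG "frame" per tick
      await new Promise(resolve => setTimeout(resolve, intervalMs));
    }
  })();
  return () => { stopped = true; };
}
```

ffmpeg could then read the piped JPEGs with something like the image2pipe demuxer, though I haven't verified the exact flags for that.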

rdougan avatar Mar 13 '19 23:03 rdougan

It’s definitely possible but I’d be curious to know how much faster it actually is. I suspect more time is being spent launching ffmpeg than connecting to the RTSP feed so that overhead would still exist.

shnhrrsn avatar Mar 14 '19 03:03 shnhrrsn

I did a little testing and you are right: the initial launch of FFmpeg is the key problem with speed.

Perhaps it would be possible to just have FFmpeg constantly running in the background (where feasible)...

rdougan avatar Mar 18 '19 18:03 rdougan

Just added this to my setup. My Homebridge is running on an rPi 2 with 4 cams and this has helped tremendously. CPU usage is down 15-20% when viewing a stream compared to the out-of-the-box install of the ufv.js file.

ukypayne avatar Jun 07 '19 16:06 ukypayne

$ ffmpeg -rtsp_transport http -re -i rtsp://192.168.1.163:7447/5ce8a70ecf0416ebdcd3743e_0?apiKey= -threads 0 -vcodec libx264 -an -pix_fmt yuv420p -r 30 -f rawvideo -tune zerolatency -vf scale=1280:720 -b:v 299k -bufsize 299k -payload_type 99 -ssrc 2206678 -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params 6BLOAyn9DFy9fWjNW9KIV3I7aZM3xCAlwypavhBk srtp://192.168.1.214:60489?rtcpport=60489&localrtcpport=60489&pkt_size=1378
[1] 5703
[2] 5704
matt@homebridge:~$ ffmpeg version git-2019-07-18-ab4795a Copyright (c) 2000-2019 the FFmpeg developers
  built with gcc 5.4.0 (Ubuntu 5.4.0-6ubuntu1~16.04.11) 20160609
  configuration: --prefix=/home/matt/ffmpeg_build --extra-cflags=-I/home/matt/ffmpeg_build/include --extra-ldflags=-L/home/matt/ffmpeg_build/lib --bindir=/home/matt/bin --extra-libs=-ldl --enable-gpl --enable-libass --enable-libfdk-aac --enable-libmp3lame --enable-nonfree
  libavutil      56. 30.100 / 56. 30.100
  libavcodec     58. 53.101 / 58. 53.101
  libavformat    58. 28.102 / 58. 28.102
  libavdevice    58.  7.100 / 58.  7.100
  libavfilter     7. 56.101 /  7. 56.101
  libswscale      5.  4.101 /  5.  4.101
  libswresample   3.  4.100 /  3.  4.100
  libpostproc    55.  4.100 / 55.  4.100
Unrecognized option 'tune'.
Error splitting the argument list: Option not found

It looks like the -tune parameter is private to specific ffmpeg encoders (like libx264), and since we're using stream copy rather than an encoder, my version of ffmpeg will not allow -tune zerolatency. I'll open a PR with this change.
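For what it's worth, a hypothetical way to build the args so encoder-private options like -tune are only emitted when we're actually encoding (names here are illustrative, not the plugin's real variables):

```javascript
// Hypothetical arg builder: -tune and -pix_fmt only make sense when an
// encoder such as libx264 is selected, so omit them under stream copy.
function videoArgs(vcodec) {
  const args = ['-vcodec', vcodec, '-an'];
  if (vcodec !== 'copy') {
    args.push('-pix_fmt', 'yuv420p', '-tune', 'zerolatency');
  }
  return args;
}
```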

mdshw5 avatar Jul 22 '19 18:07 mdshw5

Nevermind. I installed the wrong branch from @shnhrrsn's PR. I'll give the "transmuxing" branch a shot and report back.

mdshw5 avatar Jul 22 '19 18:07 mdshw5

I just wanted to say thanks for this optimisation. I'm running this plugin on an rPi 4 that hosts multiple virtual machines thanks to ESXi on Arm. Without the optimisation I was at almost 100% for a single stream with 2 CPU cores assigned. After changing the code I dropped to a max of 3%, which is incredible, especially because image quality went up too.

extric99 avatar Feb 16 '21 11:02 extric99