homebridge-camera-ffmpeg
Hardware Acceleration Using VAAPI (Intel GPUs)
Thanks to @mickgiles for the inspiration in #508, I wanted to share what worked for me in newer versions (v3) of homebridge-camera-ffmpeg. If running in Docker, remember to pass through your /dev/dri:/dev/dri devices.
Obviously you'll need a working copy of ffmpeg that has been built with VAAPI (h264_vaapi) support. There are plenty of resources out there for compiling your own.
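A quick way to confirm the build actually includes the VAAPI encoder and scale filter (my sanity check, not part of the original instructions):

```bash
# Should list the VAAPI H.264 encoder if the build supports it.
ffmpeg -hide_banner -encoders | grep h264_vaapi

# Should list the VAAPI scale filter used later in the videoFilter.
ffmpeg -hide_banner -filters | grep scale_vaapi
```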
ffmpeg.sh Script
The script acts as a middleman between homebridge and ffmpeg. It is needed to fix a few hardcoded elements in the ffmpeg command; it wouldn't be needed if the plugin gave the additional config control. Also, #504 is still an issue and should be reopened.
Update the CMD variable to be the actual path to ffmpeg.
What it does:
- -pix_fmt yuv420p is changed to -pix_fmt vaapi
- -filter:v <value> is removed from the command when a snapshot is being performed (see #504)
- scale filters are changed to scale_vaapi filters
#!/bin/bash
# Wrapper that sits between homebridge and ffmpeg: it rewrites the generated
# arguments for VAAPI and then execs the real ffmpeg binary.
CMD="/home/user/bin/ffmpeg"   # set this to the actual path of your ffmpeg build
SNAPSHOT="FALSE"

# Process every argument except the last one, which is appended untouched at the end.
while [[ $# -gt 1 ]]
do
    key="$1"
    # -frames:v only shows up in the snapshot command, so use it as a marker.
    if [[ "${key}" == "-frames:v" ]]; then
        SNAPSHOT="TRUE"
    fi
    case ${key} in
        -pix_fmt)
            # Append "-pix_fmt vaapi"; this shift plus the one after esac skip
            # both the flag and its original value (yuv420p).
            CMD+=" $1 vaapi"
            shift
            ;;
        -filter:v)
            # Keep -filter:v for streams (its value is appended on the next pass);
            # for snapshots drop both the flag and its value (see #504).
            if [[ "${SNAPSHOT}" == "FALSE" ]]; then
                CMD+=" $1"
            else
                shift
            fi
            ;;
        *)
            CMD+=" $1"
            ;;
    esac
    shift
done

# Rename software scale filters to their VAAPI equivalents.
CMD="${CMD//scale=/scale_vaapi=}"

# Run the rebuilt command; ${!#} is the last original argument (the output).
exec $CMD ${!#}
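To use it, make the script executable. If you want to see what it will actually run, one option (my suggestion, not part of the original post) is to temporarily set CMD to "echo" inside the script so the rewritten command is printed instead of executed:

```bash
chmod +x /home/user/bin/ffmpeg.sh

# With CMD temporarily set to "echo", this prints the rewritten command line:
# -pix_fmt becomes "-pix_fmt vaapi", scale= becomes scale_vaapi=, and the last
# argument is passed through unchanged.
/home/user/bin/ffmpeg.sh -pix_fmt yuv420p -filter:v scale=1280:720 -f mp4 pipe:1
```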
Plugin config:
Update the videoProcessor value to be the path to your ffmpeg.sh script.
This seems to work with and without videoFilter, but I have left it in for now as it's used in all VAAPI examples I've seen.
"platform": "Camera-ffmpeg",
"videoProcessor": "/home/user/bin/ffmpeg.sh",
"cameras": [
{
"name": "Front Camera",
"videoConfig": {
"source": "-vaapi_device /dev/dri/renderD128 -hwaccel vaapi -hwaccel_output_format vaapi -rtsp_transport tcp -i rtsp://axis-accc8e6d470a.local/axis-media/media.amp?streamprofile=Media?tcp",
"stillImageSource": "-i http://axis-accc8e6d470a.local/axis-cgi/jpg/image.cgi?resolution=1280x720",
"encoderOptions": "-bf 0",
"vcodec": "h264_vaapi",
"videoFilter": "format=nv12|vaapi,hwupload"
"maxBitrate": 0,
"packetSize": 940
}
}
Config notes:
- maxBitrate and packetSize are not essential, but gave me much higher quality streams.
- encoderOptions set to -bf 0 is the same as using the traditional bframes=0; without this the frame rate is very poor.
- If the H264 profile of your camera is baseline (normally the case with lower-power, cheaper devices), you might need to add -hwaccel_flags allow_profile_mismatch to the source value to have ffmpeg/vaapi ignore the profile and try to process the stream anyway (example after this list).
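For illustration, this is how I would add that flag to the source value from the config above (same example camera URL; the flag just needs to sit with the other input options, before -i):

```json
"source": "-vaapi_device /dev/dri/renderD128 -hwaccel vaapi -hwaccel_output_format vaapi -hwaccel_flags allow_profile_mismatch -rtsp_transport tcp -i rtsp://axis-accc8e6d470a.local/axis-media/media.amp?streamprofile=Media?tcp"
```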
View your live camera in the Home app, and then run intel_gpu_top on the host machine to confirm the GPU cores are being used. Success.
Interesting. I’ll have to look into this closer later, but I think I’d be willing to tweak this plugin so your wrapper isn’t needed.
Also, I could probably work to get vaapi support compiled into the ffmpeg-for-homebridge package, if it’s not there currently.
That would be awesome if possible, but I think there are licensing issues with distribution and ffmpeg-for-homebridge distributes static binaries.
Ah, it’s one of those things. I’ll see what I can find out.
Based on my quick looking, it sounds like it should be okay to include: https://trac.ffmpeg.org/wiki/HWAccelIntro#VAAPI
Great! That being included in ffmpeg-for-homebridge, plus any tweaks you can make to add additional plugin config flexibility and eliminate the need for the wrapper script, would definitely help people looking to improve their HomeKit camera experience, especially with HKSV on the way.
The only trick is that I won’t be able to test myself, as I don’t have any Intel CPUs around, but I can mirror what your script does and it should work.
I can do the testing for you. I'm thinking config-wise all we'd need is:
- scaleFilterName - defaulted to "scale"
- pixelFormat - defaulted to "yuv420p"
- stillImageFilter - defaulted to whatever videoFilter is set to (same as today), and used when doing snapshots rather than streaming
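Purely to illustrate that proposal, a camera's videoConfig might then carry something like the following; these three keys are hypothetical suggestions from this thread, not options the plugin actually exposes:

```json
"videoConfig": {
    "vcodec": "h264_vaapi",
    "pixelFormat": "vaapi",
    "scaleFilterName": "scale_vaapi",
    "stillImageFilter": "none"
}
```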
I have so many options that I was hoping to avoid adding more. I was considering handling this by switching those values out when the user sets vcodec to h264_vaapi, similar to how the plugin already handles the copy vcodec.
Or that works :) I just thought these config values would allow total flexibility, i.e. somebody might want to use a different hardware acceleration library.
There are a few that work already, h264_videotoolbox and h264_omx come to mind, but neither of those require any special settings.
Yeah, I did wonder how omx was handled. The main thing is the filters; they're named differently in VAAPI. Maybe hardcode those and the pixel format when using h264_vaapi.
I still think stillImageFilter as a config value would be useful, as the source could be entirely different. Support the none keyword like in videoFilter.
I’ll consider stillImageFilter, though I’m not sure I understand when that would really be needed? If it’s just for this (which it sounds like it works fine without it, so I’d want to look into it more), then I’d be better off triggering it with the vcodec setting.
Hmm, I don't actually have an issue with snapshots anymore when I disable that part of the wrapper script, as I don't need anything special in videoFilter.
I suppose an issue could arise if you had source set to your HW-accelerated RTSP stream and stillImageSource set to something like a JPEG/MJPEG capture, but needed some HW-acceleration-specific flags in videoFilter for source to work properly. That videoFilter would make no sense for the still image capture. This is probably an edge case though, and can be left for another day, if it ever arises.
So all we need, when vcodec is set to h264_vaapi, is:
- Use -pix_fmt vaapi instead of -pix_fmt yuv420p
- Use scale_vaapi= instead of scale= in the filters (only in resInfo.videoFilter, not resInfo.resizeFilter).
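To make the two substitutions concrete, here is a minimal before/after of the affected video arguments (the filter value is just an example, not the plugin's exact output):

```bash
# Before (plugin default, software path):
#   -pix_fmt yuv420p -filter:v scale=1280:720
# After (when vcodec is h264_vaapi):
#   -pix_fmt vaapi -filter:v scale_vaapi=1280:720
```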
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
@Sunoo Any chance we can reopen this? Confirm what if any changes you want and I'll include them.
Sorry, lost sight of this and the bot grabbed it. I’ll try to make these changes once HKSV is out.
I'm not sure how I'm only just discovering this now, but it seems force_original_aspect_ratio isn't working all of the time either, probably something to do with ffmpeg build options/library versions:
[Parsed_scale_vaapi_0 @ 0x556072fe9dc0] [error] Option 'force_original_aspect_ratio' not found
So I'm now using this videoFilter to prevent any other filters from being automatically added; the downside is that it loses the ability to scale the output to the configured or requested width/height.
"videoFilter": "none,format=nv12|vaapi,hwupload"
Or if scaling is needed, something like these:
"videoFilter": "none,format=nv12|vaapi,hwupload,scale_vaapi=iw/2:ih/2"
"videoFilter": "none,format=nv12|vaapi,hwupload,scale_vaapi=w=1920:h=1080"
I still need to use the wrapper script to change -pix_fmt yuv420p to -pix_fmt vaapi, and to remove -filter:v entirely when a snapshot is being taken.
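Putting the pieces above together, the camera's videoConfig would then look something like this (my consolidation of the snippets in this thread, using the fixed-size scale_vaapi variant; adjust the resolution as needed):

```json
"videoConfig": {
    "source": "-vaapi_device /dev/dri/renderD128 -hwaccel vaapi -hwaccel_output_format vaapi -rtsp_transport tcp -i rtsp://axis-accc8e6d470a.local/axis-media/media.amp?streamprofile=Media?tcp",
    "stillImageSource": "-i http://axis-accc8e6d470a.local/axis-cgi/jpg/image.cgi?resolution=1280x720",
    "encoderOptions": "-bf 0",
    "vcodec": "h264_vaapi",
    "videoFilter": "none,format=nv12|vaapi,hwupload,scale_vaapi=w=1920:h=1080",
    "maxBitrate": 0,
    "packetSize": 940
}
```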
It would be extremely helpful if VAAPI were directly available from the plugin. Any chance it will be integrated directly?
Following these steps on a Synology DS918+. Plex transcoding works fine, but here I get:
[8/14/2022, 10:23:10 PM] [Camera FFmpeg] [Vorgarten] [AVHWDeviceContext @ 0xc0d280] [error] No VA display found for device /dev/dri/renderD128.
[8/14/2022, 10:23:10 PM] [Camera FFmpeg] [Vorgarten] [error] Device creation failed: -22.
[8/14/2022, 10:23:10 PM] [Camera FFmpeg] [Vorgarten] [error] Failed to set value '/dev/dri/renderD128' for option 'vaapi_device': Invalid argument
[8/14/2022, 10:23:10 PM] [Camera FFmpeg] [Vorgarten] [fatal] Error parsing global options: Invalid argument
[8/14/2022, 10:23:10 PM] [Camera FFmpeg] [Vorgarten] FFmpeg exited with code: 1 and signal: null (Error)
First line: a permissions problem?
Synology DS918+, ffmpeg5 package: Impossible to convert between the formats supported by the filter 'Parsed_hwupload_1' and the filter 'auto_scale_0'
Can somebody help me please?
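Not a fix, but a few generic checks that usually narrow down a "No VA display found" / permissions issue; this assumes the vainfo tool (from libva-utils) is installed on the host:

```bash
# Does the render node exist, and which group owns it?
ls -l /dev/dri/renderD128

# Is the user running homebridge/ffmpeg a member of that group (often video or render)?
id

# Can libva open the device and list the supported VAAPI profiles at all?
vainfo
```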
Hey, is it possible to create the same kind of script to fix the integration with Camera UI too? It works well with homebridge-camera-ffmpeg, but not with Camera UI.
The Camera UI ffmpeg command looks like the one below; I don't know where it's breaking.
[13/08/2023, 01:24:49] [CameraUI] Campainha: Stream command: /usr/transcode/ffmpeg_convert.sh -hide_banner -loglevel verbose -vaapi_device /dev/dri/renderD128 -hwaccel vaapi -hwaccel_output_format vaapi -rtsp_transport tcp -i rtsp://admin:[email protected]:554/Streaming/channels/101 -an -sn -dn -r 30 -vcodec h264_vaapi -pix_fmt yuv420p -color_range mpeg -f rawvideo -preset ultrafast -tune zerolatency -filter:v scale='min(1280,iw)':'min(720,ih)':force_original_aspect_ratio=decrease,scale=trunc(iw/2)*2:trunc(ih/2)*2 -b:v 299k -bufsize 598k -maxrate 299k -payload_type 99 -ssrc 15007668 -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params +9eF1r951vfENlO4zJMcFyDgiyZMXABZ1SnxPHiq srtp://192.168.2.154:60989?rtcpport=60989&pkt_size=1318 -vn -sn -dn -acodec libfdk_aac -profile:a aac_eld -flags +global_header -f null -ar 16k -b:a 24k -ac 1 -payload_type 110 -ssrc 14493720 -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params HzoXAzE3kXcNiUQx2J6WrLzbAzYixjPGFmDC2mh1 srtp://192.168.2.154:59186?rtcpport=59186&pkt_size=188 -progress pipe:1
In the camera-ui configuration, under Options, you can configure the Video Processor. In there you could just put the script above, pointing at your custom-built ffmpeg. I now run on a different platform so can't really try this, but it looks like it should work.
I tried it; unfortunately it doesn't work. The same script works with homebridge-camera-ffmpeg.
This is what I got:
Campainha: FFmpeg videoanalysis process exited with error! (null)
Impossible to convert between the formats supported by the filter 'Parsed_fps_0' and the filter 'auto_scale_0'
[vf#0:0 @ 0x5604e30278c0] Error reinitializing filters!
Failed to inject frame into filter network: Function not implemented
Error while filtering: Function not implemented
[out#0/image2pipe @ 0x5604e319e980] Nothing was written into output file, because at least one of its streams received no packets.
Finally got HW working thanks to this post! (Synology package, not Docker)