go2rtc
[Question] How to enable Hardware Acceleration? (Nvidia GPU)
As the title states: I am looking for a way to use my GPU to decode the video feed. As it stands, my CPU is at 80% for a single feed according to "Glances".
Thanks a lot for this project!
Are you using the ffmpeg stream source?
I do!
You can use the raw param, like in the docs for rotation:
ffmpeg:rtsp://rtsp:[email protected]/av_stream/ch0#raw=-vf transpose=1#video=h264
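For hardware decoding, the same #raw mechanism might accept the NVIDIA flags; this is an untested sketch (the camera URL and credentials are illustrative, and I am not certain #raw places these args before -i, which -hwaccel requires — treat it as a starting point, not a confirmed syntax):

```yaml
# go2rtc.yaml sketch: passing NVIDIA decode flags through #raw (untested assumption)
streams:
  front_door_camera:
    - ffmpeg:rtsp://user:pass@192.168.1.10/ch0#raw=-hwaccel cuda -hwaccel_output_format cuda#video=h264
```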
Hi there, Thanks for the answer!
I did find this command in the NVIDIA docs:
ffmpeg -y -vsync 0 -hwaccel cuda -hwaccel_output_format cuda -i input.mp4 -c:a copy -c:v h264_nvenc -b:v 5M output.mp4
But I am unsure on what parts I need to use after my feed input.
ffmpeg -y -vsync 0 -hwaccel cuda -hwaccel_output_format cuda -i:rtsp://user:[email protected]/h264Preview_01_main#raw= -c:a copy -c:v h264_nvenc
Is this going in the right direction? (It tells me the scheme is not supported.)
06:26:08.563 ERR [streams] probe producer error="unsupported scheme: ffmpeg -y -vsync 0 -hwaccel cuda -hwaccel_output_format cuda -i:rtsp://user:[email protected]/h264Preview_01_main#raw= -c:a copy -c:v h264_nvenc" url="ffmpeg -y -vsync 0 -hwaccel cuda -hwaccel_output_format cuda -i:rtsp://user:[email protected]/h264Preview_01_main#raw= -c:a copy -c:v h264_nvenc"
06:26:08.563 WRN [rtsp] error="couldn't find the matching tracks" stream=front_door_camera
Edit: I got this to work in Ubuntu:
ffmpeg -hwaccel cuda -hwaccel_output_format cuda -i rtsp://user:[email protected]/h264Preview_01_main -rtsp_transport tcp -use_wallclock_as_timestamps 1 -c:v h264_nvenc -b:v 5M output.mp4
But when trying to use
ffmpeg -hwaccel cuda -hwaccel_output_format cuda -i rtsp://user:[email protected]/h264Preview_01_main -rtsp_transport tcp -use_wallclock_as_timestamps 1 -c:v h264_nvenc -b:v 5M -f rtsp {output}
It does not work. It gives me instead:
07:29:49.099 ERR [streams] probe producer error=timeout url="exec:ffmpeg -hwaccel cuda -hwaccel_output_format cuda -i rtsp://user:[email protected]/h264Preview_01_main -rtsp_transport tcp -use_wallclock_as_timestamps 1 -c:v h264_nvenc -b:v 5M -f rtsp {output}"
07:29:49.099 WRN [rtsp] error="couldn't find the matching tracks" stream=front_door_camera
Any idea?
Are you using Hassio or another Linux?
Looks like Nvidia acceleration is very complicated. I will try to test it on my old server with CUDA support. https://docs.frigate.video/configuration/nvdec
I use it on a Linux VM (in Docker). The Docker NVIDIA stuff is already installed and works for Frigate on that VM.
Hi there, I gave a bit more tries to this, but I think it boils down to the NVIDIA libraries not being installed in the Docker install.
You could roll your own ffmpeg binary though and point to it in the go2rtc.yaml config?
I tried doing so with the binaries of go2rtc, but unfortunately I get this:
16:04:05.895 ERR [streams] probe producer error="unsupported scheme: ffmpeg -hwaccel cuda -hwaccel_output_format cuda -i : libvpx -b:v 1M -maxrate 4M -c:a libvorbis#video=vp8" url="ffmpeg -hwaccel cuda -hwaccel_output_format cuda -i :rtsp://admin:@192.168.0.210/h264Preview_01_main#raw=-c:v libvpx -b:v 1M -maxrate 4M -c:a libvorbis#video=vp8"
What I meant was: you can grab or build an ffmpeg binary with the required hwaccels, drop it in the configuration directory, and use
ffmpeg:
  bin: ffmpeg
to point to the path of that binary. The one packaged with go2rtc (and Frigate too) doesn't come with that, but there is a build that does. You can probably grab it from the Frigate Docker image:
https://docs.frigate.video/configuration/nvdec/
Although I am not sure that specific one is what you need. NVIDIA gives instructions on how to build one, though.
@maxi1134 read about exec source
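The exec source mentioned above runs a full command line and hands go2rtc the result via the {output} placeholder, so the earlier NVIDIA command could be wired up roughly like this (untested sketch; URL and credentials are illustrative — note that -rtsp_transport tcp is an input option in ffmpeg and must come *before* -i to take effect, which may be why the earlier attempt timed out):

```yaml
# go2rtc.yaml sketch using the exec source (paths/credentials illustrative)
streams:
  front_door_camera: exec:ffmpeg -rtsp_transport tcp -hwaccel cuda -hwaccel_output_format cuda -i rtsp://user:pass@192.168.1.10/h264Preview_01_main -c:v h264_nvenc -b:v 5M -f rtsp {output}
```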
I tried a lot in the last 6 days, too much to actually list, but nothing seems to work.
Is anyone looking for a tip/coffee in exchange for help?
I tried this as well, doing what @calisro suggested. I compiled a binary with the flags NVIDIA recommended, but simply dropping the binary in doesn't work. There are missing libraries when I run ldd on the mounted binary:
Error relocating ffmpeg: av_buffersink_get_ch_layout: symbol not found
Error relocating ffmpeg: __vsnprintf_chk: symbol not found
Error relocating ffmpeg: av_fifo_freep2: symbol not found
Error relocating ffmpeg: __fprintf_chk: symbol not found
Error relocating ffmpeg: av_channel_layout_describe: symbol not found
Error relocating ffmpeg: av_channel_layout_check: symbol not found
Error relocating ffmpeg: av_channel_layout_copy: symbol not found
Error relocating ffmpeg: av_fifo_grow2: symbol not found
Error relocating ffmpeg: av_fifo_can_write: symbol not found
Error relocating ffmpeg: av_channel_layout_index_from_channel: symbol not found
Error relocating ffmpeg: av_fifo_write: symbol not found
Error relocating ffmpeg: av_channel_layout_standard: symbol not found
Error relocating ffmpeg: av_channel_layout_describe_bprint: symbol not found
Error relocating ffmpeg: av_channel_layout_compare: symbol not found
Error relocating ffmpeg: av_fifo_peek: symbol not found
Error relocating ffmpeg: av_fifo_can_read: symbol not found
Error relocating ffmpeg: __vfprintf_chk: symbol not found
Error relocating ffmpeg: av_channel_layout_uninit: symbol not found
Error relocating ffmpeg: av_fifo_drain2: symbol not found
Error relocating ffmpeg: av_channel_description: symbol not found
Error relocating ffmpeg: __printf_chk: symbol not found
Error relocating ffmpeg: av_channel_layout_from_mask: symbol not found
Error relocating ffmpeg: av_fifo_alloc2: symbol not found
Error relocating ffmpeg: av_channel_name: symbol not found
Error relocating ffmpeg: av_channel_layout_from_string: symbol not found
Error relocating ffmpeg: __snprintf_chk: symbol not found
Error relocating ffmpeg: av_channel_layout_default: symbol not found
Error relocating ffmpeg: av_fifo_read: symbol not found
Not quite sure what to do from here, but still trying.
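For what it's worth, relocation errors like __printf_chk: symbol not found usually mean a glibc-built binary is being run on a musl-based image (go2rtc's container is Alpine), and the av_fifo_*/av_channel_layout_* ones suggest older FFmpeg shared libraries being picked up at runtime. A fully static build side-steps both; the path below is an assumption, not a known-good config:

```yaml
# go2rtc.yaml sketch: point at a static ffmpeg dropped into the config dir
# (path is illustrative; the binary must be statically linked to run on Alpine)
ffmpeg:
  bin: /config/ffmpeg
```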
@maxi1134 Since hw acceleration is not supported in go2rtc, my plan is to configure my stream in this order:
- Frigate as the entry point for the RTSP stream.
- Configure Frigate to perform the encoding for the RTMP stream:
# Optional: ffmpeg configuration
ffmpeg:
  # Optional: global ffmpeg args (default: shown below)
  global_args: -hide_banner -loglevel warning
  # Optional: global hwaccel args (default: shown below)
  # NOTE: See hardware acceleration docs for your specific device
  hwaccel_args: []
  # Optional: global input args (default: shown below)
  input_args: -avoid_negative_ts make_zero -fflags +genpts+discardcorrupt -rtsp_transport tcp -timeout 5000000 -use_wallclock_as_timestamps 1
  # Optional: global output args
  output_args:
    # Optional: output args for detect streams (default: shown below)
    detect: -f rawvideo -pix_fmt yuv420p
    # Optional: output args for record streams (default: shown below)
    record: -f segment -segment_time 10 -segment_format mp4 -reset_timestamps 1 -strftime 1 -c copy -an
    # Optional: output args for rtmp streams (default: shown below)
    rtmp: -c copy -f flv
- Remove camera configuration from HA config
- Use the Frigate integration into HA to add cameras
- go2rtc along with RTSPtoWebRTC to "on the fly" convert these cameras to WebRTC
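If Frigate is doing the transcode, the go2rtc side of that plan might just consume Frigate's restream; a minimal sketch, assuming the hostname and Frigate's default RTSP restream port (both are assumptions here):

```yaml
# go2rtc.yaml sketch: consume Frigate's RTSP restream instead of the camera directly
streams:
  front_door_camera: rtsp://frigate:8554/front_door_camera
```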
Hw accel should work just fine in go2rtc, just use a prebuilt binary like BtbN's or Jellyfin's. We're also building go2rtc directly into Frigate in the next release.
@NickM-27 does this mean the streams from frigate will be webrtc instead of rtmp?
webRTC will be supported as a way to view in the frigate frontend and streams will be restreamed using rtsp
That makes more sense than what I said, thanks. I'll look into a prebuilt binary; I think I was so focused on building one that I didn't think about that.
Does this mean that my camera would do something like this:
camera -> Frigate (with WebRTC) -> Home Assistant?
Instead of doing
camera -> WebRTC -> Frigate
in parallel with
camera -> WebRTC -> Home Assistant?
The Frigate card should be able to support webRTC from Frigate directly; otherwise, in HA you'll need to use Frigate's RTSP stream along with rtsp2webrtc or the WebRTC card.
A-fucking-one.
That is great news; transcoding, here we come. My 4K cameras do not play well with my Samsung S7 FE.
Yeah, no kidding, I can't stream 2K even on my Pixel 6. @AlexxIT any intent to support GPU hw acceleration in ffmpeg OOTB with go2rtc? If not, I think we can close this issue.
The first priority is stable operation. No one needs hardware acceleration on a dropping connection.
@AlexxIT, can we have some kind of global option we can set in go2rtc.yaml, like:
hw_accel_args: -whatever_args -i_want
that go2rtc would automatically inject into all ffmpeg calls without having to repeat them? Similar to how Frigate handles it.
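For context, the absence of such an option means repeating the raw args on every stream today, roughly like this (untested; args, URLs, and stream names are illustrative):

```yaml
# today: per-stream repetition that a global hw_accel_args option would avoid
streams:
  cam1: ffmpeg:rtsp://192.168.1.10/ch0#raw=-c:v h264_nvenc#video=h264
  cam2: ffmpeg:rtsp://192.168.1.11/ch0#raw=-c:v h264_nvenc#video=h264
```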
It depends what you're doing. Frigate only does one way decoding. I imagine go2rtc would be a lot more transcoding which would require 2 sets of args and may be a bit more complicated.
Got it.
There's another thing: sometimes we need to add environment variables for the hwaccel to work correctly (as stated here, with LIBVA_DRIVER_NAME). It would be nice if this could be sorted out for the add-on users, since you can't pass arbitrary env vars like you can with docker run.
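Outside the add-on, the variable can be set in plain Docker; a docker-compose sketch (image name and device path are assumptions, adjust for your setup):

```yaml
services:
  go2rtc:
    image: alexxit/go2rtc
    environment:
      - LIBVA_DRIVER_NAME=i965
    devices:
      - /dev/dri:/dev/dri   # Intel VAAPI render device
```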
Maybe during the initialization of the Go binary you could read an environment: block in go2rtc.yaml and call setenv to have it applied to all underlying processes. Again, similar to Frigate (at least from the user perspective).
It would probably be easier to just provide a LIBVA_DRIVER_NAME env variable in the add-on and default it to i965.
If that does no harm at all, yes, much better! Frigate could do it as well then. For now I have cloned the alexxit-hassio-addons repo into my /addons, and I'm adding:
environment:
  LIBVA_DRIVER_NAME: i965
to its config.yaml. And of course installing go2rtc from there.
Actually this may be trickier than I thought. The Frigate image seems to add lots of different dependencies depending on the architecture it's building for:
https://github.com/blakeblackshear/frigate/blob/47c1985c265f060daab8b6d492fdf424c20fe123/docker/Dockerfile#L95
Most likely hwaccel won't work for me unless these dependencies are installed. Worse, go2rtc currently runs on Alpine, which makes things harder, as I can't simply copy what Frigate is doing.
I think I won't try to use hwaccel in go2rtc for now.