jetson-frigate
libnvbuf_utils.so.1.0.0: cannot open shared object file: No such file or directory
Hi, after building on my Jetson Nano (NVIDIA SD card image, Ubuntu 18.04) according to your instructions, I get the following error when running Frigate:
ffmpeg.test.detect ERROR : ffmpeg: error while loading shared libraries: libnvbuf_utils.so.1.0.0: cannot open shared object file: No such file or directory
frigate.video INFO : test: ffmpeg sent a broken frame. memoryview assignment: lvalue and rvalue have different structures
frigate.video INFO : test: ffmpeg process is not running. exiting capture thread...
libnvbuf_utils.so is present on the Nano's host filesystem.
I checked the container filesystem, and it looks like two libraries are missing in my build:
ldd /usr/local/bin/ffmpeg
    linux-vdso.so.1 (0x0000007f8805b000)
    libavdevice.so.58 => /usr/local/lib/libavdevice.so.58 (0x0000007f87fc4000)
    libavfilter.so.7 => /usr/local/lib/libavfilter.so.7 (0x0000007f87d40000)
    libavformat.so.58 => /usr/local/lib/libavformat.so.58 (0x0000007f87b68000)
    libavcodec.so.58 => /usr/local/lib/libavcodec.so.58 (0x0000007f86c14000)
    libavresample.so.4 => /usr/local/lib/libavresample.so.4 (0x0000007f86bf6000)
    libpostproc.so.55 => /usr/local/lib/libpostproc.so.55 (0x0000007f86bde000)
    libswresample.so.3 => /usr/local/lib/libswresample.so.3 (0x0000007f86bbd000)
    libswscale.so.5 => /usr/local/lib/libswscale.so.5 (0x0000007f86b5a000)
    libavutil.so.56 => /usr/local/lib/libavutil.so.56 (0x0000007f868f1000)
    libm.so.6 => /lib/aarch64-linux-gnu/libm.so.6 (0x0000007f86844000)
    libpthread.so.0 => /lib/aarch64-linux-gnu/libpthread.so.0 (0x0000007f86814000)
    libc.so.6 => /lib/aarch64-linux-gnu/libc.so.6 (0x0000007f866a1000)
    /lib/ld-linux-aarch64.so.1 (0x0000007f8802b000)
    libxcb.so.1 => /lib/aarch64-linux-gnu/libxcb.so.1 (0x0000007f8666a000)
    libxcb-shm.so.0 => /usr/local/lib/libxcb-shm.so.0 (0x0000007f86657000)
    libxcb-shape.so.0 => /usr/local/lib/libxcb-shape.so.0 (0x0000007f86643000)
    libxcb-xfixes.so.0 => /usr/local/lib/libxcb-xfixes.so.0 (0x0000007f8662c000)
    libvidstab.so.1.1 => /usr/local/lib/libvidstab.so.1.1 (0x0000007f8660a000)
    libzmq.so.5 => /usr/local/lib/libzmq.so.5 (0x0000007f86565000)
    libfreetype.so.6 => /lib/aarch64-linux-gnu/libfreetype.so.6 (0x0000007f864a5000)
    libz.so.1 => /lib/aarch64-linux-gnu/libz.so.1 (0x0000007f8647b000)
    libssl.so.1.1 => /lib/aarch64-linux-gnu/libssl.so.1.1 (0x0000007f863e1000)
    libcrypto.so.1.1 => /lib/aarch64-linux-gnu/libcrypto.so.1.1 (0x0000007f86154000)
    libvpx.so.6 => /usr/local/lib/libvpx.so.6 (0x0000007f85f29000)
    libopencore-amrwb.so.0 => /usr/local/lib/libopencore-amrwb.so.0 (0x0000007f85f06000)
    libnvmpi.so.1 => /usr/local/lib/libnvmpi.so.1 (0x0000007f85eae000)
    libaom.so.0 => /usr/local/lib/libaom.so.0 (0x0000007f85a44000)
    libfdk-aac.so.1 => /usr/local/lib/libfdk-aac.so.1 (0x0000007f8599c000)
    libmp3lame.so.0 => /usr/local/lib/libmp3lame.so.0 (0x0000007f85900000)
    libopencore-amrnb.so.0 => /usr/local/lib/libopencore-amrnb.so.0 (0x0000007f858cd000)
    libopenjp2.so.7 => /usr/local/lib/libopenjp2.so.7 (0x0000007f8586c000)
    libopus.so.0 => /usr/local/lib/libopus.so.0 (0x0000007f8581a000)
    libtheoraenc.so.1 => /usr/local/lib/libtheoraenc.so.1 (0x0000007f857d6000)
    libtheoradec.so.1 => /usr/local/lib/libtheoradec.so.1 (0x0000007f857b1000)
    libvorbis.so.0 => /usr/local/lib/libvorbis.so.0 (0x0000007f8576e000)
    libvorbisenc.so.2 => /usr/local/lib/libvorbisenc.so.2 (0x0000007f856ba000)
    libwebp.so.7 => /usr/local/lib/libwebp.so.7 (0x0000007f8565d000)
    libx264.so.148 => /usr/local/lib/libx264.so.148 (0x0000007f854f7000)
    libx265.so.176 => /usr/local/lib/libx265.so.176 (0x0000007f85044000)
    libxvidcore.so.4 => /usr/local/lib/libxvidcore.so.4 (0x0000007f84f5a000)
    libkvazaar.so.4 => /usr/local/lib/libkvazaar.so.4 (0x0000007f84ee7000)
    libXau.so.6 => /lib/aarch64-linux-gnu/libXau.so.6 (0x0000007f84ed1000)
    libXdmcp.so.6 => /lib/aarch64-linux-gnu/libXdmcp.so.6 (0x0000007f84ebb000)
    libgomp.so.1 => /lib/aarch64-linux-gnu/libgomp.so.1 (0x0000007f84e6d000)
    libstdc++.so.6 => /lib/aarch64-linux-gnu/libstdc++.so.6 (0x0000007f84c8a000)
    libgcc_s.so.1 => /lib/aarch64-linux-gnu/libgcc_s.so.1 (0x0000007f84c65000)
    libpng16.so.16 => /lib/aarch64-linux-gnu/libpng16.so.16 (0x0000007f84c1f000)
    libdl.so.2 => /lib/aarch64-linux-gnu/libdl.so.2 (0x0000007f84c0b000)
    libnvbuf_utils.so.1.0.0 => not found
    libv4l2.so.0 => not found
    libogg.so.0 => /usr/local/lib/libogg.so.0 (0x0000007f84bf5000)
    libbsd.so.0 => /lib/aarch64-linux-gnu/libbsd.so.0 (0x0000007f84bcc000)
Same error here.
It looks like ffmpeg is missing some libraries that are not in /usr/local. I rigged up a script to gather and copy them into the container (roughly like the sketch at the end of this comment). Now ffmpeg works, I can see my camera's stream, and even the Coral TPU works, but if I add
hwaccel_args:
  - -c:v
  - h264_nvmpi
to actually make use of the compiled-in support for the NVIDIA GPU, I get the following error:
frigate.video INFO : test: ffmpeg sent a broken frame. memoryview assignment: lvalue and rvalue have different structures
frigate.video INFO : test: ffmpeg process is not running. exiting capture thread...
ffmpeg.test.detect ERROR : libv4l2: error getting capabilities: Inappropriate ioctl for device
I attached my Dockerfile for the "make jetson-ffmpeg" step, in case anybody else wants to try it.
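The copy script was essentially doing something like this (a rough, untested sketch from memory, not my exact script; the library path and the container name "frigate" are assumptions from my setup):

# list the libraries ffmpeg cannot resolve inside the container
docker exec frigate ldd /usr/local/bin/ffmpeg | grep 'not found' | awk '{print $1}' > missing.txt
# copy each one from the host; on JetPack the NVIDIA libs live under /usr/lib/aarch64-linux-gnu/tegra
while read -r lib; do
  src=$(find /usr/lib/aarch64-linux-gnu -name "$lib" 2>/dev/null | head -n 1)
  [ -n "$src" ] && docker cp -L "$src" frigate:/usr/local/lib/
done < missing.txt
# rebuild the linker cache inside the container
docker exec frigate ldconfig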
I believe I found the issue. The Docker image used for building ffmpeg is the Balena image for the Jetson Nano; however, the image that runs the binary is based on Ubuntu and the Frigate base:
FROM blakeblackshear/frigate-wheels:${WHEELS_VERSION}-${ARCH} as wheels
FROM blakeblackshear/frigate-ffmpeg:${FFMPEG_VERSION}-${ARCH} as ffmpeg
FROM frigate-web as web
FROM ubuntu:20.04
LABEL maintainer "[email protected]"
Moreover, the jetson_frigate Makefile target uses the base container and the aarch64 Dockerfile:
jetson_frigate: version web
    docker build --tag frigate-base --build-arg ARCH=aarch64 --build-arg FFMPEG_VERSION=1.0.0 --build-arg WHEELS_VERSION=1.0.3 --file docker/Dockerfile.base .
    docker build --tag frigate --file docker/Dockerfile.aarch64 .
I'm therefore not sure this would provide the required dependencies for running ffmpeg.
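One way to see what the runtime stage actually contains (the image tag "frigate" is assumed from the Makefile above; note there is no --runtime nvidia here, so nothing gets mapped in from the host):

docker run --rm --entrypoint /bin/sh frigate -c 'head -n 2 /etc/os-release; ls /usr/lib/aarch64-linux-gnu/tegra 2>/dev/null || echo "no Tegra libs baked into the image itself"'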
You must be doing something wrong; the Docker image works as expected:
root@jetson-nano:~/jetson-frigate# docker exec -it frigate ffmpeg -decoders|grep nvmpi
configuration: --enable-nvmpi --disable-debug --disable-doc --disable-ffplay --enable-shared --enable-avresample --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-gpl --enable-libfreetype --enable-libvidstab --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libxcb --enable-libx265 --enable-libxvid --enable-libx264 --enable-nonfree --enable-openssl --enable-libfdk_aac --enable-postproc --enable-small --enable-version3 --enable-libzmq --extra-libs=-ldl --prefix=/opt/ffmpeg --enable-libopenjpeg --enable-libkvazaar --enable-libaom --extra-libs=-lpthread --enable-v4l2_m2m --enable-neon --extra-cflags=-I/opt/ffmpeg/include --extra-ldflags=-L/opt/ffmpeg/lib
V..... h264_nvmpi (codec h264)
V..... hevc_nvmpi (codec hevc)
V..... mpeg2_nvmpi (codec mpeg2video)
V..... mpeg4_nvmpi (codec mpeg4)
V..... vp8_nvmpi (codec vp8)
V..... vp9_nvmpi (codec vp9)
I will upload them to Docker Hub soon.
What OS and JetPack version are you running this on? I tried JetPack 4.5.1, as shipped in the latest Jetson Nano NVIDIA SD card image, and it did not work. The container you use to build supports JetPack 4.4.x, as far as I understood from the Balena homepage. Since parts of the host OS are mapped into the container by the NVIDIA runtime, this mismatch will cause problems.
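For comparison, the L4T release can be checked on the host and inside the container (the file is only present in the container if the NVIDIA runtime maps it in; container name assumed to be frigate):

# on the Jetson host
cat /etc/nv_tegra_release
# inside the running container
docker exec -it frigate cat /etc/nv_tegra_release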
Try the newly uploaded Docker image:
docker pull nulldevil/frigate
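A minimal run command on the Nano should look something like this (volume paths, port, and shm size are just placeholders from a typical Frigate setup; the NVIDIA runtime is what maps the Tegra libraries into the container):

docker run -d --name frigate \
  --runtime nvidia --privileged \
  --shm-size=128m \
  -v /path/to/frigate/config:/config \
  -v /etc/localtime:/etc/localtime:ro \
  -p 5000:5000 \
  nulldevil/frigate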
Host system details:
Ubuntu 18.04.5 LTS (Bionic Beaver)
- NVIDIA Jetson Nano (Developer Kit Version)
* Jetpack 4.5.1 [L4T 32.5.1]
* NV Power Mode: MAXN - Type: 0
* jetson_stats.service: active
- Libraries:
* CUDA: 10.2.89
* cuDNN: 8.0.0.180
* TensorRT: 7.1.3.0
* Visionworks: 1.6.0.501
* OpenCV: 3.4.8 compiled CUDA: YES
* VPI: ii libnvvpi1 1.0.15 arm64 NVIDIA Vision Programming Interface library
* Vulkan: 1.2.70
I tested it: ffmpeg displays the h264_nvmpi decoder, but it crashes when used with Frigate. Here is the log output:
detector.coral INFO : Starting detection process: 36
frigate.app INFO : Camera processor started for kugelvision: 39
frigate.edgetpu INFO : Attempting to load TPU as usb
frigate.app INFO : Capture process started for kugelvision: 40
frigate.video INFO : kugelvision: ffmpeg sent a broken frame. memoryview assignment: lvalue and rvalue have different structures
frigate.video INFO : kugelvision: ffmpeg process is not running. exiting capture thread...
frigate.edgetpu INFO : TPU found
ffmpeg.kugelvision.detect ERROR : NvMMLiteOpen : Block : BlockType = 261
ffmpeg.kugelvision.detect ERROR : NVMEDIA: Reading vendor.tegra.display-size : status: 6
ffmpeg.kugelvision.detect ERROR : NvMMLiteBlockCreate : Block : BlockType = 261
frigate.http DEBUG : Received mqtt message on frigate/stats.
frigate.video INFO : kugelvision: ffmpeg sent a broken frame. memoryview assignment: lvalue and rvalue have different structures
frigate.video INFO : kugelvision: ffmpeg process is not running. exiting capture thread...
I am using a Google Vision camera, streaming with: v4l2rtspserver -F 15 -H 720 -W 1280 -P 8555 /dev/video0
Here is my camera config:
cameras:
  kugelvision:
    ffmpeg:
      hwaccel_args:
        - -c:v
        - h264_nvmpi
      inputs:
        - path: rtsp://192.168.178.77:8555/unicast
          roles:
            - detect
            - rtmp
    width: 1280
    height: 720
    fps: 10
It works properly when I remove the hwaccel_args, but to my knowledge that means the GPU is not being used.
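I guess one way to verify that would be to watch tegrastats on the host while the stream is running; the NVDEC clock should only show up while the hardware decoder is actually in use:

sudo tegrastats
# with h264_nvmpi really decoding, an NVDEC clock entry should appear and ffmpeg CPU usage should stay low
# with pure software decoding there is no NVDEC activity and ffmpeg CPU usage is much higher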
Yes, it does support the decoding feature, but further investigation is needed. You can play directly with ffmpeg to test it.
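For example, something along these lines exercises the decoder on its own, outside of Frigate (RTSP URL taken from the config above; -f null just discards the decoded frames):

docker exec -it frigate ffmpeg -loglevel info -c:v h264_nvmpi -i rtsp://192.168.178.77:8555/unicast -frames:v 100 -f null -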
Anyone get this working? I tried following the install instructions, but the make ffmpeg step errors out saying I have broken packages. I tried modifying the ffmpeg Dockerfile to use install_packages instead of apt-get and was able to pull in several dependencies, but there was one package it could not find (libbiffi, I think it was). So I gave up trying to build the Dockerfile myself, just did docker pull nulldevil/frigate, and got it running, but ffmpeg crashes with broken-frame errors. I might try switching the base image from Balena to something else to see if it has all the dependencies.