
QMPlay2 seems to be missing V4L2 mem2mem decoder wrapper support

usual-user opened this issue 2 years ago • 18 comments

When I run e.g. "mpv --vd=help", it enumerates several hardware-accelerated decoder wrappers that ffmpeg exposes:

   h263_v4l2m2m (h263) - V4L2 mem2mem H.263 decoder wrapper
   h264_v4l2m2m (h264) - V4L2 mem2mem H.264 decoder wrapper
   hevc_v4l2m2m (hevc) - V4L2 mem2mem HEVC decoder wrapper
   mpeg4_v4l2m2m (mpeg4) - V4L2 mem2mem MPEG4 decoder wrapper
   mpeg1_v4l2m2m (mpeg1video) - V4L2 mem2mem MPEG1 decoder wrapper
   mpeg2_v4l2m2m (mpeg2video) - V4L2 mem2mem MPEG2 decoder wrapper
   vc1_v4l2m2m (vc1) - V4L2 mem2mem VC1 decoder wrapper
   vp8_v4l2m2m (vp8) - V4L2 mem2mem VP8 decoder wrapper
   vp9_v4l2m2m (vp9) - V4L2 mem2mem VP9 decoder wrapper

"mpv --hwdec=auto --hwdec-codecs=all [file|URL|PLAYLIST|-]" automatically selects the correct decoder and plays videos hardware-accelerated. Mpv did not need any modification although e.g. the VP9 support was subsequently added to ffmpeg. QMPlayer2 seems to only select software ffmpeg decoders that lead to violent frame drops. Is it possible to use the hardware-accelerated ffmpeg decoders in QMplay2 as well?

usual-user • Dec 28 '21 22:12

QMPlay2 now supports VA-API (zero copy), VDPAU, CUVID and D3D11VA. Is that what is needed for the R-Pi?

zaps166 • Dec 28 '21 22:12

No, this is the aarch64 architecture in the SOC world. VA-API (zero copy), VDPAU, CUVID and D3D11VA are for the IBM PC architecture, where the GPU is also the scan-out engine. The SOC world consists of components from different IPs (intellectual property blocks) that usually communicate with each other via memory buffers; dmabuf is used to pass them around efficiently (zero copy). The display scan-out engine is usually exposed via the KMS/DRM subsystem, GPUs are exposed as render nodes (/dev/dri/renderD128), and decoders, encoders, colorspace converters, scalers etc. usually as V4L2 mem2mem devices (/dev/video0, video1, ...). See videoX-infos.txt for what my RK3399 currently exposes.

In the SOC world it makes absolutely no sense to send data in video format to the GPU, as the GPU usually does not even support the data format, and 3D GPU emulation is never more efficient than dedicated hardware (a separate IP). It therefore makes more sense to pass the decoder output directly to the compositor (Wayland), which can then output it accelerated via KMS/DRM.

Frameworks such as gstreamer and ffmpeg typically wrap the V4L2 mem2mem devices for use by applications. Once an application has implemented the API of its framework, it works for all devices without further intervention as soon as the kernel exposes the functionality. That is why gst-play-1.0 and mpv work for me without any special adjustments, since the kernel provides support for my SOC.
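As a rough illustration of how the kernel exposes these devices, a minimal sketch (the /dev/video0 path is only an example; real code iterates over /dev/video*) of asking a node whether it is a mem2mem device:

// Sketch: query a /dev/videoN node and report whether the kernel exposes it
// as a V4L2 mem2mem device (decoder/encoder/scaler), as described above.
#include <fcntl.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/videodev2.h>
#include <cstdio>

int main()
{
    const char *node = "/dev/video0"; // example node only
    int fd = open(node, O_RDWR);
    if (fd < 0)
        return 1;
    v4l2_capability cap{};
    if (ioctl(fd, VIDIOC_QUERYCAP, &cap) == 0) {
        const bool m2m = cap.capabilities & (V4L2_CAP_VIDEO_M2M | V4L2_CAP_VIDEO_M2M_MPLANE);
        std::printf("%s: driver=%s card=%s mem2mem=%s\n", node,
                    reinterpret_cast<const char *>(cap.driver),
                    reinterpret_cast<const char *>(cap.card),
                    m2m ? "yes" : "no");
    }
    close(fd);
    return 0;
}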

usual-user • Dec 29 '21 00:12

Nice, on Linux we have: VA-API, VDPAU, CUVID, NVDEC, V4L2M2M, OpenMAX, XvMC (obsolete), XvBA (obsolete), Vulkan (new video decoder), ... We need even more APIs to video decode! But at least V4L2M2M looks like it's the most integrated with Linux.

Anyway, good idea.

zaps166 • Dec 29 '21 11:12

V4L2M2M is exposed directly by the mainline kernel, so as soon as a new IP receives mainline support, userspace support is automatically available if it already existed for another IP. Kernel 5.17.0 will include new support for stateless VP9 hardware video decoders on multiple SOCs. The gstreamer framework already supports this in its main branch, i.e. with the 1.20 release it is available by default. As an early adopter this is already working for me, see video-pipeline-vp9.pdf for reference; v4l2slvp9dec is the interesting element. FFmpeg unfortunately still needs some out-of-tree patches for this to work, but it should only be a matter of time before mainline support becomes available. The AVC (h264) decoder should already be available.

I haven't yet understood how the ranking for selecting a decoder works in the ffmpeg framework, but it should end up being just the selection of one specific ffmpeg decoder element, either:

   h264 - H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10
   h264_v4l2m2m (h264) - V4L2 mem2mem H.264 decoder wrapper

Unfortunately, I cannot estimate what effort this means for QMPlay2. FWIW I don't do anything with Raspberry Pi, I just stumbled across this post, but everything said also applies to the Raspberry Pi.
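A minimal sketch of what that selection could look like through the libavcodec API (only a guess at the approach, not FFmpeg's internal ranking):

// Sketch: prefer the V4L2 mem2mem wrapper when libavcodec was built with it,
// otherwise fall back to the plain software H.264 decoder. This only checks
// that the wrapper is compiled in, not that a hardware device is present.
extern "C" {
#include <libavcodec/avcodec.h>
}

static const AVCodec *pickH264Decoder()
{
    if (const AVCodec *hw = avcodec_find_decoder_by_name("h264_v4l2m2m"))
        return hw;
    return avcodec_find_decoder(AV_CODEC_ID_H264);
}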

usual-user • Dec 29 '21 14:12

I'll look at this API on my R-Pi 4 (I can see a lot of /dev/video devices). For dmabuf I guess I need to make Vulkan work first on the R-Pi and then I can try V4L2M2M. Do you have Vulkan on the RK3399?

zaps166 • Dec 29 '21 19:12

If your ffmpeg framework is built with v4l2 support, it takes over the V4L2M2M handling for you and you only have to use the V4L2 mem2mem decoder wrapper. Check whether your ffmpeg framework is built with --enable-v4l2* and --enable-libudev, see ffmpeg.log for reference. When everything is in place, ffmpeg -c:v h264 ... and ffmpeg -c:v h264_v4l2m2m ... switch between software and hardware decoding. If QMPlay2 gets similar selection functionality, nothing else should be required. Of course the selection should happen automatically, depending on hardware accelerator availability; mpv also does this with the above-mentioned parameters.

No Vulkan is required in this context. This is just about which IP performs the video decoding, and we want the dedicated hardware accelerator. The display of the decoder output is a separate matter and should be secondary under Wayland for the time being. BTW, you can inspect your devices via v4l2-ctl /dev/video1 --all from v4l-utils.
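One possible way such automatic selection could be probed - a sketch only, not how mpv or QMPlay2 actually do it - is to try opening the wrapper once and fall back to software if no V4L2 device can be configured:

// Sketch: probe whether the h264_v4l2m2m wrapper can actually be opened
// (i.e. a matching /dev/video* decoder exists), falling back to "h264"
// otherwise. Dummy dimensions are used here; a real probe would take the
// parameters from the stream being played.
extern "C" {
#include <libavcodec/avcodec.h>
}

static bool decoderOpens(const char *name)
{
    const AVCodec *codec = avcodec_find_decoder_by_name(name);
    if (!codec)
        return false;
    AVCodecContext *ctx = avcodec_alloc_context3(codec);
    if (!ctx)
        return false;
    ctx->width = 1920;
    ctx->height = 1080;
    const bool ok = avcodec_open2(ctx, codec, nullptr) == 0;
    avcodec_free_context(&ctx);
    return ok;
}

// Usage (hypothetical): pick "h264_v4l2m2m" if decoderOpens("h264_v4l2m2m") succeeds, else "h264".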

usual-user • Dec 29 '21 22:12

I can't decode and/or encode. Am I doing something wrong? Arch Linux ARM for R-Pi 4.

$ ffmpeg -y -c:v h264_v4l2m2m -i test.mkv -c:v h264_v4l2m2m -an test2.mkv
ffmpeg version n4.4.1 Copyright (c) 2000-2021 the FFmpeg developers
  built with gcc 10.2.0 (GCC)
  configuration: --prefix=/usr --disable-debug --disable-static --disable-stripping --enable-amf --enable-avisynth --enable-cuda-llvm --enable-fontconfig --enable-gmp --enable-gnutls --enable-gpl --enable-ladspa --enable-libass --enable-libbluray --enable-libdav1d --enable-libdrm --enable-libfreetype --enable-libfribidi --enable-libgsm --enable-libiec61883 --enable-libjack --enable-libmodplug --enable-libmp3lame --enable-libopencore_amrnb --enable-libopencore_amrwb --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-librsvg --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libssh --enable-libtheora --enable-libv4l2 --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxcb --enable-libxml2 --enable-libxvid --enable-libzimg --enable-shared --enable-version3 --host-cflags='"-fPIC"'
  libavutil      56. 70.100 / 56. 70.100
  libavcodec     58.134.100 / 58.134.100
  libavformat    58. 76.100 / 58. 76.100
  libavdevice    58. 13.100 / 58. 13.100
  libavfilter     7.110.100 /  7.110.100
  libswscale      5.  9.100 /  5.  9.100
  libswresample   3.  9.100 /  3.  9.100
  libpostproc    55.  9.100 / 55.  9.100
Input #0, matroska,webm, from 'test.mkv':
  Metadata:
    ENCODER         : Lavf58.76.100
  Duration: 00:02:24.92, start: 0.000000, bitrate: 4143 kb/s
  Stream #0:0: Video: h264 (High), yuv420p(tv, bt709, progressive), 1920x1080 [SAR 1:1 DAR 16:9], 60 fps, 60 tbr, 1k tbn, 120 tbc (default)
    Metadata:
      DURATION        : 00:02:24.907000000
  Stream #0:1: Audio: opus, 48000 Hz, stereo, fltp (default)
    Metadata:
      DURATION        : 00:02:24.921000000
[h264_v4l2m2m @ 0x20f8c90] Using device /dev/video10
[h264_v4l2m2m @ 0x20f8c90] driver 'bcm2835-codec' on card 'bcm2835-codec-decode' in mplane mode
[h264_v4l2m2m @ 0x20f8c90] requesting formats: output=H264 capture=YU12
Stream mapping:
  Stream #0:0 -> #0:0 (h264 (h264_v4l2m2m) -> h264 (h264_v4l2m2m))
Press [q] to stop, [?] for help
Error while decoding stream #0:0: Resource temporarily unavailable
    Last message repeated 8672 times
[h264_v4l2m2m @ 0x20c7f60] Using device /dev/video11
[h264_v4l2m2m @ 0x20c7f60] driver 'bcm2835-codec' on card 'bcm2835-codec-encode' in mplane mode
[h264_v4l2m2m @ 0x20c7f60] requesting formats: output=YU12 capture=H264
[h264_v4l2m2m @ 0x20c7f60] Failed to set gop size: Invalid argument
Could not write header for output file #0 (incorrect codec parameters ?): Invalid data found when processing input
Error initializing output stream 0:0 -- 
Conversion failed!

zaps166 • Dec 30 '21 00:12

If I interpret the log output correctly, there is support for stateful decoders in your ffmpeg framework. So far I have only used the ffmpeg framework very superficially and am hardly familiar with the corresponding tools. My use cases are mostly based on the gstreamer framework, as it is usually much more advanced for my needs. Just recently, when the gstreamer support for stateless decoders was temporarily broken, I turned to FFmpeg to test some stateless decoder patches. My experience is mostly based on the use of mpv.

Can you possibly run mpv --hwdec=auto --hwdec-codecs=all test.mkv? After pressing Shift+I you should get output similar to mine for h264; the interesting part is hwdec drm-copy. In my environment mpv performs similarly to gst-play-1.0, so I conclude that the VPU is used. Unfortunately, I have not yet been able to figure out how to create debug output as meaningful as video-pipeline-vp9.pdf to see what the video pipeline in use looks like.

Because mpv has no graphical user interface, I came across QMPlay2 and would like to use it in the future, but this is only possible if we can work out the VPU support.

usual-user • Dec 30 '21 09:12

Unfortunately, something is wrong on my R-Pi and it doesn't work at all.

zaps166 • Dec 30 '21 20:12

I am sorry to hear that. I don't know what shape your distribution is in for the Raspberry Pi platform, but if I'm not mistaken, Fedora (my preferred distribution) doesn't have official support for it so far; the Raspberry Pi platform doesn't have enough mainline support to be officially supported even at a basic level. I know some users have made it work with some tinkering, but it takes some effort. I can certainly help with VPU support, but due to the other challenges I don't see an easy solution here. Because of the low maturity, at least the latest mainline software release versions are required, if not pending patches on top.

If you are interested in further investigation, I can give some advice on how to proceed with your distribution. If an up-to-date gstreamer framework is available in your distribution, I would like to use it to collect some debugging information about the system. To do this, execute the following command:

GST_DEBUG_DUMP_DOT_DIR=. gst-play-1.0 test.mkv

If you provide me with the *.dot files that were created after the program ends, they should give me the information needed to estimate whether VPU support should also be possible with the ffmpeg framework. Alternatively, you can visualize them yourself beforehand with the dot -Tpdf <file>.dot > <file>.pdf command from the Graphviz package.

usual-user • Dec 31 '21 12:12

I tried again and it's working, but it doesn't work on some H264 videos (especially YouTube videos).

zaps166 • Jan 01 '22 19:01

I looked into the FFmpeg V4L2M2M code - to me it looks poor; it doesn't expose any way to use DMABUF. I also hit some bugs when using it (sometimes a software freeze or audio desync). Somebody tried to add the required API to use DMABUF in 2018, but it was never merged.
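For context, "using DMABUF" here means exporting the decoder's CAPTURE buffers as dmabuf file descriptors so a renderer can import them without copying. A minimal sketch of that export step (assuming the capture queue was already set up elsewhere):

// Sketch: export one plane of an already-allocated decoded CAPTURE buffer as
// a dmabuf fd via VIDIOC_EXPBUF. The fd could then be imported by the display
// path zero-copy. Setup of the capture queue is assumed to have happened
// elsewhere.
#include <fcntl.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

static int exportDecodedPlane(int videoFd, unsigned bufferIndex, unsigned plane)
{
    v4l2_exportbuffer expbuf{};
    expbuf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE;
    expbuf.index = bufferIndex;
    expbuf.plane = plane;
    expbuf.flags = O_RDONLY | O_CLOEXEC;
    if (ioctl(videoFd, VIDIOC_EXPBUF, &expbuf) != 0)
        return -1; // export failed (driver may not support EXPBUF)
    return expbuf.fd; // dmabuf file descriptor, to be closed by the caller
}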

If you want to try V4L2M2M in QMPlay2, use master branch and put:

if (streamInfo.codec_name == "h264")
    streamInfo.codec_name = "h264_v4l2m2m"

before line: https://github.com/zaps166/QMPlay2/blob/3fd30853a0dc1ac89b0a70895e9e9e6b47f21dde/src/modules/FFmpeg/FFDec.cpp#L73

The best way is to add V4L2M2M API to QMPlay2 directly (don't use FFmpeg).
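A minimal sketch of the very first step such a direct integration would take (the /dev/video10 node is taken from the R-Pi log above; buffer allocation, queueing, streaming and the capture-side negotiation after V4L2_EVENT_SOURCE_CHANGE are omitted):

// Sketch: open the decoder node and set the coded (OUTPUT) format - the rest
// of the stateful V4L2 mem2mem decode flow would follow from here.
#include <fcntl.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/videodev2.h>
#include <cstdio>

int main()
{
    int fd = open("/dev/video10", O_RDWR); // decoder node from the R-Pi log above
    if (fd < 0)
        return 1;

    v4l2_format fmt{};
    fmt.type = V4L2_BUF_TYPE_VIDEO_OUTPUT_MPLANE;    // coded-data side of the m2m device
    fmt.fmt.pix_mp.pixelformat = V4L2_PIX_FMT_H264;  // we will feed H.264 bitstream
    fmt.fmt.pix_mp.num_planes = 1;
    fmt.fmt.pix_mp.plane_fmt[0].sizeimage = 1 << 20; // size of one coded buffer
    if (ioctl(fd, VIDIOC_S_FMT, &fmt) != 0)
        std::perror("VIDIOC_S_FMT");

    close(fd);
    return 0;
}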

zaps166 • Jan 01 '22 22:01

There are many reasons why H264 videos cannot be played. As you have noticed yourself, the ffmpeg V4L2M2M support is quite poor. Therefore, I would still like to see gstreamer debug output from your environment. If I see it correctly, the gst-play-1.0 binary is not included in your distribution packages. It is ultimately just a sample program that comes with the gstreamer1-plugins-base source to demonstrate how to program with the gstreamer framework. But gstreamer offers gst-launch-1.0, so let's create the video pipeline by hand. To do this, please execute this command:

GST_DEBUG_DUMP_DOT_DIR=. gst-launch-1.0 playbin uri=file://${PWD}/test.mkv

Repeat this also for non-working media files and it should give us an idea of the cause.

However, implementing the V4L2M2M API in QMPlay2 is the wrong approach here; it is the duty of the media framework to handle it. I just discovered how to gather some video playback statistics while playing online videos with my web browser. Falkon is a web browser based on QtWebEngine (which is itself based on the Chromium core, i.e., Blink) and the Qt framework. I still don't know which backend is used, but since I can play e.g. three videos in parallel flawlessly, I'm pretty sure the VPU is used in my case. If I interpret this correctly, Qt uses gstreamer as the backend on Unix platforms; others use other things. So applications that use the MediaService plugins do not need to know about the backends and get the corresponding support for free. The gstreamer framework can use FFmpeg codecs, but the other way around doesn't work as far as I know. At least I can't otherwise explain why ffmpeg-based applications still lack functionality that gstreamer can deliver. BTW, YouTube.png is the result of this code.

usual-user • Jan 02 '22 19:01

Ok, I rebuilt QMPlay2 from master branch and applied this patch:

--- src/modules/FFmpeg/FFDec.cpp.orig   2022-01-02 21:35:22.000000000 +0100
+++ src/modules/FFmpeg/FFDec.cpp        2022-01-02 22:48:05.343435230 +0100
@@ -70,6 +70,13 @@ void FFDec::clearFrames()

 AVCodec *FFDec::init(StreamInfo &streamInfo)
 {
+    if (streamInfo.codec_name == "h264")
+        streamInfo.codec_name = "h264_v4l2m2m";
+    if (streamInfo.codec_name == "vp8")
+        streamInfo.codec_name = "vp8_v4l2m2m";
+    if (streamInfo.codec_name == "vp9")
+        streamInfo.codec_name = "vp9_v4l2m2m";
+
     AVCodec *codec = avcodec_find_decoder_by_name(streamInfo.codec_name);
     if (codec)
     {

Judging by the information window, the respective wrapper seems to be used. However, no video output is rendered. If your suggestion is applied verbatim, the compiler complains about a missing ;.

usual-user • Jan 02 '22 23:01

If your suggestion is applied verbatim, the compiler complains about a missing ;.

True, my mistake.

However, no video output is rendered.

if you use "h264" does it output video? Do you use Vulkan or OpenGL? Is stream enabled, but no video (screenshot)? Does it work with ffplay and forcing v4l2 codec? Do you use h264 or other codecs, too (I can't test others, because R-Pi supports only H264 and MJPEG)?


I have gstreamer and it also didn't play my test.mkv - only audio. I'll check different files with gstreamer and the dot graph later!


I think I could try to add V4L2M2M - it looks like a nice and simple API; with this I will be able to try zero-copy and check whether FFmpeg has bugs in its implementation or it's a bug in the R-Pi driver. Also, one more API to learn :smile: If it gets complicated and I have a lot of problems, I'll stop the implementation (there is no PC using it, only ARM boards so far, unfortunately). I hope V4L2M2M will be used by Intel, AMD and finally - NVIDIA.

zaps166 • Jan 02 '22 23:01

if you use "h264" does it output video? Sound is playing, no video rendered in Video window. QMplay2 Do you use Vulkan or OpenGL?

$ glxinfo -B
name of display: :0
display: :0  screen: 0
direct rendering: Yes
Extended renderer info (GLX_MESA_query_renderer):
    Vendor: Panfrost (0xffffffff)
    Device: Mali-T860 (Panfrost) (0xffffffff)
    Version: 21.3.0
    Accelerated: yes
    Video memory: 3834MB
    Unified memory: yes
    Preferred profile: core (0x1)
    Max core profile version: 3.1
    Max compat profile version: 3.1
    Max GLES1 profile version: 1.1
    Max GLES[23] profile version: 3.1
OpenGL vendor string: Panfrost
OpenGL renderer string: Mali-T860 (Panfrost)
OpenGL core profile version string: 3.1 Mesa 21.3.0
OpenGL core profile shading language version string: 1.40
OpenGL core profile context flags: (none)

OpenGL version string: 3.1 Mesa 21.3.0
OpenGL shading language version string: 1.40
OpenGL context flags: (none)

OpenGL ES profile version string: OpenGL ES 3.1 Mesa 21.3.0
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.10

$

On plasma desktop with Wayland backend.

Does it work with ffplay when forcing the v4l2 codec?

$ ffplay -codec:v h264_v4l2m2m bbb_sunflower_2160p_30fps_normal.mp4
ffplay version 4.4 Copyright (c) 2003-2021 the FFmpeg developers
  built with gcc 11 (GCC)
  configuration: --prefix=/usr --bindir=/usr/bin --datadir=/usr/share/ffmpeg --docdir=/usr/share/doc/ffmpeg --incdir=/usr/include/ffmpeg --libdir=/usr/lib64 --mandir=/usr/share/man --arch=aarch64 --optflags='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -mbranch-protection=standard -fasynchronous-unwind-tables -fstack-clash-protection' --extra-ldflags='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld ' --extra-cflags=' -I/usr/include/rav1e' --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libvo-amrwbenc --enable-version3 --enable-bzlib --disable-crystalhd --enable-fontconfig --enable-frei0r --enable-gcrypt --enable-gnutls --enable-ladspa --enable-libaom --enable-libdav1d --enable-libass --enable-libbluray --enable-libcdio --enable-libdrm --enable-libjack --enable-libfreetype --enable-libfribidi --enable-libgsm --enable-libmp3lame --enable-libmysofa --enable-nvenc --enable-openal --enable-opencl --enable-opengl --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librav1e --enable-libsmbclient --enable-version3 --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libssh --enable-libtheora --enable-libvorbis --enable-libudev --enable-v4l2-request --enable-libv4l2 --enable-libvidstab --enable-libvpx --enable-vulkan --enable-libglslang --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxvid --enable-libxml2 --enable-libzimg --enable-libzvbi --enable-lv2 --enable-avfilter --enable-avresample --enable-libmodplug --enable-postproc --enable-pthreads --disable-static --enable-shared --enable-gpl --disable-debug --disable-stripping --shlibdir=/usr/lib64 --enable-lto
  libavutil      56. 70.100 / 56. 70.100
  libavcodec     58.134.100 / 58.134.100
  libavformat    58. 76.100 / 58. 76.100
  libavdevice    58. 13.100 / 58. 13.100
  libavfilter     7.110.100 /  7.110.100
  libavresample   4.  0.  0 /  4.  0.  0
  libswscale      5.  9.100 /  5.  9.100
  libswresample   3.  9.100 /  3.  9.100
  libpostproc    55.  9.100 / 55.  9.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'bbb_sunflower_2160p_30fps_normal.mp4':
  Metadata:
    major_brand     : isom
    minor_version   : 1
    compatible_brands: isomavc1
    creation_time   : 2013-12-18T14:43:04.000000Z
    title           : Big Buck Bunny, Sunflower version
    artist          : Blender Foundation 2008, Janus Bager Kristensen 2013
    comment         : Creative Commons Attribution 3.0 - http://bbb3d.renderfarming.net
    genre           : Animation
    composer        : Sacha Goedegebure
  Duration: 00:10:34.53, start: 0.000000, bitrate: 7980 kb/s
  Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 3840x2160 [SAR 1:1 DAR 16:9], 7498 kb/s, 30 fps, 30 tbr, 30k tbn, 60 tbc
 (default)
    Metadata:
      creation_time   : 2013-12-18T14:43:04.000000Z
      handler_name    : GPAC ISO Video Handler
      vendor_id       : [0][0][0][0]
  Stream #0:1(und): Audio: mp3 (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 160 kb/s (default)
    Metadata:
      creation_time   : 2013-12-18T14:43:06.000000Z
      handler_name    : GPAC ISO Audio Handler
      vendor_id       : [0][0][0][0]
  Stream #0:2(und): Audio: ac3 (ac-3 / 0x332D6361), 48000 Hz, 5.1(side), fltp, 320 kb/s (default)
    Metadata:
      creation_time   : 2013-12-18T14:43:06.000000Z
      handler_name    : GPAC ISO Audio Handler
      vendor_id       : [0][0][0][0]
    Side data:
      audio service type: main
[h264_v4l2m2m @ 0xffff64030300] Could not find a valid device
[h264_v4l2m2m @ 0xffff64030300] can't configure decoder
  44.06 M-A:  0.000 fd=   0 aq=   20KB vq=    0KB sq=    0B f=0/0
$

Do you use h264 or other codecs, too?

I have the three I've included in my patch.

usual-user • Jan 03 '22 05:01

Thanks. About the screenshot - I meant the full Information panel with the video information, but I can see the problem in the ffplay output (Could not find a valid device).

It looks like ffmpeg can't find your hardware video decoder...

I have the three I've included in my patch.

Yes, but did you test video files with other codecs, too? Or only the h264 file bbb_sunflower_2160p_30fps_normal.mp4?

On plasma desktop with Wayland backend.

Vulkan or OpenGL?

In QMPlay2 (not glxinfo), scroll down the text in the information panel. As an example, on my x86-64 PC I have:

Screenshot_20220103_112439

zaps166 • Jan 03 '22 10:01

It looks like ffmpeg can't find your hardware video decoder...

But mpv can, and it works flawlessly for me. I see that an mpv package is available; maybe you should also make some attempts with it on your platform.

Yes, but did you test video files with other codecs, too?

Yes, several encoded formats:

Title: Big Buck Bunny, Sunflower version
Artist: Blender Foundation 2008, Janus Bager Kristensen 2013
Genre: Animation
Comment: Creative Commons Attribution 3.0 - http://bbb3d.renderfarming.net

File path: /home/plasma/workbench/video/
File name: bbb_sunflower_1080p_60fps_normal.mp4
Bitrate: 4486kbps
Format: mov,mp4,m4a,3gp,3g2,mj2
Video streams:
Stream 1
codec: h264_v4l2m2m
size: 1920x1080
aspect ratio: 1.77778
FPS: 60
bitrate: 4001kbps
format: yuv420p
Audio streams:
Stream 1 - FFmpeg, PipeWire
codec: mp3float
sample rate: 48000Hz
channels: stereo
bitrate: 160kbps
format: fltp
Stream 2
codec: ac3
sample rate: 48000Hz
channels: 6
bitrate: 320kbps
format: fltp

Playback starts, no video rendering.

File path: /home/plasma/workbench/video/
File name: bbb-1280x720-cfg02-hevc.mkv
Bitrate: 1665kbps
Format: matroska,webm
Video streams:
Stream 1 - FFmpeg, OpenGL 3.1 (render-to-texture)
codec: hevc
size: 1280x720
aspect ratio: 1.77778
FPS: 60
format: yuv420p
language: english
Audio streams:
Stream 1 - FFmpeg, PipeWire
codec: aac
sample rate: 48000Hz
channels: 6
format: fltp

Playback starts, video renders (software decoder).

Title: Big Buck Bunny, Sunflower version

File path: /home/plasma/workbench/video/
File name: Big_Buck_Bunny_4K.webm.1080p.vp9.webm
Bitrate: 3787kbps
Format: matroska,webm
Video streams:
Stream 1
codec: vp9_v4l2m2m
size: 1920x1080
aspect ratio: 1.77778
FPS: 60
format: yuv420p
Audio streams:
Stream 1 - FFmpeg, PipeWire
codec: opus
sample rate: 48000Hz
channels: 6
format: fltp

Playback starts, no video rendering.

File path: /home/plasma/workbench/video/
File name: Big_Buck_Bunny_1080_10s_30MB.vp8.webm

Playback does not start.

Vulkan or OpenGL? In QMPlay2 (not glxinfo)

Oh, I misunderstood - I thought you wanted to know what graphics stack my platform offers. Video decoding via the GPU interface is exactly what is avoided in the SOC world. VA-API, VDPAU, CUVID and NVDEC provide acceleration via the GPU interface - the usual concept in the x86-64 PC world, but usually not available in the SOC world. SOCs have standalone VPUs for video decoding with dedicated interfaces, hence my confusion.

usual-user • Jan 03 '22 20:01