drm: add --drm-send-hdr-meta option to send HDR metadata
Based on the patch by medvedvvs. Changes to medvedvvs' patch:
- flattened to a single commit;
- splice out the libdrm struct definitions to `osdep/drm-hdr-metadata.h` and make them conditional on the version of libdrm;
- calculate the peak level correctly;
- round and clamp the various metadata values properly (see the sketch below this list);
- switch back to SDR on exit;
- add a man page entry;
- remove a potential resource leak in the error path;
- fix indentation.
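To make the "round and clamp" item concrete: the values end up in libdrm's `struct hdr_output_metadata` / `struct hdr_metadata_infoframe`, whose CTA-861-G fields use fixed units (chromaticities in 0.00002 steps, max mastering luminance / MaxCLL / MaxFALL in 1 cd/m², min mastering luminance in 0.0001 cd/m²). The sketch below is illustrative only; the helper names are invented and this is not the code in the patch.

```c
/* Illustrative only -- not the patch's actual code. Needs the libdrm headers
 * (cc $(pkg-config --cflags libdrm) pack.c -lm). */
#include <drm_mode.h>   /* struct hdr_output_metadata / hdr_metadata_infoframe */
#include <math.h>
#include <stdint.h>

/* Clamp to [lo, hi] and round to the nearest integer. */
static uint16_t pack_u16(double v, double lo, double hi)
{
    if (v < lo) v = lo;
    if (v > hi) v = hi;
    return (uint16_t)lround(v);
}

/* Hypothetical helper: fill the CTA-861-G "static metadata type 1" block.
 * Chromaticities are CIE xy in [0, 1], luminances in cd/m^2. */
static void fill_hdr_meta(struct hdr_output_metadata *m,
                          const double prim_xy[3][2], const double white_xy[2],
                          double max_lum, double min_lum,
                          double max_cll, double max_fall)
{
    struct hdr_metadata_infoframe *inf = &m->hdmi_metadata_type1;
    m->metadata_type = 0;              /* static metadata type 1 */
    inf->metadata_type = 0;
    inf->eotf = 2;                     /* SMPTE ST 2084 (PQ) */
    for (int i = 0; i < 3; i++) {
        inf->display_primaries[i].x = pack_u16(prim_xy[i][0] * 50000, 0, 50000);
        inf->display_primaries[i].y = pack_u16(prim_xy[i][1] * 50000, 0, 50000);
    }
    inf->white_point.x = pack_u16(white_xy[0] * 50000, 0, 50000);
    inf->white_point.y = pack_u16(white_xy[1] * 50000, 0, 50000);
    inf->max_display_mastering_luminance = pack_u16(max_lum, 1, 65535);          /* 1 cd/m^2 units */
    inf->min_display_mastering_luminance = pack_u16(min_lum * 10000, 1, 65535);  /* 0.0001 cd/m^2 units */
    inf->max_cll  = pack_u16(max_cll, 0, 65535);
    inf->max_fall = pack_u16(max_fall, 0, 65535);
}
```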
Thanks for doing this. I can't comment about any of the particulars of HDR, but the patch does appear to work on my setup (kernel 5.11.6 and amdgpu). Just a couple of nitpicks:
- The commit message refers to "HDMI" as well as several places in the code/manual entry, but this doesn't have anything to do with HDMI in particular. My setup is displayport and of course it works here too.
- The `drm-send-hdr-meta` option defaults to `no`, not `auto`. That's kind of unintuitive and defeats the purpose of having an `auto` option in the first place.
* The commit message refers to "HDMI" as well as several places in the code/manual entry, but this doesn't have anything to do with HDMI in particular. My setup is displayport and of course it works here too.
The word "HDMI" is frequently used in both the original patch and the relevant libdrm code, so I thought it was something HDMI specific. Happy to hear that it works over DP as well. I've removed mentions of HDMI from the manpage entry and the git commit message.
* The `drm-send-hdr-meta` option defaults to `no`, not `auto`. That's kind of unintuitive and defeats the purpose of having an `auto` option in the first place.
I took the auto to refer to the EOTF selection more than that it enables HDR (though in reality it does both). If people feel confident about enabling this by default I'll happily change it. I expect it won't impact too many people since this only affects the DRM gpu-context.
It seems to enable and disable HDR correctly on my LG C8. However:
- ~~The metadata sent isn't actually changing how the TV receives it, the TV seems to use the same tonemapping curves regardless of the source metadata.~~
- It seems to rely on the `--target-peak` value instead of actual file metadata, which makes sense. But how do we "passthrough" then? Without tonemapping at all?
- The levels are always sent as "limited" even with `--video-output-levels=full`. And when passing limited, the levels are processed twice, making them wrong.
- The very first playback (after boot) doesn't enable HDR without restarting mpv.
It seems to enable and disable HDR correctly on my LG C8. However:
Ah, you have the same TV as me (I tested this with my OLED55C8PLA). In that case I'm afraid I have bad news for you: the C8 firmware has a bad flaw in the code path where it processes HDR RGB input. It seems to downgrade it to 8-bit YCbCr, which horribly quantizes the chroma. At least that's what I observed. If your experience was different I'd be delighted to learn how to circumvent that.
* ~~The metadata sent isn't actually changing how the TV receives it, the TV seems to use the same tonemapping curves regardless of the source metadata.~~
This is a limitation of the current implementation. It converts to the selected EOTF but with preset parameters. The file metadata is presumably used in that conversion but what is sent to the TV is always the same curve.
* It seems to rely on the `--target-peak` value instead of actual file metadata, which makes sense. But how do we "passthrough" then? Without tonemapping at all?
I'm not entirely sure yet how --target-peak factors into this at all. I got incorrect (crushed) output when I set it to less than or more than 203 so I just figured I had to keep it at 203 in all cases.
* The levels are always sent as "limited" even with `--video-output-levels=full`. And when passing limited, the levels are processed twice making them wrong.
That would be a bug in the aforementioned EOTF conversion. I haven't noticed limited levels, black always seemed black to me.
* The very first playback (after boot) doesn't enable HDR without restarting mpv.
That's a bit surprising since the current implementation adds HDR metadata to every frame. Can you debug?
I'm not entirely sure yet how --target-peak factors into this at all. I got incorrect (crushed) output when I set it to less than or more than 203 so I just figured I had to keep it at 203 in all cases.
In the code that might be sig_peak? I have decent results with this config:
--target-trc=pq --target-peak=870 --video-output-levels=full --target-prim=dci-p3 --hdr-compute-peak=yes --gamut-clipping=no --tone-mapping=mobius --vo=gpu --gpu-api=opengl --gpu-context=drm --drm-send-hdr-meta=auto --drm-connector=HDMI-A-1
However the levels aren't correct without setting the TV to limited, and --hdr-compute-peak is overall very dim compared to what I'd expect, although it does tonemap down all the detail.
Maybe you have the TV set to receive limited already?
The first HDR patch had a similar issue, where the TV would only switch after resetting the input. Before, that was turning it on/off. But now it just doesn't seem to switch at the very first HDR metadata sent. Here's the log for the first run where HDR isn't triggered: hdr.txt And after rerunning the same CLI: hdr2.txt No idea if they have anything useful.
As for RGB output, I haven't looked at mpv to see if it's possible. In Windows, mpv also has horrible banding with HDR.
In the logs, I can notice:
[out] 3840x2160 yuv420p10 bt.2020-ncl/bt.2020/pq/limited/display SP=49.261086 CL=mpeg2/4/h264
Which makes it sound like the levels aren't being changed at all.
I'm not entirely sure yet how --target-peak factors into this at all. I got incorrect (crushed) output when I set it to less than or more than 203 so I just figured I had to keep it at 203 in all cases.
In the code that might be `sig_peak`? I have decent results with this config:
`sig_peak` seems to be the peak that mpv converts to when it converts to pq/hlg.
--target-trc=pq --target-peak=870 --video-output-levels=full --target-prim=dci-p3 --hdr-compute-peak=yes --gamut-clipping=no --tone-mapping=mobius --vo=gpu --gpu-api=opengl --gpu-context=drm --drm-send-hdr-meta=auto --drm-connector=HDMI-A-1
However the levels aren't correct without setting the TV to limited, and `--hdr-compute-peak` is overall very dim compared to what I'd expect, although it does tonemap down all the detail.
Ah, I never use --hdr-compute-peak, the effect annoys me and it should not be necessary on an HDR output anyway.
Maybe you have the TV set to receive limited already?
Actually I think you're right, my brain just hadn't linked LG's "Black Level" name with limited RGB. Yes, it's set to receive limited input.
The first HDR patch had a similar issue, where the TV would only switch after resetting the input. Before, that was turning it on/off. But now it just doesn't seem to switch at the very first HDR metadata sent. Here's the log for the first run where HDR isn't triggered: hdr.txt And after rerunning the same CLI: hdr2.txt No idea if they have anything useful.
Those two files seem to be identical except for you hitting the pan-scan key in the second one.
As for RGB output, I haven't looked at mpv to see if it's possible. In Windows, mpv also has horrible banding with HDR.
Not sure what you're wondering is "possible": mpv's drm context can only output RGB (8-bit or 10-bit). Having support for NV30 (10-bit 4:4:4) would be great but is not trivially implemented, I'm afraid.
Are you on IRC? There's a way to get HDR on a C8 without metadata signalling, but that discussion is out of scope for this PR.
So since the log doesn't help, not too sure what's happening with the first HDR signal. I guess otherwise technically it works as a first implementation.
Are you on IRC? There's a way to get HDR on a C8 without metadata signalling, but that discussion is out of scope for this PR.
I'm just trying to help get working HDR on Linux, but for actual watching I simply use madVR in a passthrough VM.
Had a look at the dmesg output...
First playback at boot:
Playback start? No HDR triggered
[Mon Mar 15 12:54:23 2021] HDR SB:41 02 00 d0 84 80 3e c2 33 c4 86 4c 1d b8 0b 54
[Mon Mar 15 12:54:23 2021] HDR SB:3d 8e 44 66 03 00 00 66 03 66 03 00 00 00 00 00
[Mon Mar 15 12:54:23 2021] HDR SB:41 02 00 d0 84 80 3e c2 33 c4 86 4c 1d b8 0b 54
[Mon Mar 15 12:54:23 2021] HDR SB:3d 8e 44 66 03 00 00 66 03 66 03 00 00 00 00 00
Playback stopped after 9s
[Mon Mar 15 12:54:33 2021] HDR SB:43 00 00 d0 84 80 3e c2 33 c4 86 4c 1d b8 0b 54
[Mon Mar 15 12:54:33 2021] HDR SB:3d 8e 44 66 03 00 00 66 03 66 03 00 00 00 00 00
[Mon Mar 15 12:54:33 2021] HDR SB:43 00 00 d0 84 80 3e c2 33 c4 86 4c 1d b8 0b 54
[Mon Mar 15 12:54:33 2021] HDR SB:3d 8e 44 66 03 00 00 66 03 66 03 00 00 00 00 00
Second playback
Playback start?
[Mon Mar 15 12:56:57 2021] HDR SB:43 00 00 d0 84 80 3e c2 33 c4 86 4c 1d b8 0b 54
[Mon Mar 15 12:56:57 2021] HDR SB:3d 8e 44 66 03 00 00 66 03 66 03 00 00 00 00 00
??? HDR Triggered
[Mon Mar 15 12:57:00 2021] HDR SB:41 02 00 d0 84 80 3e c2 33 c4 86 4c 1d b8 0b 54
[Mon Mar 15 12:57:00 2021] HDR SB:3d 8e 44 66 03 00 00 66 03 66 03 00 00 00 00 00
[Mon Mar 15 12:57:00 2021] HDR SB:41 02 00 d0 84 80 3e c2 33 c4 86 4c 1d b8 0b 54
[Mon Mar 15 12:57:00 2021] HDR SB:3d 8e 44 66 03 00 00 66 03 66 03 00 00 00 00 00
Playback stopped after 6s
[Mon Mar 15 12:57:06 2021] HDR SB:43 00 00 d0 84 80 3e c2 33 c4 86 4c 1d b8 0b 54
[Mon Mar 15 12:57:06 2021] HDR SB:3d 8e 44 66 03 00 00 66 03 66 03 00 00 00 00 00
[Mon Mar 15 12:57:06 2021] HDR SB:43 00 00 d0 84 80 3e c2 33 c4 86 4c 1d b8 0b 54
[Mon Mar 15 12:57:06 2021] HDR SB:3d 8e 44 66 03 00 00 66 03 66 03 00 00 00 00 00
Anything meaningful?
Tried testing this out myself with
mpv --no-config --gpu-api=opengl --gpu-context=drm --drm-send-hdr-meta=auto --drm-format=xrgb2101010 --target-trc=pq $file
A couple of observations:
- very, very heavy banding unless `--drm-format` is set to 10-bit; does not seem to occur with `--drm-send-hdr-meta=no` and identical settings
- HDR does indeed "deactivate" on closing the stream, unlike the previous PR
- The quality of the HDR is quite suspect, as highlights blow out any detail in bright areas, but this may be my display being shit; I don't think it's good HDR
- `--hdr-compute-peak` doesn't seem to change anything for me but it looks like that could be because my display sucks
Overall it seems to work well for me, especially as a first cut. I do not have any other HDR displays to test, so I cannot rule out that my issues are caused by my monitor. Even using a Windows machine with HDR enabled it just looks completely oversaturated, but that's also a completely different HDR implementation, so it's not a very scientific test.
Edit: adding --target-peak=203 helps with the blown out highlights and saturation.
How is it going now?
With --target-trc=pq and --drm-send-hdr-meta it makes my TV show its "HDR Enabled" overlay and definitely does something to the display, but I can't see anything interesting in the HDMI diagnostic screen (not that I'd know what exactly to look for anyway).
That said, the colors of the HDR version of Your Name viewed with these flags look a lot more like the SDR version than they do without them (without this patch the HDR version looks noticeably more muted than the SDR version).
So it isn't going to be merged into master branch any time soon?
A test sample where the CLL SEI changes on scene changes, to test the part of the standard that allows the static metadata to change. The MDCV does not change though, at least here.
https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/uploads/16c628c535865d7282a48317064345a2/out.mp4
`sig_peak` seems to be the peak that mpv converts to when it converts to pq/hlg.
Yeah, this is from this commit https://github.com/mpv-player/mpv/commit/ef6bc8504a945eb6492b8ed46fd5a1afaaf32182 and comes from ITU-R report BT.2408. Please note that this only applies when you render SDR on a 1000-nit HDR canvas; for a different canvas (like 400 nits) different values should be used. For SDR the scene-referred standard is 100 nits, but other values like 120, 160 and others are also in use, depending on how "black" black is. See the BT.1886 EOTF. For sRGB it is 80 nits, since it is display-referred, i.e. an EOTF in itself.
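As a concrete illustration of those reference levels (not mpv code): the ST 2084 inverse EOTF below shows where 203, 100 and 80 cd/m² land on the PQ signal, which is essentially what picking a reference white / `sig_peak` decides.

```c
/* Illustrative sketch: SMPTE ST 2084 (PQ) inverse EOTF, absolute luminance
 * in cd/m^2 -> normalized signal in [0, 1]. Constants are the standard
 * m1/m2/c1/c2/c3 values. Build with: cc pq.c -lm */
#include <math.h>
#include <stdio.h>

static double pq_encode(double nits)
{
    const double m1 = 2610.0 / 16384.0;         /* 0.1593... */
    const double m2 = 2523.0 / 4096.0 * 128.0;  /* 78.84375 */
    const double c1 = 3424.0 / 4096.0;          /* 0.8359375 */
    const double c2 = 2413.0 / 4096.0 * 32.0;   /* 18.8515625 */
    const double c3 = 2392.0 / 4096.0 * 32.0;   /* 18.6875 */
    double y = pow(nits / 10000.0, m1);
    return pow((c1 + c2 * y) / (1.0 + c3 * y), m2);
}

int main(void)
{
    /* Roughly: 203 nits -> ~0.58, 100 nits -> ~0.51, 80 nits -> ~0.48 */
    printf("203 cd/m^2 -> %.3f PQ\n", pq_encode(203));  /* BT.2408 HDR reference white */
    printf("100 cd/m^2 -> %.3f PQ\n", pq_encode(100));  /* SDR scene-referred white */
    printf(" 80 cd/m^2 -> %.3f PQ\n", pq_encode(80));   /* sRGB display white */
    return 0;
}
```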
- very, very heavy banding unless `--drm-format` is set to 10-bit.
No surprises here. 8-bit HDR video looks very disgusting, since part of those 8 bits is used for luminance and 8-bit is just added to 10-bit like BT.709 tells you to. Even with DisplayHDR, monitors that use 8-bit panels MUST still accept the HDR10 format, that is, 10-bit; it is then processed on the monitor side and presented on the 8-bit display. That looks good compared to the insane banding of an 8-bit HDR file. See for yourself: https://disk.yandex.ru/i/ZKd0INUrpHtoDg
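To put a number on that banding (again just an illustrative sketch, not mpv code): decoding adjacent code values with the ST 2084 EOTF shows how much coarser the luminance steps are at 8 bit than at 10 bit around typical scene levels.

```c
/* Illustrative sketch: how coarse PQ quantization is at 8 bit vs 10 bit.
 * Decodes adjacent full-range code values with the ST 2084 EOTF and prints
 * the luminance jump between them (limited-range offsets ignored). */
#include <math.h>
#include <stdio.h>

static double pq_decode(double n)  /* normalized signal -> cd/m^2 */
{
    const double m1 = 2610.0 / 16384.0, m2 = 2523.0 / 4096.0 * 128.0;
    const double c1 = 3424.0 / 4096.0, c2 = 2413.0 / 4096.0 * 32.0,
                 c3 = 2392.0 / 4096.0 * 32.0;
    double e = pow(n, 1.0 / m2);
    return 10000.0 * pow(fmax(e - c1, 0.0) / (c2 - c3 * e), 1.0 / m1);
}

static void step_at(int code, int levels)
{
    double lo = pq_decode((double)code / (levels - 1));
    double hi = pq_decode((double)(code + 1) / (levels - 1));
    printf("%2d-bit step near %.0f cd/m^2: +%.1f%%\n",
           levels == 256 ? 8 : 10, lo, 100.0 * (hi / lo - 1.0));
}

int main(void)
{
    step_at(129, 256);   /* around 100 cd/m^2: roughly a 4% jump per 8-bit code */
    step_at(521, 1024);  /* same region: roughly 1% per 10-bit code */
    return 0;
}
```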
Also please note that mpv uses the BT.2390 Hermite spline to present static metadata dynamically, but AMD already uses BT.2390 in its drivers. So this is a problem, of course.
https://patchwork.freedesktop.org/patch/396160/ https://patchwork.freedesktop.org/patch/255949/
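For reference, the BT.2390 EETF knee in question is (roughly) the following Hermite spline on the PQ-normalized signal. This is a sketch of the formula from the report, not mpv's or the driver's actual code; the black-level lift and the PQ (de)normalization steps are omitted.

```c
/* Sketch of the ITU-R BT.2390 EETF "knee" (tone mapping a 0..1 PQ-normalized
 * signal down to a display peak given in the same normalized units). */
double bt2390_eetf(double e, double max_lum /* display peak, PQ-normalized */)
{
    double ks = 1.5 * max_lum - 0.5;     /* knee start */
    if (e < ks)
        return e;                        /* below the knee: pass through */
    double t = (e - ks) / (1.0 - ks);
    double t2 = t * t, t3 = t2 * t;
    /* Hermite spline between the knee start and the display peak */
    return (2.0 * t3 - 3.0 * t2 + 1.0) * ks
         + (t3 - 2.0 * t2 + t) * (1.0 - ks)
         + (-2.0 * t3 + 3.0 * t2) * max_lum;
}
```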
Will it be merged into master branch any time soon?
Why does a BT.709 transfer not turn off peak calculation if HDR10 metadata is present? See the sample here (no, it is NOT PQ and NOT HDR; you can force PQ but it will look wrong; there is an actual PQ sample too):
https://4kmedia.org/sony-camping-in-nature-4k-demo/
Why does a BT.709 transfer not turn off peak calculation if HDR10 metadata is present?
That seems like a bug to me. We should be stripping/overriding sig_peak for SDR signals. (Do we not?)
Edit: we do
Yeah, but warnings are still there.
That has nothing to do with HDR peak calculation. That warning is printed because the file contains broken metadata. It's literally an issue with the file, not mpv.
That has nothing to do with HDR peak calculation.
Okay then. I just thought that those warnings are extra calculations when they are not needed.
With the already noted exception that mpv needs to be run once and then run again before the monitor enters HDR mode, this seems to be working for me with the following options:
--no-config --target-trc=pq --target-peak=400 --video-output-levels=full --target-prim=dci-p3 --hdr-compute-peak=yes --gamut-clipping=no --tone-mapping=mobius --vo=gpu --gpu-api=opengl --gpu-context=drm --drm-send-hdr-meta=auto --drm-connector=HDMI-A-0
I'm on an Arch-based system, Linux 5.10, AMDGPU, connected over HDMI to an Acer XV272U.
Still not in master branch yet?
Hi,
if you're using DRM with Vulkan, libplacebo master now supports sending HDR metadata via Vulkan (on supported drivers). Unfortunately, as far as I can tell, neither ANV nor RADV implement this at all. AMDVLK does, but only for DRM. (As usual, I have no clue about Nvidia.)
It doesn't currently support the equivalent extensions for OpenGL but that support should be easy to add. In either case, with this in mind, it might be a better idea to focus your efforts onto implementing VK_EXT_hdr_metadata in mesa's DRM code, rather than in mpv.
That way it will Just Work(tm), and for more than just mpv.
This will only work if Vulkan or EGL is managing the KMS state, e.g. via VK_KHR_display. When mpv manages the KMS state itself, mpv needs to implement sending the HDR metadata via KMS.
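To illustrate what "sending the HDR metadata via KMS" means when mpv manages the KMS state itself: roughly, the metadata is wrapped in a property blob and attached to the connector's `HDR_OUTPUT_METADATA` property in an atomic commit. The sketch below is against the public libdrm API and is not the code in this PR; it assumes a libdrm new enough to define `struct hdr_output_metadata`, and error handling is abbreviated.

```c
/* Minimal sketch (not the PR's code): attach HDR metadata to a connector
 * via the KMS HDR_OUTPUT_METADATA property. Build with libdrm. */
#include <stdint.h>
#include <string.h>
#include <xf86drm.h>
#include <xf86drmMode.h>
#include <drm_mode.h>   /* struct hdr_output_metadata */

static uint32_t find_connector_prop(int fd, uint32_t connector_id, const char *name)
{
    uint32_t id = 0;
    drmModeObjectProperties *props =
        drmModeObjectGetProperties(fd, connector_id, DRM_MODE_OBJECT_CONNECTOR);
    if (!props)
        return 0;
    for (uint32_t i = 0; i < props->count_props && !id; i++) {
        drmModePropertyRes *p = drmModeGetProperty(fd, props->props[i]);
        if (p && !strcmp(p->name, name))
            id = p->prop_id;
        drmModeFreeProperty(p);
    }
    drmModeFreeObjectProperties(props);
    return id;
}

static int send_hdr_metadata(int fd, uint32_t connector_id,
                             const struct hdr_output_metadata *meta)
{
    uint32_t prop_id = find_connector_prop(fd, connector_id, "HDR_OUTPUT_METADATA");
    if (!prop_id)
        return -1;                 /* driver/connector exposes no HDR support */

    uint32_t blob_id = 0;
    if (drmModeCreatePropertyBlob(fd, meta, sizeof(*meta), &blob_id))
        return -1;

    drmModeAtomicReq *req = drmModeAtomicAlloc();
    if (!req)
        return -1;
    drmModeAtomicAddProperty(req, connector_id, prop_id, blob_id);
    /* In mpv this would ride along with the normal atomic commit; ALLOW_MODESET
     * because some drivers treat an HDR metadata change as a full modeset. */
    int ret = drmModeAtomicCommit(fd, req, DRM_MODE_ATOMIC_ALLOW_MODESET, NULL);
    drmModeAtomicFree(req);
    if (ret)
        drmModeDestroyPropertyBlob(fd, blob_id);
    /* On success, keep blob_id and destroy it when it is replaced or on exit. */
    return ret;
}
```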
Having support for NV30 (10-bit 4:4:4) would be great but is not trivially implemented I'm afraid.
What about P016?
https://patchwork.ffmpeg.org/project/ffmpeg/patch/[email protected]/
https://patchwork.ffmpeg.org/project/ffmpeg/patch/[email protected]/
Of course, outputting 4:2:0 is IMHO a bad idea, since top-left chroma siting cannot be signalled over HDMI, in files the siting may not be top-left, and even worse, BT.2020 and BT.2100 reference devices may not be doing top-left FIR at all!
--target-prim=dci-p3
This is wrong. No one uses dci-p3; it is all display-p3, or actually p3-d65. Wow.
A quick addition to make it work in the meson build.
diff --git a/meson.build b/meson.build
index c39924a1ca..1704653383 100644
--- a/meson.build
+++ b/meson.build
@@ -946,6 +946,14 @@ if drm['use']
'video/out/vo_drm.c')
endif
+drm_hdr = get_option('drm-hdmi-hdr').require(
+ drm['use'] and drm['deps'].version().version_compare('>= 2.4.104'),
+ error_message: 'DRM HDMI HDR requirements not met.'
+)
+if drm_hdr.allowed()
+ features += 'drm-hdmi-hdr'
+endif
+
gbm = dependency('gbm', version: '>=17.1.0', required: get_option('gbm'))
if gbm.found()
dependencies += gbm
@@ -1733,6 +1741,7 @@ conf_data.set10('HAVE_D3D11', d3d11.allowed())
conf_data.set10('HAVE_DIRECT3D', direct3d)
conf_data.set10('HAVE_DOS_PATHS', win32)
conf_data.set10('HAVE_DRM', drm['use'])
+conf_data.set10('HAVE_DRM_HDMI_HDR', drm_hdr.allowed())
conf_data.set10('HAVE_DVBIN', dvbin.allowed())
conf_data.set10('HAVE_DVDNAV', dvdnav.found() and dvdread.found())
conf_data.set10('HAVE_EGL', egl['use'])
diff --git a/meson_options.txt b/meson_options.txt
index 4a755cfdd5..76a6763cc4 100644
--- a/meson_options.txt
+++ b/meson_options.txt
@@ -59,6 +59,7 @@ option('cocoa', type: 'feature', value: 'auto', description: 'Cocoa')
option('d3d11', type: 'feature', value: 'auto', description: 'Direct3D 11 video output')
option('direct3d', type: 'feature', value: 'auto', description: 'Direct3D support')
option('drm', type: 'feature', value: 'auto', description: 'DRM')
+option('drm-hdmi-hdr', type: 'feature', value: 'auto', description: 'DRM HDMI HDR')
option('egl', type: 'feature', value: 'auto', description: 'EGL 1.4')
option('egl-android', type: 'feature', value: 'auto', description: 'Android EGL support')
option('egl-angle', type: 'feature', value: 'auto', description: 'OpenGL ANGLE headers')
Could it be merged, please? I really want to see how it looks.
Bug (probably missing a null check):
[vo/gpu/opengl] Initializing GPU context 'drm'
[vo/gpu] VT_GETMODE failed: Inappropriate ioctl for device
[vo/gpu/opengl] Failed to set up VT switcher. Terminal switching will be unavailable.
[vo/gpu/opengl] Initializing KMS
[vo/gpu/opengl] Driver: panfrost 1.2.0 (20180908)
[vo/gpu/opengl] Cannot retrieve DRM resources: Operation not supported
Thread 12 "mpv/vo" received signal SIGSEGV, Segmentation fault.
[Switching to Thread 0xffffecc7ffc0 (LWP 7891)]
drm_destroy_hdrmeta (ctx=0x0) at ../video/out/drm_atomic.c:573
573 if (ctx->hdr_metadata.blob_id) {
(gdb) bt
#0 drm_destroy_hdrmeta (ctx=0x0) at ../video/out/drm_atomic.c:573
#1 0x0000aaaaaabdbca8 in kms_destroy (kms=0xffffd4002560) at ../video/out/drm_common.c:707
#2 0x0000aaaaaabdc5d0 in kms_destroy (kms=0xffffd4002560) at ../video/out/drm_common.c:705
#3 kms_create (log=<optimized out>, drm_device_path=<optimized out>, connector_spec=<optimized out>, mode_spec=0xaaaaaadf4620 "preferred", draw_plane=-1, drmprime_video_plane=-2,
use_atomic=<optimized out>) at ../video/out/drm_common.c:699
#4 0x0000aaaaaabf3944 in drm_egl_init (ctx=0xffffd4001850) at ../video/out/opengl/context_drm_egl.c:854
mpv should skip any device which doesn't pass drmIsKMS.
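For the crash itself, a guard along these lines in the teardown path, plus skipping device nodes that fail `drmIsKMS()` during probing, would cover both points. This is only a sketch: the struct is a stand-in (the real context lives in mpv's video/out/drm_atomic.c and drm_common.c), and the only field shown, `hdr_metadata.blob_id`, is taken from the backtrace above.

```c
#include <stdbool.h>
#include <stdint.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

/* Stand-in for the real atomic context; only the field from the backtrace. */
struct drm_atomic_context_standin {
    struct { uint32_t blob_id; } hdr_metadata;
    int fd;
};

/* kms_destroy() can be reached with a NULL atomic context when KMS init
 * failed early ("Cannot retrieve DRM resources"), so the destroy helper
 * has to tolerate ctx == NULL. */
static void drm_destroy_hdrmeta_sketch(struct drm_atomic_context_standin *ctx)
{
    if (!ctx)
        return;
    if (ctx->hdr_metadata.blob_id) {
        drmModeDestroyPropertyBlob(ctx->fd, ctx->hdr_metadata.blob_id);
        ctx->hdr_metadata.blob_id = 0;
    }
}

/* During autodetection, device nodes without KMS (like the panfrost node
 * here) can be skipped up front instead of failing later. */
static bool device_is_kms(int fd)
{
    return drmIsKMS(fd);
}
```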
Good hint, thanks. I had been wondering how to get autodetection to work for that since card0 and card1 randomly swap places on this SBC.
Hi, does anybody know if it is going to be ready?
@laichiaheng it will be merged when it's ready, if you want to try it out you can merge the code yourself locally, there's no need to keep commenting on this issue
Is there a tracking issue that is closed by this PR?