AMDColorTweaks
Applying an ICC profile forces sRGB TRC
Whichever ICC profile I apply with AMD Color Tweaks, the TRC is always forced to sRGB, even when a different tone curve (e.g. gamma 2.2) was selected while creating the ICC in DisplayCAL.
I've attached an example screenshot: this is a measurement of an ICC profile with a gamma 2.2 tone curve applied via AMD Color Tweaks. The measurement clearly indicates an sRGB tone curve instead of the intended 2.2 curve.
If possible, it would be desirable for AMD Color Tweaks to either respect the TRC set by the ICC profile, or provide preset configurations for applying TRCs other than sRGB (such as gamma 2.2 or 2.4), similar to what the novideo_srgb program provides.
@ohayoubaka, have you managed to figure that out?
I'm experiencing a similar-ish problem: it looks like gamma is being applied inverted, "upside down". I created 4 synthetic sRGB ICC profiles using DisplayCAL (default settings except the TRC/tone curve): one with the sRGB TRC, and one each with gamma 2.2, 2.4, and 2.6 TRCs. When I apply the ICC with the sRGB TRC via AMDColorTweaks, the display output is clamped to sRGB successfully. However, when I apply the gamma 2.2 TRC profile, the display output becomes lighter overall; the gamma 2.4 profile gives an even lighter output, and the gamma 2.6 profile lighter still. AFAIK it should do the opposite: darker shades should become darker and contrast should increase the higher the gamma.
So is AMDColorTweaks broken, or am I doing something wrong? I use AMD Software: Adrenalin Edition 23.9.1.
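The "lighter with higher gamma" symptom matches a curve being applied in the encode direction rather than the decode direction. A minimal sketch (my own illustration, not AMDColorTweaks code) of the two directions on mid-gray:

```python
# Illustration: a gamma curve applied in the decode direction darkens
# midtones, while its inverse (the encode direction) lightens them --
# matching the "lighter output" symptom described above.
mid_gray = 0.5

decoded = mid_gray ** 2.2        # expected behaviour: darker midtones
encoded = mid_gray ** (1 / 2.2)  # inverted behaviour: lighter midtones

print(round(decoded, 3), round(encoded, 3))  # prints: 0.218 0.73
```

So a pipeline that should raise images to the 2.2 power but instead applies the 1/2.2 inverse would produce exactly the progressively lighter output reported here.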
Nope, I switched to dwm_lut (the updated fork version) instead to get a gamma 2.2 TRC. It does come with a slight performance penalty if you do anything GPU-intensive, but I guess that's the best option for now...
Thank you for the prompt response, @ohayoubaka, I appreciate it. I spent the whole weekend (3 days) trying to get AMDColorTweaks to work before asking for help. I tried rolling back the drivers as far as AMD Software: Adrenalin Edition 22.5.2, and it didn't work at all.
dwm_lut appears to run correctly on my system. Since I don't have a colorimeter, may I ask you to verify my 3D LUT creation process described below? There's not much guidance out there, and I believe some other dwm_lut users may find it helpful as well.
I've got a wide-gamut display (75% BT.2020, 99.88% DCI-P3, 100% sRGB coverage) with hardware gamma 2.2. I used DisplayCAL → File → Create profile from EDID… to get the EDID ICC profile (screenshot). I also have a calibrated Display P3 color space profile of my laptop that I could use (but I don't have a colorimeter). I'd like to clamp my display to 2 separate modes via 3D LUT: sRGB and Display P3.
This is what I do to generate a 3D LUT using displaycal-3dlut-maker.exe to clamp the display to the sRGB or Display P3 color space:
- Source profile: sRGB IEC61966-2.1 or DisplayP3 color profile (from drop-down menu)
- Tone curve: Unmodified
- Abstract ("Look") profile: Unchecked
- Destination profile: <Select display's EDID profile ICC created with DisplayCAL (or a display's calibrated profile ICC)>
- Gamut mapping mode: Inverse device-to-PCS
- Rendering intent: Relative colorimetric (to preserve display's white point)
- 3D LUT file format: IRIDAS (.cube)
- Input encoding: Full range RGB 0-255
- Output encoding: Full range RGB 0-255
- 3D LUT resolution: 65×65×65
I then disable Custom Color in the AMD Software control panel and apply the 3D LUT via dwm_lut. The result passes the sRGB vs gamma 2.2 vs gamma 2.4 test posted at the ACES forum and another test posted at Shadertoy. The difference in image output appears correct to me.
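For anyone following along, the resulting .cube file can be sanity-checked with a few lines of script. The parser below is my own sketch of the IRIDAS format (a `LUT_3D_SIZE` header keyword followed by size³ RGB rows); it is not part of DisplayCAL or dwm_lut:

```python
# Hypothetical sanity check for an IRIDAS .cube 3D LUT, such as one
# written by displaycal-3dlut-maker.exe. Verifies that the declared
# LUT_3D_SIZE matches the number of RGB data rows in the file.
def check_cube(path):
    size = None
    rows = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue  # skip blanks and comments
            if line.startswith("LUT_3D_SIZE"):
                size = int(line.split()[1])
            elif line[0].isdigit() or line[0] in "+-.":
                rows.append(tuple(float(v) for v in line.split()))
    if size is None:
        raise ValueError("missing LUT_3D_SIZE header")
    if len(rows) != size ** 3:
        raise ValueError(f"expected {size ** 3} rows, found {len(rows)}")
    return size, rows
```

For a 65×65×65 LUT this should report 274625 rows; a mismatch usually means a truncated export.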
It is also possible to preserve the display's EDID white point by doing the following (sources 1, 2, 3).
- Create a synthetic profile with the "correct" white point using displaycal-synthprofile.exe with the following input:
- Preset: sRGB (for sRGB color space clamp) / DCI P3 D65 (for Display P3 color space clamp)
- Change White point XYZ to your display's values (drag'n'drop your display's ICC file on displaycal-profile-info.exe and look for Media white point → Illuminant-relative XYZ values)
- Press Chromatic adaptation → Bradford ICC → Apply
- White level: 80 cd/m²
- Black level: 0 cd/m²
- Tone curve: sRGB
- Profile class: Display device profile
- Technology: Unspecified
- Colorimetric image state: Unspecified
- Then use this new synthetic ICC as the Source profile in displaycal-3dlut-maker.exe
- Change the Rendering intent to Absolute colorimetric with white point scaling in displaycal-3dlut-maker.exe before creating the 3D LUT (all other settings should be left as stated above)
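As a cross-check of the white-point step, the Bradford chromatic adaptation behind DisplayCAL's "Chromatic adaptation → Bradford ICC" button can be reproduced in a few lines of numpy. The destination white below is an illustrative value, not my display's real one; substitute the Illuminant-relative XYZ shown by displaycal-profile-info.exe:

```python
import numpy as np

# Standard Bradford cone-response matrix.
BRADFORD = np.array([[ 0.8951,  0.2664, -0.1614],
                     [-0.7502,  1.7135,  0.0367],
                     [ 0.0389, -0.0685,  1.0296]])

def bradford_adaptation(src_white_xyz, dst_white_xyz):
    """3x3 matrix adapting XYZ colors from src white to dst white."""
    src_lms = BRADFORD @ np.asarray(src_white_xyz)
    dst_lms = BRADFORD @ np.asarray(dst_white_xyz)
    scale = np.diag(dst_lms / src_lms)  # per-cone von Kries scaling
    return np.linalg.inv(BRADFORD) @ scale @ BRADFORD

d65 = [0.9505, 1.0000, 1.0891]            # sRGB/D65 white
display_white = [0.9642, 1.0000, 0.8249]  # illustrative value only

M = bradford_adaptation(d65, display_white)
# By construction, the adapted D65 white lands exactly on the display white:
print(M @ np.asarray(d65))
```

This is the standard von Kries-style adaptation in Bradford cone space; the synthetic-profile route described above effectively bakes this mapping into the profile's white point.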
P.S. It's worth mentioning that, according to my research, AMD Software's sRGB emulation passes neither of the tests linked above: the actual display output follows gamma 2.2 instead of sRGB. (On the same machine, both tests pass with dwm_lut and an sRGB-clamp 3D LUT, and they also pass for gamma 2.2 and 2.4 when the appropriate Gamma value is specified at 3D LUT creation.)
Similar to the MHC2 pipeline^1, the source transfer is fixed to sRGB and is not programmable by the user. Applications should output with the sRGB transfer function for accurate gamut mapping.
@dantmnf, I have some suspicions about the source function being true sRGB. According to the tests linked in my previous post (the sRGB vs gamma 2.2 vs gamma 2.4 test posted at the ACES forum and another test posted at Shadertoy), gamma 2.2 (the hardware gamma of my display as specified in its EDID) is what gets output when an sRGB-clamp ICC is applied via AMDColorTweaks, or, for that matter, when an MHC2 CSC sRGB-clamp profile is applied via Advanced display properties. The output of AMDColorTweaks and MHC2 CSC is similar (and similar to sRGB emulation via the AMD Software control panel), but it differs from the actual sRGB output produced by dwm_lut, which passes the tests (with a 3D LUT generated via the high-precision method involving a synthetic profile, as described in my previous post). Moreover, the AMDColorTweaks output matches the dwm_lut output when the gamma 2.2 tone curve is used for 3D LUT generation.
Would you kindly double-check whether you can pass the tests with AMDColorTweaks? I looked through the AMDColorTweaks code and noticed it uses the driver as the source of primaries, white points, etc., so I suspect there may be some error on AMD's end, since on my machine sRGB emulation via the AMD Software control panel doesn't pass the tests either (it outputs gamma 2.2).
P.P.S. In case you'd like to double-check the result via the AMD Software control panel and your Custom Color and Color Temperature Control stop working after running AMDColorTweaks, please see this message for a fix that requires no driver reinstall/reset.
In my previous tests, both AMDColorTweaks and the MHC CSC profile gave excellent results in sRGB proofing verification. As per my previous comment, you should set an sRGB profile (both gamut and transfer) as the verification target.
With a 3D LUT or novideo_srgb, you can reinterpret the input as gamma 2.2. But this is not the case with AMDColorTweaks and the MHC pipeline, in which content is degammaed with the sRGB transfer function before we can touch it.
It is also theoretically possible to scale the output to gamma 2.2, at the cost of some acceptable errors when proofing the sRGB gamut on a P3 display (https://github.com/dantmnf/MHC2/pull/16#issuecomment-1922361191). But keep in mind that the error grows as the display gamut widens.
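To give a sense of the scale of that mismatch, here is a rough numpy sketch (my illustration, not MHC2 code) of how far the piecewise sRGB EOTF sits from a pure 2.2 power curve in linear light:

```python
import numpy as np

# Difference between the piecewise sRGB EOTF and a pure gamma 2.2 curve,
# evaluated in linear light. The gap is largest in the shadows/midtones,
# which is exactly the region the sRGB-vs-2.2 visual tests exploit.
def srgb_eotf(v):
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

v = np.linspace(0.0, 1.0, 1024)
diff = np.abs(srgb_eotf(v) - v ** 2.2)
print(f"max |sRGB - 2.2| in linear light: {diff.max():.4f}")
```

The curves agree at black and white but diverge in between, so scaling one to the other necessarily introduces the residual errors mentioned above.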
@dantmnf, thank you for the prompt response! I'm sorry, you're correct: I figured out that I can actually pass the sRGB test from the ACES forum using AMDColorTweaks if I force my display's ICC white point to be the same for both Source and Destination (this roughly corresponds to the 3D LUT creation process I described earlier). Unfortunately I don't have a colorimeter for verification. Would you kindly tell if this is the intended way to set a wide color gamut display clamp to the Display P3 color space?
I use a similar approach for clamping to the sRGB color space (except I select the Source Primaries via the drop-down). Please note that Color LCD.icc is my display's profile, which has a unique white point, and for the Destination I simply loaded the same Color LCD.icc. I used DisplayCAL → File → Create profile from EDID… to get the EDID ICC profile (screenshot). I also have a calibrated profile (its white point is very close to my factory EDID numbers), but I do my tests with the ICC generated by DisplayCAL from EDID to keep it simple (luckily, the factory calibration on my display is on point).
P.S. As for the Shadertoy sRGB test, which can't be passed with the AMDColorTweaks setup shown in the screenshot: it's not a big deal, because the test itself warns about possible incompatibilities, and I can't pass it with AMD Software sRGB emulation anyway.
Would you kindly tell if this is the intended way to set wide color gamut display clamp to Display P3 color space
For completeness, you may also need an ICC profile with such primaries and sRGB TRCs to hint ICC-aware applications.
As it goes to Shadertoy sRGB test which can't be passed with AMDColorTweaks setup presented on screenshot
This test uses colored halftone; it is unlikely to pass without a colorimeter, especially when the screen has aged and drifted from factory calibration.
@dantmnf, thank you so much for taking the time to examine the setup and explain.
Similar to the MHC2 pipeline^1, the source transfer is fixed to sRGB and is not programmable by the user. Applications should output with the sRGB transfer function for accurate gamut mapping.
According to this and to your response in the MHC2 issue you linked, it sounds like there is no real purpose (except testing and research) in editing the Transfer of the Destination in AMDColorTweaks if the general goal is just to clamp the gamut output to a different color space while keeping the TRC as sRGB (it can't be changed because of the Windows CMM limitation, just as with MHC2). But what about "reverting" the Source transfer from sRGB back to the display's native hardware transfer (e.g. gamma 2.2, per my display's EDID and most consumer-grade devices) by entering the sRGB TRC values in the Parametric EOTF dialog box (or loading an sRGB LUT CSV in the Channels LUT dialog box)?
See, I can pass both the ACES (grayscale) and Shadertoy (color) tests as gamma 2.2 output if I set up AMDColorTweaks for an sRGB clamp via my display's ICC and the Sources drop-down, and then input A0, A1, A2, A3, g as 0.0031308, 12.92, 0.055, 0.055, 2.4 (the sRGB EOTF values; I can also load them from a CSV generated by AMDColorTweaks from the official sRGB2014.icc). I can also see that darker shades become darker. Both tests (grayscale and color) verify as gamma 2.2, 100% pixel-perfect, and the result is identical to the visual output in the "pure display gamma default mode" (which is activated by hitting Reset in AMDColorTweaks). This observation, and the "inconsistency" of the output described in detail in my earlier response, roughly corresponds to this figure (source).
I understand that the math may be much more complicated, and that calculating a correct correction curve from the sRGB Source back to gamma 2.2 in the Destination is probably harder than just applying an sRGB LUT (or the sRGB curve's A0–g values). But since the result passes the gamma 2.2 color test and is visually indistinguishable from the "reset" state, it may be true that you only need to apply the sRGB LUT or curve to "cancel" the sRGB EOTF back to the display's gamma 2.2! Do you happen to know whether applying the sRGB parametric EOTF A0–g values (or loading the sRGB LUT) in the Destination → Transfer → Edit dialog, according to the code, the API, or the sRGB EOTF itself, simply takes away ("reverts") the fixed sRGB TRC of the Source, essentially allowing the display to run at its native, untouched hardware EOTF (as specified in the EDID)?
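A quick numpy check of the curve math behind this hypothesis (this only shows that the sRGB encode exactly cancels the sRGB decode; it says nothing about what AMDColorTweaks actually does internally, which is the open question here):

```python
import numpy as np

# If a pipeline first decodes with the sRGB EOTF and then applies the
# sRGB OETF (the A0..g parametric values above) as the output transfer,
# the two cancel and the signal reaches the panel unchanged, leaving the
# panel's native hardware EOTF in effect.
def srgb_eotf(v):
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def srgb_oetf(x):
    return np.where(x <= 0.0031308, x * 12.92, 1.055 * x ** (1 / 2.4) - 0.055)

v = np.linspace(0.0, 1.0, 4096)
roundtrip = srgb_oetf(srgb_eotf(v))
print(f"max round-trip error: {np.abs(roundtrip - v).max():.2e}")
```

The residual error is at the level of floating-point noise plus the tiny branch-threshold mismatch in the published sRGB constants, i.e. visually nil.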
P.S. I believe my initial confusion may be due to mixing up the ICC TRC with the display's hardware EOTF (it seems to be a common confusion when it comes to sRGB; even the ACES forum folks struggle with it).
This test uses colored halftone; it is unlikely to pass without a colorimeter, especially when the screen has aged and drifted from factory calibration.
Upon further investigation, I came to the conclusion that the reason for failing Shadertoy's sRGB vs gamma 2.2 test with the AMDColorTweaks and AMD Software sRGB clamps on a WCG display is not related to any of that. See, the very same display (2 years old, verified 99.9% against factory calibration) passes the test as sRGB when dwm_lut with an sRGB-clamp 3D LUT is applied to the display's unrestricted wide color gamut mode. I'm absolutely sure you can't pass the test with AMDColorTweaks, AMD Software sRGB emulation, or an MHC2 ICC profile (made with your tool), because they all use a 1D LUT (or maybe a 3×1D LUT at best). Since a 1D LUT is grayscale-only, you need a 3D LUT to pass a color sRGB test and fully comply with sRGB emulation. Here's a brief explanation by Florian Höch (the DisplayCAL developer) that makes perfect sense.
I'm really looking forward to the 3D LUT API implementation you teased in issue #9! This will be huge!
To clarify: I CAN pass this colored halftone test with an MHC sRGB proofing profile created from a freshly calibrated DisplayCAL profile.
As of the current AMDColorTweaks, there is another layer of precision issues: the channel LUT input is 8-bit linear RGB, which significantly reduces the number of available output steps:
In [10]: import numpy as np
    ...: def srgb_oetf(x):
    ...:     return np.where(x <= 0.0031308, x * 12.92, 1.055 * (x ** (1 / 2.4)) - 0.055)
    ...:

In [11]: np.round(srgb_oetf(np.arange(256) / 255.0) * 255.0)
Out[11]:
array([ 0., 13., 22., 28., 34., 38., 42., 46., 50., 53., 56.,
59., 61., 64., 66., 69., 71., 73., 75., 77., 79., 81.,
83., 85., 86., 88., 90., 92., 93., 95., 96., 98., 99.,
101., 102., 104., 105., 106., 108., 109., 110., 112., 113., 114.,
115., 117., 118., 119., 120., 121., 122., 124., 125., 126., 127.,
128., 129., 130., 131., 132., 133., 134., 135., 136., 137., 138.,
139., 140., 141., 142., 143., 144., 145., 146., 147., 148., 148.,
149., 150., 151., 152., 153., 154., 155., 155., 156., 157., 158.,
159., 159., 160., 161., 162., 163., 163., 164., 165., 166., 167.,
167., 168., 169., 170., 170., 171., 172., 173., 173., 174., 175.,
175., 176., 177., 178., 178., 179., 180., 180., 181., 182., 182.,
183., 184., 185., 185., 186., 187., 187., 188., 189., 189., 190.,
190., 191., 192., 192., 193., 194., 194., 195., 196., 196., 197.,
197., 198., 199., 199., 200., 200., 201., 202., 202., 203., 203.,
204., 205., 205., 206., 206., 207., 208., 208., 209., 209., 210.,
210., 211., 212., 212., 213., 213., 214., 214., 215., 215., 216.,
216., 217., 218., 218., 219., 219., 220., 220., 221., 221., 222.,
222., 223., 223., 224., 224., 225., 226., 226., 227., 227., 228.,
228., 229., 229., 230., 230., 231., 231., 232., 232., 233., 233.,
234., 234., 235., 235., 236., 236., 237., 237., 238., 238., 238.,
239., 239., 240., 240., 241., 241., 242., 242., 243., 243., 244.,
244., 245., 245., 246., 246., 246., 247., 247., 248., 248., 249.,
249., 250., 250., 251., 251., 251., 252., 252., 253., 253., 254.,
254., 255., 255.])
In [12]: len(set(_))
Out[12]: 183
The LUT may be interpolated, but I don't think it is, judging by a comparison of the outputs with the MHC pipeline.
TLDR: it all comes down to a precision issue; dwm_lut has better processing precision and output dithering.
AMD Software sRGB emulation or MHC2 ICC profile (made with your tool) because they all are using 1D LUT (or maybe 3×1D LUT at best). Since 1D LUT is grayscale only, you totally need 3D LUT to pass color sRGB test and comply to sRGB emulation
They all use a matrix transform in linear space for the volumetric conversion. Assuming the three output channels are independent of each other (which is not always true), and given programmable pre- and post-shapers, it is mathematically equivalent to a simple gamut-mapping (clipping) 3D LUT.
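A numpy sketch of that pipeline for sRGB content on a Display P3 panel (my illustration of the equivalence, not the driver's code; the sRGB→Display-P3 matrix is the commonly published one, rounded):

```python
import numpy as np

# Matrix-in-linear-space gamut clamp: decode with the source transfer
# (pre-shaper), apply a 3x3 linear-RGB matrix, clip out-of-range values,
# and re-encode (post-shaper).
def srgb_eotf(v):
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def srgb_oetf(x):
    return np.where(x <= 0.0031308, x * 12.92, 1.055 * x ** (1 / 2.4) - 0.055)

# Commonly published linear sRGB -> linear Display P3 matrix (rounded).
SRGB_TO_P3 = np.array([[0.8225, 0.1774, 0.0000],
                       [0.0332, 0.9669, 0.0000],
                       [0.0171, 0.0724, 0.9108]])

def clamp_srgb_on_p3(rgb):
    linear = srgb_eotf(np.asarray(rgb, dtype=float))  # pre-shaper
    p3 = np.clip(SRGB_TO_P3 @ linear, 0.0, 1.0)       # matrix + clip
    return srgb_oetf(p3)                              # post-shaper

# Pure sRGB red is reproduced as a less saturated P3 red:
print(clamp_srgb_on_p3([1.0, 0.0, 0.0]))
```

Because every in-gamut sRGB color lands inside [0, 1] after the matrix, the clip never engages for sRGB content, and the result matches what a clipping 3D LUT of the same mapping would produce, per-channel quantization aside.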
allowing the display to run at its native, untouched hardware EOTF (as specified in the EDID)?
No, unless your hardware EOTF is linear (gamma 1.0).