An implementation bug in ACM/HDR
The readme says that the Windows SDK headers imply a gamma 2.2 transfer function (OUTPUT_WIRE_COLOR_SPACE_G22_P709), but experiments show that assuming an sRGB transfer function gives a better average delta-E during verification (this may vary by GPU vendor). This is possibly because Windows uses the piecewise sRGB transfer function for sRGB content, and that behavior is completely wrong according to the sRGB spec.
In IEC 61966-2-1:1999, chapter 4.1 "Reference image display system characteristics", the specification requires the reference display to be calibrated to a gamma 2.2 tone curve when directly displaying an sRGB signal. So a power 2.2 gamma curve, rather than the sRGB curve, should be applied when converting sRGB content to the linear scRGB blending space in order to accurately follow the spec's behavior. We can also see some display problems caused by this issue: https://github.com/dylanraga/win11hdr-srgb-to-gamma2.2-icm
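For reference, here is a minimal sketch of the two candidate decodes when converting SDR content into the linear scRGB blending space (illustrative Python only, not the actual DWM code; the function names and the 200-nit SDR white level are just my assumptions):

```python
import numpy as np

def srgb_piecewise_eotf(v):
    """Two-part sRGB decode (what ACM/HDR currently appears to apply to SDR content)."""
    v = np.asarray(v, dtype=np.float64)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def gamma22_eotf(v):
    """Pure power 2.2 decode (the reference display behavior from IEC 61966-2-1 chapter 4.1)."""
    return np.asarray(v, dtype=np.float64) ** 2.2

def sdr_to_scrgb(encoded_rgb, sdr_white_nits=200.0, use_gamma22=True):
    # scRGB is linear with 1.0 = 80 nits, so SDR white maps to sdr_white_nits / 80.
    eotf = gamma22_eotf if use_gamma22 else srgb_piecewise_eotf
    return eotf(encoded_rgb) * (sdr_white_nits / 80.0)
```

The whole disagreement in this thread is about which of the two `eotf` choices above is the correct one.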
I have also created an issue on the Windows Feedback Hub; please upvote it, as that might help get this issue fixed: https://aka.ms/AAw09zz
Some related discussions about this:
- https://gitlab.freedesktop.org/pq/color-and-hdr/-/issues/30
- https://gitlab.freedesktop.org/pq/color-and-hdr/-/issues/12
- https://gitlab.freedesktop.org/pq/color-and-hdr/-/blob/main/doc/wayland_qa.md?ref_type=heads#q-should-srgb-content-be-decoded-with-the-piecewise-srgb-transfer-function
By the way, KDE Wayland on Linux currently performs the conversion correctly for sRGB content: https://discuss.kde.org/t/whats-the-transfer-function-for-sdr-content-under-hdr-mode/33259/3?u=stat_headcrabed
Also check https://github.com/ledoge/dwm_eotf
When reading IEC 61966-2-1-1999, it is very clear that sRGB content should be decoded using the piecewise functions rather than simulating the mismatch of the 2.2 power function reference display. I've shared my full thoughts on that matter, based on research of all the related standards documentation and literature, in a blog post.
The difference between 2.2 power function decoding and piecewise function decoding is definitely visible and affects the dark values of an image. I really don't want to dismiss this entirely. But my guess is that a lot of the users that much prefer the look of SDR instead of HDR in Windows are comparing sRGB SDR presented on a TV that is using a 2.4 power function (BT.1886) to decode the signal, which would give a dramatically higher contrast and saturation than sRGB decoded with either 2.2 power function (sRGB reference display) or piecewise function (Windows HDR decoding, as sRGB standard instructs to do).
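To put rough numbers on that dark-value difference (my own quick calculation, not from any spec):

```python
def piecewise(v):  # two-part sRGB decode
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

for v in (0.02, 0.05, 0.10):
    print(f"encoded {v:.2f}: piecewise -> {piecewise(v):.5f}, pure 2.2 -> {v ** 2.2:.5f}")

# encoded 0.02: piecewise -> 0.00155, pure 2.2 -> 0.00018
# encoded 0.05: piecewise -> 0.00394, pure 2.2 -> 0.00137
# encoded 0.10: piecewise -> 0.01002, pure 2.2 -> 0.00631
```

Near black the two decodings differ by a factor of several, which is exactly where the visible difference shows up.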
(Also, thanks for sharing the sRGB standard doc. I had been looking for a PDF of this for a while!)
@allenwp Thanks for the reply. But I still think we have to simulate the 2.2 reference display. Most modern monitors are factory calibrated to gamma 2.2, and Windows also assumes monitors are calibrated to 2.2 when ACM is enabled. So changing the current HDR/ACM implementation to a 2.2 EOTF is the best way to get the most accurate display result.
A large number of monitors use the sRGB EOTF instead of gamma 2.2, and Windows defaults your display device to the sRGB EOTF instead of gamma 2.2 when ACM is enabled.
Is there a section that outlines the reference display input output characteristic in the specification? Or perhaps a specific reference to the formula?
it is very clear that sRGB content should be decoded using the piecewise functions rather than simulating the mismatch of the 2.2 power function reference display.
The vast majority of displays are indeed a pure 2.2 EOTF, as that's the common hardware baseline. To try and suggest it is "very clear" is ahistorical, and doubly against the outlined function and reference conditions.
The history of sRGB is one riddled with political backroom fighting. The two part encoding function was not even authored by the primary author of the specification, and instead was a byproduct of another company insisting on including their encoding characteristic, due to IP, before being willing to back the specification.
A large number of monitors use the sRGB EOTF instead of gamma 2.2
If by large you mean that every piece of Apple hardware by default uses a pure 2.2, or the vast majority of consumer displays, or otherwise, then yes. We could likely speculate that the displays that have a hardware-based interpretation of the non-specification input/output characteristic are so proportionately low in quantity in normative contexts as to be equivalent to a statistical error.
As someone who actually spent several years trying to discuss this matter with the primary author, the best thing we can do is follow the specification. Look at the specification. Read it carefully. Note what "encoding" and "input output" characteristics are.
In 2025, there's no benefit beyond confusion in using two part functions.
A large number of monitors use the sRGB EOTF instead of gamma 2.2, and Windows defaults your display device to the sRGB EOTF instead of gamma 2.2 when ACM is enabled.
@Malus-risus You can have a look at monitor reviews here and you will find that most monitors are factory calibrated to power 2.2: https://www.rtings.com/monitor
Windows defaults to a power 2.2 EOTF for monitors, according to the last section here: https://github.com/dantmnf/MHC2/blob/master/README.md
This is not the case for ACM/HDR. The current implementation always uses the sRGB piecewise function to process SDR content.
@Malus-risus You can have a look at monitor reviews here and you will find that most monitors are factory calibrated to power 2.2: https://www.rtings.com/monitor
I am unable to determine whether the rtings data was measured while the monitor was in sRGB mode.
@Malus-risus When the monitor has an sRGB mode and that mode is used, it is listed in the pre-calibration test info, for example: https://www.rtings.com/monitor/reviews/asus/proart-display-pa279crv
Also, Apple's Display P3 monitors do the same thing as sRGB: they report the EOTF curve in the TRC tag as sRGB while actually being calibrated to power 2.2.
@allenwp @Malus-risus I found this video: https://www.youtube.com/watch?v=NzhUzeNUBuM
According to the video, piecewise OETF encoding combined with a power 2.2 EOTF monitor is a very old scheme for implicit flare light compensation, which should now be done explicitly. Their newer software version also defaults to a power 2.2 EOTF/OETF.
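In other words, the old scheme gives an implicit contrast boost in the shadows: encoding with the piecewise OETF and then displaying on a pure 2.2 monitor darkens low scene values. A rough sketch (my own numbers, only to illustrate the video's point):

```python
def piecewise_oetf(x):
    # two-part sRGB encode
    return 12.92 * x if x <= 0.0031308 else 1.055 * x ** (1 / 2.4) - 0.055

def system_transfer(x):
    # encode with the piecewise OETF, then display on a pure 2.2 EOTF monitor
    return piecewise_oetf(x) ** 2.2

for x in (0.001, 0.01, 0.1):
    print(f"scene {x:.3f} -> displayed {system_transfer(x):.5f}")

# scene 0.001 -> displayed 0.00007
# scene 0.010 -> displayed 0.00629
# scene 0.100 -> displayed 0.09880
```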
This is not the case for ACM/HDR. The current implementation always uses the sRGB piecewise function to process SDR content.
@dantmnf Does this mean Windows always uses the piecewise OETF when encoding the colors that are sent to the monitor while ACM/HDR is enabled?
Windows uses the piecewise EOTF → matrix → piecewise OETF pipeline in SDR/ACM mode by default, but the OETF can be altered with an MHC2 LUT. MHC2Gen uses the inverse TRC from the input ICC profile as the OETF in the pipeline.
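Conceptually, the pipeline described here looks something like the sketch below (illustrative only; the real compositor obviously isn't Python, and the matrix and LUT stages here are placeholders):

```python
import numpy as np

def piecewise_eotf(v):
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def piecewise_oetf(x):
    return np.where(x <= 0.0031308, 12.92 * x, 1.055 * np.power(x, 1 / 2.4) - 0.055)

def acm_sdr_pipeline(encoded_rgb, gamut_matrix=np.eye(3), oetf=piecewise_oetf):
    """encoded_rgb: array of shape (..., 3) with normalized SDR values."""
    # 1. Decode with the piecewise sRGB EOTF.
    linear = piecewise_eotf(np.asarray(encoded_rgb, dtype=np.float64))
    # 2. Apply a 3x3 gamut-mapping matrix (identity placeholder here).
    mapped = linear @ gamut_matrix.T
    # 3. Re-encode; an MHC2 LUT effectively swaps this OETF for the inverse TRC of the ICC profile.
    return oetf(mapped)
```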
This has been asked by the ACES crew and answered by Jack Holm, the technical secretary for IEC TC 100/TA 2, which developed the IEC 61966-2-1 sRGB standard.
from https://community.acescentral.com/t/srgb-piece-wise-eotf-vs-pure-gamma/4024/2
The "preferred" EOTF is not 2.2, but piecewise sRGB. BT.2380 also states the EOTF noted here:
The reasoning behind sRGB was to define a standard. If you look in BT.709-3, there's no standard gamma to be found: https://www.itu.int/dms_pubrec/itu-r/rec/bt/R-REC-BT.709-3-199802-S!!PDF-E.pdf
If you look at BT.470-6 this was the table:
which had different illuminances and assumed gammas, which were always approximations of the analog displays. Today, BT.709-6 just says to use BT.1886, which is gamma 2.4.
Even in 2002, it was still decoded in sRGB under reference viewing conditions:
The idea was to be close enough to 2.2 to be acceptable. Yes, it's extremely ironic that a standard meant to make things clearer ended up being more confusing. You can also look at the original draft at https://library.imaging.org/admin/apis/public/api/ist/website/downloadArticle/cic/4/1/art00061 and see that, in reality, the idea was to be as close to 2.2 as possible. The majority of displays were basically 2.2, and sRGB was looking to push that as the actual standard, built around that.
But at the same time, the spec describes this as an "optimization". It will mismatch:
One impact of this encoding specification is the creation of a mismatch between theoretical reference display tristimulus values and those generated from the encoding implementation. The advantages of optimising encoding outweigh the disadvantages of this mismatch.
My understanding is the "theoretical reference display" would be pure 2.2, which was always just an approximation of CRTs, calling back to what Jack Holm said. But now we can target symmetry thanks to our digital display technologies, and we can see the differences in black floor due to OLED displays. It was never going to be perfect, but it was meant to be "consistent" or "compatible".
At the same time you have to consider how content was authored. The intent was to be viewed on 2.2 despite the inaccuracies. Do you want to have the disadvantages as stated by the spec, and accept them (ie: 2.2 on sRGB is mismatched)? Or do you want to "improve" on the original intent and have something without the mismatch (ie: sRGB decode for sRGB content)?
I don't think it's a "bug" as stated in the topic title. But it's an understanding that the original 2.2 was an approximation with trade-offs that are no longer needed. There's no right answer either. Digital displays were adjustable enough to be able to choose between 2.2 and sRGB, so IMO it really depends on what displays were used at the time of authoring. It's better as an option, because even content designed for CRTs could have been 2.35, not 2.2, either.
Jack Holm was not the primary author, nor was he part of the authorship consultation, nor mentioned anywhere in the acknowledgements of the documents that predate the specification.
End of story.
If you want to find facts, speak with the primary author, Michael Stokes.
Anyone who tries to make a case for the two part would be invalidating the vast majority of hardware overnight, as the vast majority are pure 2.2 power function EOTFs. Folks have to remember that some folks insist on trying to rewrite history. Let’s not do so.
If one wants to “improve” the stimuli round trip, drop the two part nonsense, and use the generic 2.2. No need to complicate things with a two part function in 2025.
@sobotka If you read the spec you'll see it mentions usage of sRGB on printers and scanners. I've never heard of 2.2 being used on either of them. I would love to hear otherwise. There's a space for accurate, symmetrical inverse decoding of sRGB. I can't imagine having misprinted color on printouts being an intended feature. There's a reason sRGB modes existed on LCD displays.
There’s a reason that we are talking about EOTFs.
If you speak with the primary author, you can get the history of why that two part ended up in the specification, and why the “evidence” provided by some folks should be taken with a huge lump of salt.
Is formula 1 unclear?
There's a reason sRGB modes existed on LCD displays.
There’s a reason that the vast majority of displays are pure 2.2 EOTFs natively. What makes more sense? Targeting an OETF incorrectly as an EOTF, or simply using the normative EOTF for stimuli encoding?
No need for two part function obfuscation.
I think that:
- All colorimetric conversions must assume that the EOTF/OETF is piecewise sRGB.
- The display's EOTF, which is equal to a power of 2.2, must be taken into account in the image itself. This de facto happens automatically during color correction using an sRGB reference monitor.
The situation is similar to Rec. 709/BT.1886. A display with a gamma of 2.4 presents an image as if it had been captured by a camera with the Rec. 709 OETF, but in fact it is almost always an image prepared for display on a display with a gamma of 2.4.
There’s a reason that the vast majority of displays are pure 2.2 EOTFs natively. What makes more sense? Targeting an OETF incorrectly as an EOTF, or simply using the normative EOTF for stimuli encoding?
I want to play devil’s advocate here, because while I agree with you for the most part, I don’t think the specification is being represented fairly as it is written.
What if the content were encoded as described in the specification when the mastering device has an ICC profile?
And then decoded on a device with an ICC profile, as described in the specification:
In this workflow, there is no 2.2 power function or reference display in the chain. sRGB is merely an encoding, and to properly reproduce XYZ values and apply ICC color management, it must be decoded using the sRGB inverse function.
While I obviously agree that most content produced today falls into the category of "sRGB-compliant displays", which typically use a pure 2.2 power function, the "mismatch" between the reference display and the sRGB encoding was, at least in theory, introduced to compensate for the reference viewing conditions described in the spec.
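To make that ICC-managed path concrete, the piecewise inverse is just the mathematical undo of the encoding, after which the usual sRGB-to-XYZ matrix gives the colorimetry to manage. A minimal sketch (the matrix is the familiar 4-digit one for the sRGB primaries, not pulled from any particular profile):

```python
import numpy as np

# Linear sRGB (D65) -> XYZ, per the sRGB primaries.
SRGB_TO_XYZ = np.array([
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
])

def srgb_encoded_to_xyz(encoded_rgb):
    """encoded_rgb: a 3-component sRGB-encoded vector in 0..1."""
    v = np.asarray(encoded_rgb, dtype=np.float64)
    # Undo the piecewise encoding exactly (symmetric with the OETF in the spec)...
    linear = np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)
    # ...then convert to XYZ for ICC-managed reproduction. No 2.2 power anywhere in this path.
    return SRGB_TO_XYZ @ linear
```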
Anyone who tries to make a case for the two part would be invalidating the vast majority of hardware overnight, as the vast majority are pure 2.2 power function EOTFs. Folks have to remember that some folks insist on trying to rewrite history. Let’s not do so.
If one wants to “improve” the stimuli round trip, drop the two part nonsense, and use the generic 2.2. No need to complicate things with a two part function in 2025.
I recognize that you've done extensive research on this topic over the years, but I don't see any "rewriting of history" here. History was written and ratified in IEC 61966-2-1:1999. Whether the two-part function was intended to be there by the original author is irrelevant, because the final version has it. I fully agree that in 2025 the two-part function causes more harm than good, but ignoring it is not adhering to the specification; it's effectively working around it to adjust for current hardware and viewing environments.
It seems that at the time, the "mismatch" was considered negligible: it was acknowledged but deemed "close enough" to be acceptable. Now, however, that’s clearly no longer the case. Using the two-part function as defined in the sRGB standard creates more problems than it solves, and using a pure 2.2 power function produces better results in practice, especially since most PC content is mastered on sRGB-compliant displays without ICC color management applied.
All colorimetric conversions must assume that the EOTF/OETF is piecewise sRGB.
That's the point: you don't know that. If content was mastered on a gamma 2.2 display and tagged as sRGB, using the piecewise function makes no sense, as it was never used in encoding, and we know by now that "glare" compensation is no longer relevant in 2025.
If, however, an sRGB image was produced by a colorimetric transformation from a different colorspace, then it is encoded as piecewise sRGB, and to avoid the "mismatch" it should be decoded as such.
I know I am oversimplifying things, but I think there is no perfect solution. However, I agree that the pragmatic thing to do is to assume gamma 2.2 for an sRGB image, which will most likely preserve the expected appearance and how it looks on any modern display.
History was written and ratified in IEC 61966-2-1:1999. Whether the two-part function was intended to be there by the original author is irrelevant, because the final version has it.
I have made precisely this point in other threads. We could go a step further and suggest that even if the two-part were intended as the EOTF, which the document is very clear is not the case, it shouldn’t matter either; the normative historical use should likely drive the “pragmatic” interpretation.
That's the point: you don't know that. If content was mastered on a gamma 2.2 display and tagged as sRGB, using the piecewise function makes no sense,
Agree!
as it was never used in encoding, and we know by now that "glare" compensation is no longer relevant in 2025.
It should be noted that the idea of the purpose being “glare” was largely an interpretation by a party. I do not believe it is specifically outlined in the reference as having this purpose?
If, however, an sRGB image was produced by a colorimetric transformation from a different colorspace, then it is encoded as piecewise sRGB, and to avoid the "mismatch" it should be decoded as such.
Agree, assuming that your intention is to “decode as presented”.
Here again we can see the meaning of the physicalist relative wattages should be considered relative to the intention of the author. However, if the author encodes relative wattages using some arbitrary f(x), the f(x) becomes irrelevant if the as presented evaluation of the stimuli uses g(x); the “correct” interpretation would be the cognized, as presented, relative wattage units g(x).
The as presenting operator drives the meaning of the relative wattages of the encoding.
