linux-pipewire: Fix render technique in captures again
Description
Fixes #12412 Supersedes #12422
Changes the linux-pipewire render code to use the correct rendering techniques
Motivation and Context
Adding an effect to the PipeWire capture source leads to a decrease in gamma. While I am unsure why exactly this happens, this PR copies the function for finding the right render technique from the Windows dc-capture and correctly handles the only non-linear format supported so far.
How Has This Been Tested?
Tested each combination of:
- 10-bit vs 8-bit in my compositor settings
- 16-bit P416 vs 8-bit NV12 in OBS settings
- Rec. 709 vs Rec. 2100 (PQ)
- With the "Render Delay" effect vs without (as in the original issue)
Additional testing on a machine with hardware cursor support is required. Same for traditional 8-bit displays (I do not trust Hyprland, better to be safe!)
Types of changes
- Bug fix (non-breaking change which fixes an issue)
Checklist:
- [x] My code has been run through clang-format.
- [x] I have read the contributing document.
- [x] My code is not on the master branch.
- [x] The code has been tested.
- [x] All commit messages are properly formatted and commits squashed where appropriate.
- [x] I have included updates to all appropriate documentation.
> While unsure about why exactly this happens
It probably happens because the current code will potentially de-gamma twice:
- Once because the `DrawSrgbDecompress` shader is used, which will convert from sRGB gamma into linear gamma
- A second time because `gs_effect_set_texture_srgb` enables OpenGL to automatically convert every RGB value sampled from a texture into linear gamma
The functions `gs_framebuffer_srgb_enabled`, `gs_effect_set_texture_srgb`, and `gs_effect_set_texture` need to be used correctly in relation to the fragment shader selected, as they have explicit effects on the colours:
- `gs_framebuffer_srgb_enabled` - will enable/disable automatic conversion of linear RGB values returned by the fragment shader into sRGB gamma encoded values
- `gs_effect_set_texture_srgb` - will enable automatic conversion of sRGB gamma encoded values returned by texture samplers into linear RGB values
- `gs_effect_set_texture` - will not do any conversions; RGB values returned from the sampler are interpreted as "already using linear gamma"
By default OBS Studio assumes that the video output texture uses either sRGB (or sRGB-alike) gamma encoded values or PQ/HLG-encoded values for HDR video. So the scenario in which the framebuffer should not be "sRGB-enabled" is when the fragment shader will write RGB values that are already encoded with sRGB gamma (so the values are written "straight" into the render target).
Likewise, if a sampled texture is known to have RGB values with linear gamma, you'd use the plain `gs_effect_set_texture` method, as the fragment shader can work with the RGB values as-is.
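To make the pairings concrete, here is a minimal sketch in C against the libobs graphics API, contrasting the double de-gamma described above with a correct combination; the function and variable names are illustrative, not this plugin's actual code:

```c
#include <obs-module.h>

/* Sketch only - illustrative names, not the plugin's actual code. */
static void draw_sketch(gs_effect_t *effect, gs_texture_t *tex)
{
	gs_eparam_t *image = gs_effect_get_param_by_name(effect, "image");

	/* WRONG (the double de-gamma): the sampler decodes sRGB to linear
	 * *and* DrawSrgbDecompress decodes a second time in the shader:
	 *
	 *     gs_effect_set_texture_srgb(image, tex);
	 *     while (gs_effect_loop(effect, "DrawSrgbDecompress")) ...
	 */

	/* Correct linear pairing: decode sRGB once when sampling, let the
	 * GPU re-encode sRGB automatically on write, plain Draw in between. */
	const bool previous = gs_framebuffer_srgb_enabled();
	gs_enable_framebuffer_srgb(true);
	gs_effect_set_texture_srgb(image, tex);
	while (gs_effect_loop(effect, "Draw"))
		gs_draw_sprite(tex, 0, 0, 0);
	gs_enable_framebuffer_srgb(previous);
}
```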
Hyprland is not an implementation I would use as the primary testing ground (since they stopped using wlroots).
It's Mutter (GNOME) or KWin (Plasma) that should serve as primary testing ground.
@PatTheMav Thanks for your explanation! So essentially what I'm doing is converting to and from twice, where I could be skipping any conversion entirely. I'll dig through my code again and try to implement what you said.
Let me try to document this here. The assumption I was making is that when the compositor provides a 10-bit texture, it would not be gamma-encoded sRGB. That was incorrect. Instead, I believe I had flipped my understanding of which method converts gamma to linear and which does the reverse.
As far as I understand it now, and it seems to actually work on my system, the source image is always in gamma-encoded sRGB. When OBS is set to an 8-bit based format, it too is in a gamma-encoded colorspace. All one has to do is use `Draw`, with framebuffer sRGB disabled and the normal image set method. When OBS is switched to a 10-bit or 16-bit format, it uses a linear color space for the framebuffer, so `DrawSrgbDecompress` must be used instead. As for changing the colorspace to HDR PQ or HLG, no tonemapping is implemented anyway, so that doesn't need to be handled differently at all (which was another wrong assumption I had).
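A minimal sketch of the selection the paragraph above describes, as I read it; the mapping below is illustrative, not necessarily the PR's final code:

```c
#include <obs-module.h>

/* Sketch of the technique selection described above; illustrative only. */
static const char *select_technique_sketch(enum gs_color_space space)
{
	switch (space) {
	case GS_CS_SRGB:
		/* 8-bit canvas: the render target is gamma-encoded, so the
		 * gamma-encoded source can be drawn straight through. */
		return "Draw";
	case GS_CS_SRGB_16F:
	case GS_CS_709_EXTENDED:
	case GS_CS_709_SCRGB:
	default:
		/* Float canvas: the render target is linear, so the shader
		 * must decode the source's sRGB gamma itself. (A real
		 * implementation may also need a brightness multiplier for
		 * scRGB; see the paper-white discussion further below.) */
		return "DrawSrgbDecompress";
	}
}
```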
I would like to understand when the OBS colorspace is changed to scRGB, because changing to any of the HDR colorspaces does not actually do that. I understand from the comment that `GS_CS_709_EXTENDED` is something used only on macOS, where this PipeWire capture doesn't run anyway, but `GS_CS_709_SCRGB` says it is Linux HDR, which would imply it is enabled when switching to an HDR color space. Am I wrong?
Also, I would like to ask about this code sample. Is there any reason to store and revert the framebuffer srgb state? I first saw this in the Windows code and it already confused me.
```c
const bool previous = gs_framebuffer_srgb_enabled();
gs_enable_framebuffer_srgb(false);
```
I'll let Pat answer your questions, but as for this part:
> but `GS_CS_709_SCRGB` says it is Linux HDR
When this line of code was added, Linux (as in Wayland) did not have much in the way of HDR. The Wayland color management protocol was still in development; I think some assumptions were made then that may not match today's situation.
With this third commit (you'll have to squash into one commit and force-push later), it seems that the render delay issue is no longer there.
Edit: The second commit potentially combined `DrawSrgbDecompress` with `gs_effect_set_texture_srgb`, which was not good even if it "worked".
> Let me try to document this here. The assumption I was making is that when the compositor provides a 10-bit texture, it would not be gamma-encoded sRGB. That was incorrect. Instead, I believe I had flipped my understanding of what method does gamma-to-linear and reverse.
You can't make that assumption based on the texture format alone. A2RGB10 (or BGR10A2) would be suited for colour spaces like D65-P3 or any other similar space that has a wider gamut than sRGB. Usually those colour spaces still use the sRGB gamma curve because they are effectively SDR formats.
However you might also get HDR10 data with either PQ or HLG transfer functions, as both require only 10-bit colour and using a 32-bit format might be more efficient than a 64-bit format.
So you can't assume that "10-bit" means the colours are encoded using sRGB gamma, or HLG, or PQ, either way. Ideally the compositor metadata would tell you explicitly.
> As far as I understand it now, and it seems to actually work on my system, the source image is always in gamma-encoded sRGB. When OBS is set to an 8-bit based format, it too is in a gamma-encoded colorspace.
Kinda correct - OBS Studio expects the colours written to the render target (the main output texture) to always use sRGB gamma encoding for sRGB, Rec.709, and Rec.601 output modes.
> All one has to do is use "Draw", with framebuffer srgb disabled and the normal image set method.
That's not always correct, and admittedly the way this is supposed to work is not very intuitive but was designed this way to avoid breaking functionality of existing plugins and sources:
- The basic idea is that OBS can "tell" a source to behave in an sRGB-aware way via the state returned by `gs_get_linear_srgb`.
- If that is the case, it usually means that the source should enable automatic sRGB gamma encoding on the framebuffer
- This leads to the requirement that the fragment shader has to return RGB colours with linear gamma (it should not encode sRGB gamma itself)
- In turn this means that the data provided to the fragment shader needs to use linear gamma
- If the source of colour data is a texture, the correct command needs to be used:
  - `gs_effect_set_texture_srgb` if the texture's colour data should potentially be automatically decoded from sRGB gamma (only valid for textures using `GL_SRGB8` or `GL_SRGB8_ALPHA8`).
  - `gs_effect_set_texture` if no automatic decoding from sRGB gamma should take place for textures using `GL_SRGB8` or `GL_SRGB8_ALPHA8`
- So a source needs to ensure that the following flow is upheld:
  `source texture -> linear RGB colour data -> fragment shader -> linear RGB colour data -> render target`
> [!IMPORTANT]
> If a texture uses a colour format that does not explicitly use sRGB gamma, but it is known (by convention) that the colour data in the texture uses it, then an appropriate fragment shader has to be used that decodes the sampled colour values "manually"
- Any source that is not sRGB-aware would effectively ignore this new state variable:
  - The source would use `gs_effect_set_texture` for any texture
  - The source would instead maybe choose a different fragment shader like `DrawSrgbDecompress` to explicitly decode from sRGB gamma in the shader itself, but it has to be sure about the gamma curve used to generate the colour values in the texture
So there can be scenarios where `gs_get_linear_srgb` will return false and thus even an sRGB-aware source will mostly behave like a "legacy" source: the fragment shader will usually read and write all colour values without automatic gamma encoding or decoding, which can make sense for cases like a source actually rendering into an intermediary render target when OBS does not expect any blending to happen.
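Putting the list above together, an sRGB-aware source would follow roughly this pattern - a sketch under the assumptions above with illustrative names, not code from this PR:

```c
#include <obs-module.h>

/* Sketch of the sRGB-aware pattern; illustrative names, not this PR's code. */
static void render_srgb_aware_sketch(gs_effect_t *effect, gs_texture_t *tex)
{
	gs_eparam_t *image = gs_effect_get_param_by_name(effect, "image");
	const bool linear = gs_get_linear_srgb(); /* OBS "tells" the source */
	const bool previous = gs_framebuffer_srgb_enabled();

	gs_enable_framebuffer_srgb(linear);
	if (linear)
		gs_effect_set_texture_srgb(image, tex); /* decode sRGB on sampling */
	else
		gs_effect_set_texture(image, tex); /* pass values through as-is */

	while (gs_effect_loop(effect, "Draw"))
		gs_draw_sprite(tex, 0, 0, 0);

	gs_enable_framebuffer_srgb(previous); /* restore the prior state */
}
```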
> When OBS is switched into a 10-bit or 16-bit format, it uses a linear color space for the framebuffer and so "DrawSrgbDecompress" must be used instead. As for changing the colorspace to HDR PQ or HLG, no tonemapping is implemented anyways, so that doesn't even need to be handled differently at all (which was another wrong assumption I had).
~~As I mentioned above, `DrawSrgbDecompress` should probably not be used at all; instead the correct texture call needs to be used to allow the graphics API to automatically decode sRGB gamma from a texture if necessary. `DrawNonlinearAlpha` is a different story, because it handles cases of premultiplied alpha IIRC.~~
I was wrong about this - texture formats like `GL_RGB10_A2` will not trigger OpenGL's automatic gamma decoding, even though the 10-bit colour values might actually have sRGB gamma applied to them, and thus the fragment shader has to decode them explicitly.
> I would like to understand when the obs colorspace is changed to scRGB because changing to any of the HDR colorspaces does not actually do that. I understand from the comment that "GS_CS_709_EXTENDED" is something used only on MacOS where this pipewire capture doesn't run anyways, but "GS_CS_709_SCRGB" says it is Linux HDR, which would imply it is enabled when switching to an HDR color space, am I wrong?
`GS_CS_709_SCRGB` is meant for use with `DXGI_COLOR_SPACE_RGB_FULL_G10_NONE_P709` on Windows, as the 1.0 colour value is supposed to represent "80 nits" brightness. The idea is to use Rec 709 colour values to be fully compatible with SDR colour, but any HDR colour goes "beyond" that value (scRGB also permits negative values, for colours outside the Rec 709 gamut). OBS uses 300 nits as "paper white" by default, so 1.0 at 300 nits has to be converted into the appropriate colour value at 80 nits for this format.
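To put a number on that conversion (the factor below is implied by the figures above, not stated in the thread): with 300-nit paper white, SDR reference white lands at 300 / 80 = 3.75 in scRGB, so linear SDR colour values are scaled by 3.75 before being written to a `GS_CS_709_SCRGB` target.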
`GS_CS_709_EXTENDED` is similar, in that it also uses Rec 709 colour values, but HDR colours just go beyond 1.0 - the colour values (and gamma curve) are simply extended beyond the SDR endpoint. macOS uses this for EDR, where 1.0 is mapped to the current brightness setting of the display and brighter colours are dynamically mapped to use the "brightness headroom" of the display.
Those values are typically used by the main preview and other projectors and updated when Qt calls the "move" or "display change" callbacks.
HDR content in OBS uses `GL_RGBA16F`, which also allows colour values to just go "beyond 1.0"; these are then converted back into colours ranging from 0.0 to 1.0 using the Rec 2100 colour space with either HLG or PQ transfer functions.
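For reference, the PQ transfer function mentioned here is standardised as SMPTE ST 2084; a minimal sketch of its encode step using the standard constants (generic reference code, not OBS's actual shader):

```c
#include <math.h>

/* Generic SMPTE ST 2084 (PQ) encode: absolute luminance in nits to a
 * non-linear signal value in [0, 1]. Standard constants; not OBS code. */
static float pq_encode_sketch(float nits)
{
	const float m1 = 2610.0f / 16384.0f;
	const float m2 = 2523.0f / 4096.0f * 128.0f;
	const float c1 = 3424.0f / 4096.0f;
	const float c2 = 2413.0f / 4096.0f * 32.0f;
	const float c3 = 2392.0f / 4096.0f * 32.0f;

	const float y = powf(nits / 10000.0f, m1); /* normalise to PQ's 10000-nit peak */
	return powf((c1 + c2 * y) / (1.0f + c3 * y), m2);
}
```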
As this fix is not yet finalized, I am inclined to merge the revert (#12422) to unbreak SDR (likely the more common use-case) until this can be fixed.
I hope that won't cause too many merge conflicts, but I think that is a good idea for now as I'm going to be busy these next few days.
Successful rebase... I think