WebGPURenderer: Add HDR Support
Description
Enjoy three.js in HDR 🚀 (requires WebGPU support and an HDR-capable monitor).
Check out the difference:
HDR: https://raw.githack.com/renaudrohlinger/three.js/utsubo/feat/hdr/examples/webgpu_tsl_vfx_linkedparticles.html
SDR: https://threejs.org/examples/webgpu_tsl_vfx_linkedparticles.html
This contribution is funded by Utsubo
📦 Bundle size
Full ESM build, minified and gzipped.
|  | Before (min / gzip) | After (min / gzip) | Diff (min / gzip) |
|---|---|---|---|
| WebGL | 338.39 kB / 78.91 kB | 338.39 kB / 78.91 kB | +0 B / +0 B |
| WebGPU | 566.01 kB / 156.49 kB | 566.08 kB / 156.52 kB | +76 B / +29 B |
| WebGPU Nodes | 564.61 kB / 156.25 kB | 564.69 kB / 156.27 kB | +76 B / +26 B |
🌳 Bundle size after tree-shaking
Minimal build including a renderer, camera, empty scene, and dependencies.
|  | Before (min / gzip) | After (min / gzip) | Diff (min / gzip) |
|---|---|---|---|
| WebGL | 469.82 kB / 113.62 kB | 469.92 kB / 113.65 kB | +105 B / +25 B |
| WebGPU | 637.26 kB / 172.46 kB | 637.44 kB / 172.51 kB | +182 B / +55 B |
| WebGPU Nodes | 591.9 kB / 161.69 kB | 592.09 kB / 161.75 kB | +182 B / +57 B |
Related: https://github.com/ccameron-chromium/webgpu-hdr/blob/main/EXPLAINER.md
So I guess when using hdr: true, the developer should not configure any tone mapping, right? Should we check that and provide a warning if the developer attempts to do that? Meaning:
```js
renderer = new THREE.WebGPURenderer( { antialias: true, hdr: true } );
renderer.toneMapping = THREE.ACESFilmicToneMapping; // -> produces warning
```
What about outputColorSpace? Is it correct to use SRGBColorSpace in the demo? It seems not, since the renderer would attempt to convert unbound HDR texels to sRGB which isn't right.
To clarify: webgpu_tsl_vfx_linkedparticles should have used tone mapping in the first place.
I agree with the proposed solution.
@CodyJasonBennett warned me about the toneMapping issue, and we could indeed probably just warn the developer that HDR and tone mapping cannot coexist yet.
As for outputColorSpace, I'd value @donmccurdy's input on this matter.
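For illustration, a minimal sketch of such a runtime check, assuming the boolean hdr option from this PR's first draft (the helper name and warning text are hypothetical, not actual three.js API):

```js
import { NoToneMapping } from 'three';

// Hypothetical helper: warns when HDR output and tone mapping are enabled together.
function validateHDRSettings( renderer, parameters ) {

	if ( parameters.hdr === true && renderer.toneMapping !== NoToneMapping ) {

		console.warn( 'WebGPURenderer: Tone mapping is not yet supported together with hdr: true; .toneMapping will be ignored.' );

	}

}
```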
> What about outputColorSpace? Is it correct to use SRGBColorSpace in the demo? It seems not, since the renderer would attempt to convert unbound HDR texels to sRGB which isn't right.
Your intuition is correct. When rendering out in HDR, you send the physical/lighting units (candelas, nits). The display does the rest, including conversion into the electric signal used by the display, which is what sRGBTransferOETF does in LDR. We can still do a view transform though and preserve the (de)saturation of a tonemapper, but I haven't seen an implementation that outputs HDR. As they are fitted, they would require a custom implementation and switch depending on output parameters (not enough precision to do both in one).
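To illustrate the distinction with plain functions (not three.js internals; the tone curve is a stand-in, and as noted further down the thread, real HDR output still needs image formation rather than raw scene luminance):

```js
// sRGB OETF: converts display-linear values in [0, 1] into the electrical signal.
function sRGBOETF( c ) {

	return c <= 0.0031308 ? c * 12.92 : 1.055 * Math.pow( c, 1 / 2.4 ) - 0.055;

}

// LDR output: scene-referred light is tone mapped into [0, 1], then OETF-encoded.
function ldrOutput( luminance ) {

	const toneMapped = luminance / ( 1 + luminance ); // stand-in for ACES/AgX
	return sRGBOETF( toneMapped );

}

// HDR "extended" output: values above 1.0 are written to the drawing buffer
// and the display maps them into its available headroom.
function hdrExtendedOutput( luminance ) {

	return luminance;

}
```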
> CodyJasonBennett warned me about the toneMapping issue and we could indeed probably just warn the developer about the fact that both HDR and tonemapping cannot coexist yet. As for outputColorSpace, I'd value donmccurdy's input on this matter.
I've chatted with Don about this since I'm eager to see a real comparison with tonemapping in LDR and HDR (simply disabling tonemapping doesn't compare), but it's a lot of work to implement still. I'm happy to upstream tonemappers here if we can figure out an API. Just a lot of unknowns on top of historical problems and inconsistencies from display manufacturers, which complicate this. I'd be more confident with an API once we have direction here.
Some caution — our output to the drawing buffer must still be a formed image, we cannot send the stimulus (i.e. scene lighting / luminance) directly to the drawing buffer; the browser/device/OS does not provide image formation any more in “HDR” than in “SDR”. A lot of recent HDR demos on social media have made this mistake by omitting tone mapping. We do still want to consider output units (likely nits), as they relate to the viewing device/display.
WebGPU HDR, as currently shipped in Chrome, tells us nothing about the display, so we are guessing heavily. The amount of HDR headroom available may vary: turn your laptop brightness up, headroom reduces, different color management may be required. This is a major gap in the current WebGPU implementation in Chrome, and something we may need to keep tabs on for changes. And as @CodyJasonBennett says well, “a lot of unknowns on top of historical problems and inconsistencies” exist outside of Chrome's control.
I have a general idea of how to adapt AgX or ACES Filmic for use with “HDR” output, and I'll look into that a bit. Desaturation is fortunately orthogonal: representation as “SDR” vs. “HDR” does not imply any large difference in saturation. If the comparison does diverge then something is likely wrong.[^1]
> What about outputColorSpace? Is it correct to use SRGBColorSpace in the demo? It seems not, since the renderer would attempt to convert unbound HDR texels to sRGB which isn't right.
>
> Your intuition is correct. When rendering out in HDR, you send the physical/lighting units (candelas, nits).
A quick test here would be to render a solid MeshBasicMaterial (example: #ff8c69) with HDR mode enabled. I believe we'll get the expected result when keeping .outputColorSpace = SRGBColorSpace. There is no rule that an sRGB value cannot exceed 1... and I think the WebGPU explainer is indeed saying that we must do so, but it is not as clear as I'd prefer. I wish they'd offered rec2100-hlg too, which is easier to reason about for our purposes, and hopefully that's coming.
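A minimal sketch of that quick test, assuming the outputType constructor parameter proposed below (setup details are illustrative):

```js
import * as THREE from 'three/webgpu';

// Assumes the proposed outputType parameter; a 16-bit drawing buffer is needed for HDR output.
const renderer = new THREE.WebGPURenderer( { outputType: THREE.HalfFloatType } );
renderer.setSize( 256, 256 );
renderer.outputColorSpace = THREE.SRGBColorSpace; // unchanged from the default
renderer.toneMapping = THREE.NoToneMapping;
document.body.appendChild( renderer.domElement );

const scene = new THREE.Scene();
const camera = new THREE.OrthographicCamera( - 1, 1, 1, - 1, 0.1, 10 );
camera.position.z = 1;

// Unlit, flat color: the result should match SDR output exactly.
const quad = new THREE.Mesh(
	new THREE.PlaneGeometry( 2, 2 ),
	new THREE.MeshBasicMaterial( { color: '#ff8c69' } )
);
scene.add( quad );

renderer.init().then( () => renderer.render( scene, camera ) );
```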
[^1]: Historically we've made tone mapping very easy for users, and color grading we've left as advanced/DIY ... and I think this has led to some misconceptions. Adjusting saturation above/below tone mapping defaults is reasonable — and beneficial more often than not!
Possible API:
```js
import { ExtendedSRGBColorSpace } from 'three/addons/math/ColorSpaces.js';

const renderer = new WebGPURenderer( { outputType: THREE.HalfFloatType } );
renderer.outputColorSpace = ExtendedSRGBColorSpace;
```
My main concern: Extended sRGB (the only option WebGPU currently allows) is not really an appropriate output color space for lit rendering; we need to render an “HDR” image in reference to a well-defined display/medium. I'll file an issue on the WebGPU spec repo about this (EDIT: https://github.com/gpuweb/gpuweb/issues/4919); perhaps there are plans to enable other output color spaces. I would prefer to have this:
```js
import { Rec2100HLGColorSpace } from 'three/addons/math/ColorSpaces.js';

const renderer = new WebGPURenderer( { outputType: THREE.HalfFloatType } );
renderer.outputColorSpace = Rec2100HLGColorSpace;
```
Adaptations to tone mapping are also needed, though they depend on information we do not have, and which may not be well-defined at all when using Extended sRGB.
I know others are excited about the “HDR” features though — would it be possible to start with a PR that exposes outputType: HalfFloatType and confirm that everything still works as expected, before we continue with next steps? I'm hoping to avoid major color changes for existing scenes like r152, which I'm afraid will be necessary to transition from WebGPU's current HDR API to a correct image formation pipeline.
Happy new year everyone!
@donmccurdy From a user perspective, as you said, it's such an exciting feature, but the ColorSpace change was a hard one (and it still involves extra code/debugging for me sometimes)...
Do you think it would be possible to offer a simple API like:
```js
const renderer = new WebGPURenderer( { hdr: true } );
```
and default to SDR if HDR isn't supported, like the automatic fallback to WebGL2?
I don't feel that a boolean HDR on/off flag would be the right approach. We should really provide the option to use a high-precision drawing buffer, whether or not the device supports HDR. And we may want to provide the option to render into an HDR canvas for export or baking purposes, on devices that support HDR canvas but don't currently have an HDR display connected. So I do prefer the API suggested in https://github.com/mrdoob/three.js/pull/29573#issuecomment-2412541524.
This has the added benefit of not locking us into Extended sRGB, which I feel would be a huge mistake.
Here's a small PR to provide an outputType parameter, which can be used independently of HDR rendering:
https://github.com/mrdoob/three.js/pull/30320
I got HDR working without WebGPU! https://x.com/samddenty/status/1915922922234916906?s=46&t=BHioRA7yXyP06sjXuJYPRA
For those wondering: It's the video overlay trick, combined with a luminance rendering pass: https://github.com/mrdoob/three.js/compare/dev...samdenty:three.js-hdr:dev#diff-3274f1a37032fb0ae4e2823def0007c634e869ae0dfc304ff6a12c36513c3a52R102-R106 Nice idea :)
How about we aim to ship this awesome feature in r179? It should require only minimal changes but would deliver a super cool and much-anticipated improvement! ^^ /cc @sunag @mrdoob @Mugen87
I'm not sure about this. @donmccurdy highlighted that the current WebGPU HDR spec is somewhat incomplete, see https://github.com/gpuweb/gpuweb/issues/4919.
It's also not clear to me how to configure tone mapping and output color space if the tone mapping mode is set to extended (meaning hdr: true). It seems disabling the image formation steps (tone mapping and color space conversion) isn't correct. However, the currently supported output color spaces can't produce an HDR output. So it looks like it's not just a simple boolean we have to provide, but something like https://github.com/mrdoob/three.js/pull/29573#issuecomment-2412541524.
@donmccurdy IMO, you know best about HDR, tone mapping and color spaces. https://github.com/mrdoob/three.js/pull/30320 was a first step but how do you propose we should move forward?
Yes, the WebGPU HDR spec is missing a foundational part today. I'm (personally) OK with shipping experimental support in three.js before that is resolved. But I think to do that correctly we need:
- To provide .outputColorSpace = THREE.ExtendedSRGBColorSpace instead of hdr: true
- To provide basic tone mapping support into the extended range, at least for ACES Filmic and AgX; other tone mapping is not required
For tone mapping — instead of mapping from [0, 1], we map to [0, X], where "X" is some arbitrary number that "looks good" on available displays. The ideal choice of "X" is currently unknowable, and varies as screen brightness changes (see https://github.com/mrdoob/three.js/pull/29656#discussion_r1800234907); this is part of the problem with the spec. This change can be hard-coded into the ACES and AgX tone mappers.
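For illustration only (not the actual ACES/AgX adaptation), the retargeting idea with a stand-in curve:

```js
// Hypothetical headroom value "X"; not knowable from the current WebGPU HDR spec.
const X = 3.0;

// Standard Reinhard maps [0, ∞) → [0, 1); scaling the output stretches the same
// curve into [0, X). A naive output scale like this lifts mid-grey too, which is
// one reason the real adaptation needs to be built into the curve itself.
function reinhardExtended( value ) {

	return X * ( value / ( 1 + value ) );

}
```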
I suggest this be experimental because (1) "Extended sRGB" is conceptually not great for this purpose, and (2) the arbitrary choice of "X" in tone mapping is fragile. If/when the spec allows alternatives for either, we'd want to fix those, but can't do so in a backwards-compatible way (rendered results will change).
@RenaudRohlinger would you like to adapt this PR to handle (1)? The changes could be:
- In addons/math/ColorSpaces.js, add ExtendedSRGBColorSpace (a sketch follows below):
  - Implementation would be identical to SRGBColorSpace, except that <space>.outputColorSpaceConfig should be something like { drawingBufferColorSpace: SRGBColorSpace, toneMappingMode: 'extended' }
- In src/math/ColorManagement.js#L37, add 'toneMappingMode' to the ColorManagement.spaces JSDoc comment
- In WebGPURenderer, configure the context 'toneMapping.mode' accordingly
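A hedged sketch of what that addon entry might look like, assuming the color-space definition shape used by ColorManagement.define() in recent releases (the string constant and exact fields are illustrative, not final API):

```js
import { Matrix3, SRGBColorSpace, SRGBTransfer, Rec709Primaries } from 'three';

export const ExtendedSRGBColorSpace = 'srgb-extended'; // hypothetical constant

// Same primaries, transfer, and matrices as sRGB; only the output configuration differs.
const LINEAR_REC709_TO_XYZ = new Matrix3().set(
	0.4123908, 0.3575843, 0.1804808,
	0.2126390, 0.7151687, 0.0721923,
	0.0193308, 0.1191948, 0.9505322
);

const XYZ_TO_LINEAR_REC709 = new Matrix3().set(
	3.2409699, - 1.5373832, - 0.4986108,
	- 0.9692436, 1.8759675, 0.0415551,
	0.0556301, - 0.2039770, 1.0569715
);

export const ExtendedSRGBColorSpaceImpl = {
	primaries: Rec709Primaries,
	transfer: SRGBTransfer,
	toXYZ: LINEAR_REC709_TO_XYZ,
	fromXYZ: XYZ_TO_LINEAR_REC709,
	luminanceCoefficients: [ 0.2126, 0.7152, 0.0722 ],
	outputColorSpaceConfig: {
		drawingBufferColorSpace: SRGBColorSpace,
		toneMappingMode: 'extended' // proposed field, consumed by WebGPURenderer
	}
};
```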
Example usage:
```js
import { ColorManagement, ACESFilmicToneMapping } from 'three';
import { ExtendedSRGBColorSpace, ExtendedSRGBColorSpaceImpl } from 'three/addons/math/ColorSpaces.js';

ColorManagement.define( { [ ExtendedSRGBColorSpace ]: ExtendedSRGBColorSpaceImpl } );

renderer.outputColorSpace = ExtendedSRGBColorSpace;
renderer.toneMapping = ACESFilmicToneMapping; // requires (2)
```
We can defer (2) for another PR, though it would be important to get them in the same release I think.
@donmccurdy I also had to implement ExtendedLinearSRGBColorSpace to support PostProcessing, because WebGPU requires a float16 format for HDR. The render method of PostProcessing was forcing LinearSRGBColorSpace, so the canvas would fall back to the default BGRA8Unorm, which breaks compatibility with WebGPU.
From what I understand, we can't simply resolve the final render to float16 while keeping framebuffers in BGRA8Unorm, since that conflicts with the initially configured WebGPU context, which now uses float16 with the extended colorSpace (and would throw an error). Also, reconfiguring the context at runtime is not recommended and potentially problematic.
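For reference, this is roughly how an HDR-capable canvas context is configured per the webgpu-hdr explainer (browser API, not three.js code; shown to make the format constraint concrete):

```js
const context = canvas.getContext( 'webgpu' );

context.configure( {
	device,
	format: 'rgba16float',             // float16 drawing buffer required for HDR
	colorSpace: 'srgb',                // extended sRGB: values may exceed 1.0
	toneMapping: { mode: 'extended' }, // pass HDR values through to the display
	alphaMode: 'premultiplied'
} );
```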
Hm, there's at least one pre-existing trouble spot, which isn't this PR's fault, but I'm not sure about the new "ExtendedLinearSRGBColorSpace" constant:
- We shouldn't be hard-coding LinearSRGBColorSpace in PostProcessing.js; the default should be ColorManagement.workingColorSpace (which is configurable)
- Having output color space configuration on our working color space is conceptually a bit backwards; LinearSRGBColorSpace should be a valid working color space even with an HDR output color space
> The render method of PostProcessing was forcing a LinearSRGBColorSpace and so the canvas would fall back to a default BGRA8Unorm...
The use of an 8-bit or 16-bit drawing buffer should depend only on renderer.outputType at this point — output to LinearSRGBColorSpace with a 16-bit drawing buffer should be fine, even without any of these HDR changes. Is that not working as expected? Or is the issue more that we're changing between two outputColorSpace settings with different 'toneMapping.mode' configurations?
I see, makes sense! I removed LinearSRGBColorSpace and made use of outputType instead. Should be good now.
Users will need to be aware that they must set renderer.outputType = THREE.HalfFloatType in order for HDR to work in that case.
I was trying to handle outputType automatically based on the toneMappingMode, but from your comment I understand that they're two separate things.
```js
// updated example usage:
import { ColorManagement } from 'three';
import { ExtendedSRGBColorSpace, ExtendedSRGBColorSpaceImpl } from 'three/addons/math/ColorSpaces.js';

ColorManagement.define( { [ ExtendedSRGBColorSpace ]: ExtendedSRGBColorSpaceImpl } );

renderer = new THREE.WebGPURenderer( { outputType: THREE.HalfFloatType } );
renderer.outputColorSpace = ExtendedSRGBColorSpace;
```
In another PR, I suggest we add a very basic example to demonstrate the HDR difference, such as a basic-material plane or a cube on a transparent canvas with a white or black background. Something like: https://ccameron-chromium.github.io/webgpu-hdr/example.html
Thanks @RenaudRohlinger! I'll do some testing this week but the implementation looks good to me now. I'm hoping that the WebGPU spec will also add support for the Rec. 2100 HLG color space as an alternative to Extended sRGB, which could address my concerns in https://github.com/mrdoob/three.js/pull/29573#issuecomment-3084504309. In that case we'd potentially have a choice of float16 or rgb10a2unorm as the output type when using HDR-capable displays.
And of course using float16 without an HDR output color space is also an option, and would improve image quality in some cases.
Let's merge this shortly after the r179 release, so it's available in r180? Just want to make sure we have time to include tone mapping.
@donmccurdy While doing your tests this week, if you have a chance, could you look into an issue I'm running into regarding Display P3 support in WebGPUBackend?
When configuring the WebGPU context via this.context.configure(...), I expected colorSpace: this.renderer.outputColorSpace to be sufficient, but P3 output doesn't seem to work correctly for textures, unlike in WebGLBackend where everything works fine.
As far as I can tell, WebGPU doesn't have (or need) the equivalent of
```js
_gl.pixelStorei( _gl.UNPACK_COLORSPACE_CONVERSION_WEBGL, unpackConversion );
```
which WebGL uses to handle color space conversion for textures. So I'm wondering:
- Is full P3 texture support something we need to explicitly handle in ColorSpaceNode or another TSL-side abstraction?
- Could this limitation be related to how color transforms or EOTF/OETF are currently applied in the TSL pipeline?
I tried adding the following in WGSLNodeBuilder:
```js
needsToWorkingColorSpace( texture ) {

	// Skip if the texture has no color space.
	if ( texture.colorSpace === NoColorSpace ) return false;

	// Skip if the texture is already in the working color space.
	if ( texture.colorSpace === ColorManagement.workingColorSpace ) return false;

	// Otherwise, convert.
	return true;

}
```
which is used to call convertColorSpace for textures but this doesn’t seem to be enough. Maybe I’m missing a step in how P3 textures should be handled on upload or in shader output in WebGPU? My understanding of color management is still pretty limited, so any insights on how this should be wired up in WebGPU vs WebGL would be much appreciated.
Thanks! 🙏
@RenaudRohlinger I'll try to take a closer look, but on first reading this might be related to WebGPU limitations in:
- https://github.com/gpuweb/gpuweb/issues/1715
Ideally we'd configure GPUTextureDescriptor resources to unpack into THREE.ColorManagement.workingColorSpace, but I believe WebGPU only supports that for video textures currently, with more on the roadmap. Failing that, we could do conversions on CPU (somewhat slow) or in TSL (somewhat inaccurate for mipmaps and interpolation).
The situation is inverted in WebGL: WebGL handles unpacking for image textures but not video textures, which we have to deal with in the shader. I think needsToWorkingColorSpace is currently handling only the WebGL scenario, with conversion disabled for non-video textures. Since the code is already there, I suppose we might as well handle WebGPU textures similarly (if only temporarily).
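For completeness, a hedged sketch of the CPU-side fallback mentioned above: rewrite Display P3 pixel data into plain sRGB before upload so the texture can be tagged SRGBColorSpace (assumes the ImageData already holds Display P3 encoded pixels; slow, and out-of-gamut colors clip):

```js
import { Color, DisplayP3ColorSpace, SRGBColorSpace } from 'three';

function convertP3ImageDataToSRGB( imageData ) {

	const data = imageData.data; // Uint8ClampedArray of display-p3 encoded pixels
	const color = new Color();
	const rgb = { r: 0, g: 0, b: 0 };

	for ( let i = 0; i < data.length; i += 4 ) {

		// setRGB() decodes from Display P3 into the working color space,
		// getRGB() re-encodes into sRGB.
		color.setRGB( data[ i ] / 255, data[ i + 1 ] / 255, data[ i + 2 ] / 255, DisplayP3ColorSpace );
		color.getRGB( rgb, SRGBColorSpace );

		data[ i ] = Math.round( rgb.r * 255 );
		data[ i + 1 ] = Math.round( rgb.g * 255 );
		data[ i + 2 ] = Math.round( rgb.b * 255 );

	}

	return imageData;

}
```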
Thanks for the review! I noticed that if the renderer's outputType is not set to HalfFloatType, WebGPU currently crashes silently.
I suggest adding a warning to make this issue more explicit. I'll address it in a separate PR.
While trying to get my older HDR example to work, I wondered: How, in the new TSL/WebGPU world, do I do the equivalent of "patching the tonemapping ShaderChunk"?
The following approach should be still valid: https://github.com/mrdoob/three.js/issues/28957#issuecomment-2304206620
Thanks @Mugen87. For me that's not really patching though, it's replacing a bigger part of the pipeline and hoping to match whatever the previous behaviour was. I don't think that's ideal – something that previously was simple (like patching AgX to not clamp the output just for testing) is now really complex... are there alternative approaches? I could imagine that a callback, that somehow gets the nodes and a context of where in parsing we are, might be useful for many usecases.
The previous approach was hardly ideal since patching raw shader code was error prone and fragile.
To me, it's a clearer workflow to create a custom node for a custom tone mapping and then apply it to the renderer pipeline. If the post processing approach is too complex for you, maybe we can investigate making ToneMappingNode more flexible and allowing custom tone mapping functions at that level.
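A hedged sketch of that workflow with TSL and PostProcessing (import paths follow current conventions, the stand-in curve is illustrative, and an existing renderer, scene, and camera are assumed; an unclamped AgX would replace the curve):

```js
import * as THREE from 'three/webgpu';
import { Fn, pass, renderOutput, vec3 } from 'three/tsl';

// Custom tone curve expressed as a TSL function (Reinhard as a stand-in).
const myToneMapping = Fn( ( [ color ] ) => color.div( color.add( vec3( 1.0 ) ) ) );

const postProcessing = new THREE.PostProcessing( renderer );

// Disable the built-in output transform and handle tone mapping + encoding here.
postProcessing.outputColorTransform = false;

const scenePass = pass( scene, camera );

postProcessing.outputNode = renderOutput(
	myToneMapping( scenePass ),
	THREE.NoToneMapping,
	THREE.SRGBColorSpace
);

// In the render loop, call postProcessing.render() instead of renderer.render().
```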
It's not just about tonemapping – e.g. imagine wanting to patch something into the lighting equations, camera matrix handling, a new environment mapping method, or other details. I do agree that patching is error prone and fragile – but it works at all without having to modify three.js itself. It would be great to be able to say "I know what I'm doing, override this node implementation with that".