Post process framework roadmap
Initial implementation: #5615 (5-year-old notes)
- [ ] Finish full screen effects in https://github.com/AnalyticalGraphicsInc/cesium/pull/5615
- [ ] Per-entity effects. Any entity should be able to have post processing effects applied to it.
- [ ] Per-feature effects for 3D Tiles
- [ ] Write object ids to a g-buffer, or use a separate render pass for entities with effects applied to them.
- [ ] Use MRT, handle translucent issues.
- [ ] Use scissor test around entities with effects applied.
- [ ] Use textureLOD to determine if an id was written to part of the texture?
- [ ] Use the object id g-buffer for picking
- [x] Consolidate depth buffers into a single depth texture rather than use the closest frustum's depth texture.
- [x] May not be needed if we move to a log depth buffer, where the first frustum will be a lot larger. https://github.com/AnalyticalGraphicsInc/cesium/pull/5851
- [ ] HDR rendering
- [ ] Effects
- [x] Black and white
- [ ] Brightness, contrast, hue, saturation
- [x] 8-bit
- [x] Night vision
- [x] Texture overlay
- [x] Depth view
- [ ] Gamma correction
- [x] Lens flare
- [x] Ambient Occlusion
- [ ] Silhouette (replace stencil-silhouette in Model)
- [x] Edge detection
- [x] Depth of field
- [x] Bloom (simple)
- [ ] Bloom (advanced - requires HDR)
- [ ] Fog
- [x] Gaussian blur
- [ ] Motion blur
- [ ] Glow
- [ ] SSRTGI?
- [ ] Use the post-processing framework wherever applicable throughout Cesium:
- [x] FXAA
- [ ] OIT
- [x] Sun bloom
- [ ] Picking buffer
- [ ] Invert classification
- [ ] Point cloud surface generation
- [ ] Atmosphere coloring
- [ ] Wide range of sandcastle examples highlighting use cases of post processing
- [ ] Tech blog post
- [ ] Reuse ping-pong textures: if we're not already doing this, then when rendering to down-sampled textures it would probably be better to reuse a larger texture and render to a subset of it, so that we allocate fewer textures overall. Somewhat related, check out the transient resource system in Frostbite. Note that I expect our textures to have a lifetime of multiple frames and eventually be deallocated, like the shader cache.
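The ping-pong reuse above can be sketched with a small helper. The `PingPong` class and `runChain` function here are hypothetical, not Cesium API; in WebGL the two entries would be framebuffer-attached textures, and down-sampled passes could instead target a subrect of one shared larger texture via the viewport.

```javascript
// Minimal ping-pong pair: each pass reads from one attachment and writes
// to the other, then the roles swap. Reusing the same two textures across
// all full-resolution passes avoids per-pass allocations.
class PingPong {
  constructor(textureA, textureB) {
    this._textures = [textureA, textureB];
    this._readIndex = 0;
  }
  get readTexture() {
    return this._textures[this._readIndex];
  }
  get writeTexture() {
    return this._textures[1 - this._readIndex];
  }
  swap() {
    this._readIndex = 1 - this._readIndex;
  }
}

// Run a chain of passes. In WebGL each step would bind writeTexture's
// framebuffer and sample readTexture; here each pass is just a function
// threading a value through, to show the swap discipline.
function runChain(pingPong, passes, initialColor) {
  let color = initialColor;
  for (const pass of passes) {
    color = pass(color);
    pingPong.swap();
  }
  return color;
}
```

The key invariant is that no pass ever reads and writes the same texture, which is what makes the two allocations sufficient for an arbitrarily long chain.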
Later
- [ ] Post processing effects on particle systems (like heat haze, combining particles)
- [ ] Highlight objects that are occluded
- [ ] Cesium rendering blog post
- [ ] Longer-term: a transpiler/optimizer to combine multiple post-processing passes into one pass. Basically, we provide a set of filters/effects/passes to a pipeline object, and the pipeline is then free to optimize as it sees fit. For more inspiration, see Designing a next-generation post-effects pipeline.
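As a toy model of that pipeline optimization, assuming each pass is a pure per-pixel color transform, fusing passes is just function composition; the `fusePasses`, `brightness`, and `contrast` names below are illustrative, not Cesium API.

```javascript
// Each "pass" is modeled as a pure per-pixel transform on one channel
// in [0, 1]. Fusing N such passes into a single function is the trivial
// case of the pipeline optimization: one fused pass means one draw call
// and one framebuffer write instead of N.
function fusePasses(passes) {
  return function fused(channel) {
    let c = channel;
    for (const pass of passes) {
      c = pass(c);
    }
    return c;
  };
}

const clamp01 = (c) => Math.min(1, Math.max(0, c));

// Example per-pixel transforms.
const brightness = (b) => (c) => clamp01(c + b);
const contrast = (k) => (c) => clamp01((c - 0.5) * k + 0.5);

const fused = fusePasses([brightness(0.1), contrast(2.0)]);
```

A real optimizer would do this at the shader level (concatenating fragment shader bodies and eliminating intermediate render targets), but the contract is the same: the fused pass must produce the same pixels as running the passes in sequence.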
Links
- BabylonJS particle system post-processing
- BabylonJS post processing
- ThreeJS post processing
- Godot 3's Renderer Design Explained
- deus-ex graphics study
- Open Source Unity Post-Processing Effects
- An investigation of fast real-time GPU-based image blur algorithms
- Call for a new Post-Processing Pipeline
- Stylized Rendering in Spore
- Post Process Framework Sample
- The Rendering Technology of SkySaga: Infinite Isles
- Deus Ex: Human Revolution - Graphics Study
- adventures in postprocessing with unity
- Next Generation Post Processing in Call of Duty: Advanced Warfare
- Nice article with a bit on their post processing: Godot 3's Renderer Design Explained
- 50 post-processing passes (not used all at once) in FrameGraph: Extensible Rendering Architecture in Frostbite.
- Use a single large triangle instead of two triangles (up to 10% faster on GCN)
- Motion blur with velocity buffer in DOOM (2016)—Graphics Study
Possible idea from the forum: https://groups.google.com/forum/#!topic/cesium-dev/DJiirg0L5Cs
Render translucent buildings, but have the buildings still occlude each other.

It would require rendering the tileset to its own framebuffer with depth test/mask on, and then blending that framebuffer with the scene's framebuffer. It may not be hard to add this to the post processing branch as is.
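The blending step described above is ordinary source-over alpha blending, the same math as gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA). A sketch on plain color objects (illustrative helper, not Cesium API):

```javascript
// "Over" blend of the tileset framebuffer's color onto the scene
// framebuffer's color, per channel, matching
// gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA).
// Colors are { r, g, b, a } with components in [0, 1].
function blendOver(src, dst) {
  const blend = (s, d) => s * src.a + d * (1 - src.a);
  return {
    r: blend(src.r, dst.r),
    g: blend(src.g, dst.g),
    b: blend(src.b, dst.b),
    a: src.a + dst.a * (1 - src.a),
  };
}
```

Because the tileset is rendered to its own framebuffer with depth test/mask on, buildings occlude each other at full opacity first, and only the final composite is translucent against the scene.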
Another request from the forum, which should be very straightforward with post processing: change a tileset's contrast or saturation, especially useful for photogrammetry.
https://groups.google.com/forum/#!topic/cesium-dev/zQ0GEQ2SXDU
This could also replace the current approach for contrast/saturation/etc for imagery.
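For reference, the usual saturation adjustment in a post-process shader mixes the color toward its luminance. A sketch of the math (illustrative helper, not Cesium's actual imagery code):

```javascript
// Saturation adjustment as typically done per-fragment: mix the color
// toward its luminance. saturation = 0 gives grayscale, 1 leaves the
// color unchanged, > 1 over-saturates.
function adjustSaturation(rgb, saturation) {
  // Rec. 709 luma weights.
  const luma = 0.2126 * rgb.r + 0.7152 * rgb.g + 0.0722 * rgb.b;
  const mix = (c) => luma + (c - luma) * saturation;
  return { r: mix(rgb.r), g: mix(rgb.g), b: mix(rgb.b) };
}
```

Contrast works the same way but mixes toward mid-gray (0.5) instead of luminance, so both fit naturally into one small fragment shader stage.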
@bagnell @byumjin two questions on the current HBAO. If we haven't done these yet, they could be good roadmap items.
- [ ] Have we looked at Line-Sweep Ambient Obscurance to reduce the number of depth buffer reads by splitting the occlusion calculation into two steps?
- [ ] Instead of using the HBAO occlusion factor, have we looked at ground truth ambient occlusion (GTAO)? Is it fast and simple enough, and does it produce notably better results, for us to consider using it instead?
- Perhaps interesting reading: Post-processing Effects on Mobile: Optimization and Alternatives
@bagnell can you please update the tasklist at the top of this issue?
Now that HDR rendering is (at least partially) implemented, is it possible to use floating point textures in the post processing stage?
@laurensdijkstra Yes. You can pass pixelDatatype: PixelDatatype.FLOAT as an option to PostProcessStage, but you must also check that Context.floatingPointTexture and Context.colorBufferFloat are supported.
I have not been able to get the initial HDR rendered image of the Scene with pixelDatatype: PixelDatatype.FLOAT and HDR enabled. Is it possible to obtain the HDR rendered Scene texture in a PostProcessingStage?
Clarification as to how I have tried to determine whether the colorTexture uniform in the PostProcessingStage fragmentShader is HDR or not: I checked each RGB value to see if it exceeded 1.0, and output vec4(1e6, 1e6, 1e6, 1.0) in the shader if so, black if not. The result was a black image.
When outputting vec4(1.0, 1.0, 1.0, 1.0) the image was white. This should not be the case with gamma 2.2, should it? I'm new to HDR stuff, so please correct me if I'm wrong.
It seems Scene._view.sceneFramebuffer.getFramebuffer() is passed to PostProcessingStage.execute as the colorTexture, which I have checked to be HALF_FLOAT when HDR rendering is enabled. I do not understand why the values I checked with the method above do not exceed 1.0. They seem to be between 0.0 and 1.0. Are the HDR values collapsed before being sent to the post processing stages?
Looks like this is somewhat related: https://github.com/AnalyticalGraphicsInc/cesium/issues/5932
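On the gamma question above, note that gamma encoding fixes the endpoints: pow(1.0, 1/2.2) is still 1.0, so a shader that writes 1.0 in every channel produces white whether or not gamma correction runs afterward; only mid-range values change. A quick numeric check (hypothetical helper):

```javascript
// Gamma encoding maps [0, 1] to [0, 1] and leaves 0.0 and 1.0 fixed;
// it only brightens mid-range values. So outputting vec4(1.0) always
// yields pure white, with or without a gamma 2.2 stage afterward.
function gammaEncode(linear, gamma = 2.2) {
  return Math.pow(linear, 1 / gamma);
}
```

This doesn't answer whether the HDR values are being tonemapped into [0, 1] before the post-process stages run, which is the other half of the question above.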
@pjcozzi
https://github.com/godotengine/godot-proposals/issues/8576