ImageOverlays: Provide method for realtime dynamic overlays
Question
I have a render pipeline that updates frame buffer objects. I'm able to display the resulting textures by injecting them in place of the image textures, playing around with XYZ tile overlays. What I'm stuck on: I would like to run the pipeline on each animation frame and thereby animate the map. Is there a simple way to make this possible?
Supplemental Data
Library Version
v0.4.18
Three.js Version
0.181.1
Perhaps I'm misunderstanding, but if you have a handle to the textures you're applying to the tile geometry then you can just update those textures every frame by rerendering.
Yes, I should be more specific. I use requestAnimationFrame to register my texture updates, which are generated by my custom shader pipeline rendering into an FBO. The "updating texture" is then used as input to the TiledTextureComposer's draw function, where I also register an update for the composed texture by wrapping a few lines of it in a function that is likewise driven by requestAnimationFrame.
I haven't been able to find another render call downstream of the one in TiledTextureComposer's draw; the next one seems to be the main render loop (which then renders the whole scene), and that one runs every frame anyway. What I'm stuck on is this: I can see that the textures are updating in the background, but the updates only become visible when a tile is created and displayed some time later. After that, the depiction on screen stays static, despite the main render loop constantly updating. It seems that I'm missing a render somewhere in between.
Can you provide some code for how you're doing this? And a more concrete example of what you're doing? Are you animating something like a weather simulation?
But either way, yes, using the "ImageOverlay" functionality makes this more complicated than the "XYZTilesPlugin", which uses the texture on the tile directly, since ImageOverlay textures are composed from multiple subtiles. There's not currently a way to update those composed textures, and even adding a simple method to enable this (something like a plugin.refreshLayer( i ) function) wouldn't be ideal since it would mean updating all the sub tiles and then updating all the composed ones, which at least doubles the draws needed per frame.
I think an improved method would be to adjust the "ImageOverlay" classes or "TiledImageSource" instances to provide pre-composed textures that span the appropriate range more directly. Then this instance can be implemented to provide a texture that can be used by the layered materials directly and can therefore be more easily modified. Then sub image sources like GeoJSON, WMTS, or yours that don't really rely on strictly tiled data can control the handle to the textures and update them.
Hopefully that makes sense - let me know your thoughts. I'd be happy to support this if there's a compelling use case and a demo can be provided.
Yes, I can provide some code that adds detail. I understand the issue with the multiple redraws, but at the moment I'm just looking to generate a proof-of-concept. I'm taking the route of the overlays, since the blending of multiple overlayed textures (of which one is animated, at the current stage of planning) is a functionality I would like to retain. The use-case is animated / time-dependent heat maps coming from various data sources including my own simulations. Let me give more detail first and then add my thoughts.
In TiledImageSource, I changed processBufferToTexture to return my render pipeline object of the class AnimatedData. The class AnimatedData has the method
animateToTexture( renderer, fps = 24.0, t = 0.0 ) {

	// re-register for the next frame
	requestAnimationFrame( t => this.animateToTexture( renderer, fps, t ) );

	// throttle the FBO updates to the target frame rate
	const interval = 1000.0 / fps;
	if ( this.lastUpdate === 0.0 || t >= this.lastUpdate + interval ) {

		this.toTexture( renderer );
		this.lastUpdate = t;

	}

}
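As an aside, the frame-throttling pattern used here (skip frames until 1000 / fps milliseconds have elapsed) can be isolated into a small reusable helper. A minimal sketch in plain JavaScript; `makeFpsGate` is a name invented for illustration, not part of any library:

```javascript
// Returns a predicate that reports whether enough time has elapsed
// since the last accepted frame to hit the target frame rate.
function makeFpsGate( fps = 24.0 ) {

	const interval = 1000.0 / fps;
	let lastUpdate = - Infinity; // so the very first frame always passes

	return function shouldRender( t ) {

		if ( t - lastUpdate >= interval ) {

			lastUpdate = t;
			return true;

		}

		return false;

	};

}

// Usage inside a requestAnimationFrame loop:
// const gate = makeFpsGate( 24 );
// function tick( t ) {
//     requestAnimationFrame( tick );
//     if ( gate( t ) ) this.toTexture( renderer );
// }
```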
Without any further modifications, the object of class AnimatedData gets fed to the data argument of the draw method in TiledTextureComposer, which I modified as follows:
draw( data, span ) {

	// draw the texture at the given sub range
	const { range, renderer, quad, renderTarget } = this;
	data.animateToTexture( renderer );
	const tex = data.texture;
	const material = quad.material;

	// map the range to draw the texture to
	material.minRange.x = MathUtils.mapLinear( span[ 0 ], range[ 0 ], range[ 2 ], - 1, 1 );
	material.minRange.y = MathUtils.mapLinear( span[ 1 ], range[ 1 ], range[ 3 ], - 1, 1 );
	material.maxRange.x = MathUtils.mapLinear( span[ 2 ], range[ 0 ], range[ 2 ], - 1, 1 );
	material.maxRange.y = MathUtils.mapLinear( span[ 3 ], range[ 1 ], range[ 3 ], - 1, 1 );

	// draw the texture
	this.animateTexture( tex, material, renderer );

}
animateTexture( tex, material, renderer, t = 0.0 ) {

	// re-register so the composed texture is redrawn every frame
	// (the original snippet called this.renderTexture here, which looks
	// like a typo for the self-scheduling recursion)
	requestAnimationFrame( t => this.animateTexture( tex, material, renderer, t ) );

	material.map = tex;

	// save renderer state
	const currentRenderTarget = renderer.getRenderTarget();
	const currentAutoClear = renderer.autoClear;

	// render the quad into the composed render target
	renderer.autoClear = false;
	renderer.setRenderTarget( this.renderTarget );
	renderer.render( this.quad, _camera );

	// restore renderer state
	renderer.setRenderTarget( currentRenderTarget );
	renderer.autoClear = currentAutoClear;
	material.map = null;

}
The rationale behind the two animate... methods: the first generates the data updates, the other just redraws the composed texture (it should be throttled to the target fps, I know, but a proof of concept, not performance, is my goal right now). I would also need to continuously redraw any renders downstream of this composition, but the only one I was able to find is the one in the main render loop, which looks something like this:
function animator( t = 0.0 ) {

	requestAnimationFrame( t => animator( t ) );
	render();

}

animator();

function render() {

	controls.update();
	camera.updateMatrixWorld();

	for ( const tiles of tile_sets ) {

		tiles.errorTarget = params.errorTarget;
		tiles.setCamera( camera );
		tiles.setResolutionFromRenderer( camera, renderer );
		tiles.update();

	}

	renderer.render( scene, camera );

}
The suggestion of "not relying" on strictly tiled data would be nice, however it's not unlikely that I will run into future use-cases where I will need to hijack the 3DTilesRenderer tiling framework to request a quad-tree refinement of my server-side data.
since the blending of multiple overlayed textures ... is a functionality I would like to retain
Just for my information and so I can understand how the project is being used: are you migrating from another system?
Without any further modifications, the object of class AnimatedData gets fed to the data argument of the draw method in TiledTextureComposer, which I modified as follows:
/* ... */
So do you have a full proof of concept working? Or is this something that's still not functioning after these changes?
If you'd like to "force" a redraw (via composing tiles) of the per-tile geometry textures for a specific overlay you can use the following. Though bear in mind this isn't intended to be a long term or reliable solution (in fact it looks like there may be a bug here internally that I'll have to check out later):
// the second argument indicates that the overlay will be removed and readded so the
// image source should not be completely disposed
this.deleteOverlay( overlay, false );
this.addOverlay( overlay, order );
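For a stop-gap animation, the delete/re-add pair could be driven from a requestAnimationFrame loop. A minimal sketch, assuming the plugin exposes deleteOverlay and addOverlay exactly as above; the `forceRecompose` helper name is invented here, and this approach redraws every composed tile, so it's proof-of-concept only:

```javascript
// Force a recomposition of one overlay's tile textures by removing and
// re-adding it. Passing false as the second argument keeps the image
// source alive so only the composed textures are rebuilt.
function forceRecompose( plugin, overlay, order ) {

	plugin.deleteOverlay( overlay, false );
	plugin.addOverlay( overlay, order );

}

// Driving it each animation frame (expensive - POC only):
// function tick() {
//     requestAnimationFrame( tick );
//     forceRecompose( imageOverlayPlugin, overlay, 0 );
// }
```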
The suggestion of "not relying" on strictly tiled data would be nice, however it's not unlikely that I will run into future use-cases
Can you clarify this line? It's "not unlikely"? Do you mean it is something you will likely need?
Just for my information and so I can understand how the project is being used: are you migrating from another system?
No, right now I'm deciding on a tech stack that I want to use for my next project. 3DTilesRenderer looked like a good choice, but I do need animation functionality. Climate data is just one of many use-cases that I'm looking into.
So do you have a full proof of concept working? Or is this something that's still not functioning after these changes?
There have been no changes since opening this issue. What I describe leads to the textures updating in the background, but updates to them only show once 3DTilesRenderer redraws "on its own" - so basically I see "random snapshots" from the animation's history as I move the globe around and tiles dispose / reload, but no running animation.
If you'd like to "force" a redraw (via composing tiles) of the per-tile geometry textures for a specific overlay you can use the following...
This actually works, on mouse drag I see the tile's animation running. It (expectedly) kills a lot of other functionality under the hood, but it shows that the textures are updating correctly in the background.
Can you clarify this line? It's "not unlikely"? Do you mean it is something you will likely need?
Yes, this is something I would likely need - for instance, it is a very sensible use-case to load additional (finer) data needed in the front-end animation when the tiles refine.
Where do we go from here?
This actually works, on mouse drag I see the tile's animation running. It (expectedly) kills a lot of other functionality under the hood, but it shows that the textures are updating correctly in the background.
If it's only working on mouse drag, I expect you're limiting some kind of render or tiles update to mouse changes. Without seeing the code I'm not sure why that would be happening, but otherwise this is what I would have expected.
It (expectedly) kills a lot of other functionality under the hood
Can you elaborate? Are you referring to performance?
Where do we go from here?
I would like to see this in the project and think it has the potential for some very cool map visualizations. I think there are some possible short, medium, and long term options for supporting this at varying degrees of capability and quality:
Short Term
Add an "official" way to force tiled images to be recomposed - this will likely be needed to fix the bug I mentioned above anyway. But the drawback, as you may have seen, is that the performance may not be great due to recomposing and redrawing so many textures.
Alternatively, the "XYZTilesPlugin" will generate a tile set for a smooth globe and assign the tiled images directly, meaning there would be no need to rely on the plugin to recompose the textures. You lose any elevation visualization, though.
Medium Term
As I suggested previously, we can adjust the "imageSource" API to accept a method for retrieving a tile that represents the "region" of the image that a geometry tile needs to display. All the existing tiled image sources (like TMS, XYZ) would be wrapped in something like a "ComposedRegionImageSource" that basically does all the composing that is happening inside "ImageOverlayPlugin" at the moment.
For overlays that don't require an underlying tiled structure we can implement the "RegionImageSource" API such that you just draw tiles as you need and can manage them as you please, animate them, etc. The GeoJSONOverlay would benefit from this change since at the moment it's generating a bunch of tiles and composing them even though it should be possible to just draw that final texture immediately. You can then provide your own custom one to handle the types of animation you'd like on the target textures.
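To make the idea concrete, here is a rough sketch of what such a region-based source might look like. Everything here is hypothetical: the class name "RegionImageSource", the getRegionTexture / update methods, and the plain-object stand-ins for render targets are all invented for illustration and are not the actual API:

```javascript
// Hypothetical region-based image source: instead of exposing fixed
// tiles, it hands out one texture per requested region and keeps a
// handle to each so it can be redrawn (animated) later.
class RegionImageSource {

	constructor( drawRegion ) {

		this.drawRegion = drawRegion; // ( texture, span ) => void
		this.textures = new Map();    // span key -> texture handle

	}

	// Return (and cache) the texture covering span [ minX, minY, maxX, maxY ].
	getRegionTexture( span ) {

		const key = span.join( ',' );
		if ( ! this.textures.has( key ) ) {

			const texture = { span, version: 0 }; // stand-in for a render target
			this.drawRegion( texture, span );
			this.textures.set( key, texture );

		}

		return this.textures.get( key );

	}

	// Redraw every live region - this is where per-frame animation hooks in.
	update() {

		for ( const texture of this.textures.values() ) {

			this.drawRegion( texture, texture.span );

		}

	}

}
```

The key property for the animation use case is that the source, not the compositor, owns the texture handles, so calling update() from a requestAnimationFrame loop is enough to animate them.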
Long Term
Long term it may be possible to allow you to compose the images however you want during the final tile rasterization to the screen. E.g. pass two or three textures to the tile geometry material and afford some kind of cool custom transition animation. This would free you from having to make draw calls to render to an intermediate render target and from the limitations of the render target resolution.
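As a sketch of what that final-rasterization blend might look like, here is a minimal crossfade shader pair of the kind that could back a tile material override. The uniform names, the integration point, and the idea of feeding it two overlay states are all assumptions for illustration, not an existing API:

```javascript
// Hypothetical crossfade between two overlay textures, intended for
// something like a THREE.ShaderMaterial or a material shader override.
const transitionShader = {

	uniforms: {
		mapA: { value: null },      // previous overlay state
		mapB: { value: null },      // next overlay state
		transition: { value: 0.0 }, // 0 = show mapA, 1 = show mapB
	},

	vertexShader: /* glsl */ `
		varying vec2 vUv;
		void main() {
			vUv = uv;
			gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
		}
	`,

	fragmentShader: /* glsl */ `
		uniform sampler2D mapA;
		uniform sampler2D mapB;
		uniform float transition;
		varying vec2 vUv;
		void main() {
			vec4 a = texture2D( mapA, vUv );
			vec4 b = texture2D( mapB, vUv );
			gl_FragColor = mix( a, b, transition );
		}
	`,

};
```

Animating `transition.value` from 0 to 1 each frame would then produce the transition entirely on the GPU, with no intermediate render target.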
WebGPU shader nodes will hopefully bring more flexibility for this kind of thing from an API perspective since the current strategies are otherwise pretty rigid for shader overrides, but it may be a bit before we can add support (see #1380, #1179).
Let me know what you think. I will plan to fix the bug mentioned above (and possibly address the "short term" task here), but if you'd like to try to make a PR for the "medium term" approach (adding an intermediate "RegionImageSource" for overlays) that would be the best path forward at the moment, I think. I know it's not necessarily straightforward to dive into what is a fairly involved portion of the code base, but I can provide guidance and help smooth things out if you'd like to give it a go.
Also cc @makio64 since I know you had been doing some separate work for weather visualization and dynamic globe transitions. Maybe you have some insight? It would be nice to make a cool demo for this feature if there are any available public data sets that align well for this kind of thing.
I made a pretty significant refactor in #1413 which changes the "ImageOverlay" API surface to expose functions for "getting", "locking", and "releasing" overlay texture regions. This makes the "ImageOverlay" class responsible for all the generated overlay textures that are read directly by the materials when compositing the overlays, which means you can update them as desired and therefore animate them. I can't promise this won't continue to change, but it should give you something to work with to start and give some feedback on. I adjusted the r3f/projection demo to include an animated geojson shape to prove it out:
https://github.com/user-attachments/assets/ff7342cf-0bf0-4ede-8148-b248d223d053
Some things that come to mind that could still be addressed here:
- For geojson, at least, every texture is being drawn immediately rather than being deferred to improve performance.
- "GeoJSONOverlay.redraw" redraws every texture immediately which can be slow, as well. It would be best to only draw a few of the textures and prioritize the ones that are currently visible.
Other than the two points above this should cover the "medium term" suggestion and remove the need for calling "deleteOverlay" and "addOverlay" to refresh the view.