Unable to produce NASA 3d-tiles-renderer compatible datasets
Input file 1 : multi-textured-with-crs
Input file 2 : rock
After conversion with obj2tiles, the input file "rock" produces a working 3D Tiles dataset, compatible with the NASA 3d-tiles-renderer.
But in other cases, such as with the "multi-textured-with-crs" input file, the renderer randomly refuses to display some tiles.
I think it's linked to the geometric error computation, but after many attempts to modify this parameter, the problem still occurs.
Do you have any idea, please?
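For context on the geometric error suspicion above, here is a sketch (not code from obj2tiles or the renderer) of how 3D Tiles viewers typically turn a tile's geometricError into a screen-space error (SSE) to decide refinement; the function name and sample values are illustrative:

```javascript
// Sketch: screen-space error for a tile at `distance` meters from the
// camera, on a viewport `screenHeightPx` pixels tall with vertical FOV
// `fovYRad`. The tile is refined when this exceeds the error target.
function screenSpaceError(geometricError, distance, screenHeightPx, fovYRad) {
  return (geometricError * screenHeightPx) /
         (2 * distance * Math.tan(fovYRad / 2));
}

// Example: geometricError 16, 100 m away, 1080 px viewport, 60° FOV.
console.log(screenSpaceError(16, 100, 1080, Math.PI / 3)); // ≈ 149.6

// A tile whose geometricError is already 0 always has SSE 0, so it is
// never considered for refinement regardless of camera distance.
console.log(screenSpaceError(0, 100, 1080, Math.PI / 3)); // 0
```

This is why a tileset whose geometricError drops to 0 very early can behave oddly: the refinement decision stops depending on the camera at all.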
Could it be that you are hitting the tile cache limit (lruCache) of the viewer? Can you open some kind of debug view where used memory is displayed? I had a similar problem once and this was the culprit: tiles were randomly rendered in a first-come-first-served (FCFS) manner. As soon as the cache limit was filled, rendering stopped until some tiles were removed from the cache to make space for new ones.
Hi @dknezevic,
I am working with @loicroybon to identify this problem. I can't pinpoint the blocking element; I tested several things:
- The lruCache limit is far from being reached (about 1% of the default limit on average).
- The GeometricError value, which drops to 0 quite quickly given the number of LODs generated (2).
- Forcing ADD refinement instead of REPLACE, which visibly improves tile loading along the camera path (even though it doesn't make sense for real use, we agree).
- Comparing the number of loaded tiles against the theoretical maximum number of tiles.
Two video captures were taken:
- The first one uses 3DTilesRendererJS without any modifications. I selected the SCREEN_ERROR colorMode to highlight its operation. During navigation, tiles that are in the FOV do not display or re-display as expected.
https://github.com/user-attachments/assets/d1362755-c7fa-4b49-964f-f970ee99c5b1
- The second one displays as much information as possible and tries to force traversal of child nodes / force ADD refinement / etc.
https://github.com/user-attachments/assets/96df7fac-9bc3-4e56-b563-37b7d8baebba
Could you check the used memory and see if you reach the limits? If your limits are reached, new tiles will not load until some of them are unloaded. You can add memory details similar to this when using 3DTilesRendererJS:
- Create a placeholder for the stats:
let stats;
const showStats = true; // Set to true to show the stats overlay
if (showStats) {
  stats = document.createElement('div');
  stats.style.position = 'absolute';
  stats.style.top = '50px';
  stats.style.left = '10px';
  stats.style.color = 'white';
  stats.style.padding = '4px';
  stats.style.background = 'rgba(0,0,0,0.5)';
  document.body.appendChild(stats);
}
- In your animate() function, add this to populate the stats:
if (showStats) {
  // Count loaded vs active tiles
  const loaded = tilesRenderer.lruCache.itemList.length;
  const active = tilesRenderer.group.children.length;
  const bytes = Math.round(tilesRenderer.lruCache.cachedBytes / (1024 * 1024)); // Convert to MB
  // Update the overlay
  stats.innerHTML = `
    <strong>3D-Tiles Stats</strong><br>
    Loaded Tiles: ${loaded}<br>
    Cache Size: ${bytes} MB<br>
    Active Tiles: ${active}
  `;
}
- Adjust the lruCache size (three ≥ 0.166.0); note that minBytesSize must stay below maxBytesSize:
tilesRenderer.lruCache.maxBytesSize = 2.0 * 2 ** 30; // ~2 GiB (default 0.4 GiB)
tilesRenderer.lruCache.minBytesSize = 1.0 * 2 ** 30; // ~1 GiB (default 0.3 GiB)
Try playing with these memory limits a little, e.g. set them to a very high value to see if everything will load.
OK, increasing the value of lruCache.maxBytesSize allows the different scenes to be displayed correctly, thanks! It's possible that I altered too many variables at the same time, which is why I didn't get this result before. I am currently gathering additional information that may be useful for the discussion and help us understand where our issue came from. I will provide an update tomorrow with the information I have.