
Question: Should tile geometricError be scaled by transform scale before SSE calculation?

Open shawn0326 opened this issue 7 months ago • 19 comments

I have some questions regarding the use of geometricError in 3D Tiles, specifically about how it should be handled in relation to a tile's transform.

Background: I am developing a WebGL-based 3D Tiles renderer. While reading the specification, I was not able to find a definitive answer to the following:

  1. Before calculating the Screen Space Error (SSE), should the geometricError of a tile be scaled by the scale component extracted from the tile's transform matrix?
  2. When creating 3D Tiles resources, is the geometricError already supposed to include any scaling from the transform? Or is the transform considered a separate scaling that must be applied later? (I realize this can be ambiguous, since a user might apply additional scaling to the root 3D Tiles node at runtime.)

The documentation about geometricError does not seem to clarify this point. Could you please explain the intended workflow, or point to any relevant part of the specification? Thank you!
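For context on where the question arises, here is a sketch of a common perspective SSE formula (the formula and parameter names are illustrative, not taken from the spec); the disputed step is whether a scale factor extracted from the tile's transform is applied to the geometric error first:

```python
import math

def screen_space_error(geometric_error, distance, screen_height_px, fov_y_rad,
                       transform_scale=1.0):
    # Common perspective projection of a world-space error onto the screen.
    # Whether `transform_scale` (derived from the tile's transform matrix)
    # should be applied to geometric_error is exactly the open question here.
    scaled_error = geometric_error * transform_scale  # the disputed step
    return (scaled_error * screen_height_px) / (
        2.0 * distance * math.tan(fov_y_rad / 2.0))
```

With `transform_scale=1.0` this is the "geometric error is always in meters" reading; with the factor taken from the matrix it is the scaled reading.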

shawn0326 avatar Jun 12 '25 02:06 shawn0326

This was raised in https://github.com/NASA-AMMOS/3DTilesRendererJS/issues/1176 and I see now that both the Cesium source and a single line in the spec state:

The transform property scales the geometricError by the largest scaling factor from the matrix.

However in both versions 0.0 and 1.0 the spec specifically states that transform should not scale the geometric error:

The transform property does not apply to geometricError—i.e., the scale defined by transform does not scale the geometric error—the geometric error is always defined in meters.

It should be noted that this was a breaking change and isn't noted in the CHANGES doc. Trying to track down where this came from I see https://github.com/CesiumGS/3d-tiles/issues/367 which references https://github.com/CesiumGS/cesium/issues/5330 and https://github.com/CesiumGS/cesium/pull/7411. From what I can tell the two original issues (5330, 7411) are issues about handling tile set root scaling correctly, not about internal tile transforms, which are independent. The 3DTilesRendererJS project has supported tile set root scaling from the beginning with no need to account for internally defined tile transforms.

As I see it this line has now made the spec inconsistent and more difficult to implement in a number of ways. First is that "geometricError" is now apparently not necessarily specified in meters and instead subject to an arbitrary scaling potentially resulting in inconsistent scales across the tile set. It's no longer clear what frame or unit "geometricError" is really specified in. Second is that the "largest scaling factor" is not necessarily clear when something like a skew transform is specified, which doesn't seem to be disallowed in the spec.

I know there may be some hoops but I would like to suggest reverting that change to the spec so that geometric error is not scaled by the internal tile transform and handling of the tile set root scaling can be left up to an individual implementation with a recommendation that the "largest" scale factor is used when axis-aligned scaling is performed. Cesium can adjust the implementation to account for root scaling which should address the above two linked issues this originated from.
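For reference, one plausible reading of "largest scaling factor" is the maximum basis-vector length of the upper-left 3x3 of the (column-major) transform — a sketch, with a helper name of my own choosing. Note that this reading is only well-defined for rotation/scale matrices; for a skew, the column lengths no longer correspond to axis scale factors, which is the ambiguity mentioned above:

```python
import math

def largest_scale_factor(transform):
    """transform: a 4x4 matrix given as 16 numbers in column-major order
    (the layout the 3D Tiles `transform` property uses)."""
    columns = (transform[0:3], transform[4:7], transform[8:11])
    return max(math.sqrt(x * x + y * y + z * z) for x, y, z in columns)

identity = [1, 0, 0, 0,  0, 1, 0, 0,  0, 0, 1, 0,  0, 0, 0, 1]
scale_321 = [3, 0, 0, 0,  0, 2, 0, 0,  0, 0, 1, 0,  0, 0, 0, 1]
```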

gkjohnson avatar Jun 12 '25 03:06 gkjohnson

There are some subtle technical details and open questions, and everything that I say here is preliminary and has to be confirmed (cc @lilleyse for awareness).

Rather, these are notes taken while trying to connect the dots (based on the info from the previous post):

  • The issue https://github.com/CesiumGS/cesium/issues/5330 is from 2017, before 3D Tiles 1.0 was released
  • The corresponding fix in https://github.com/CesiumGS/cesium/pull/8182 is from September 2019, 'shortly' after 1.0 was released, but before 1.1 was released.
    • This means that CesiumJS has "always" implemented the approach where the geometric error is affected by the scaling
  • The issue https://github.com/CesiumGS/3d-tiles/issues/367 is actually from the day when 1.0 was published
    • The wording suggests that the intention was to apply the scaling to the geometric error even in 1.0. So that should have become a revision of 1.0, where the spec should have said that the scaling should be applied. But apparently, there never was a revision of 1.0. This update only made its way into 1.1, via https://github.com/CesiumGS/3d-tiles/pull/571

My current interpretation is:

  • The scaling should be applied to the geometricError
  • It should have been applied in version 1.0 as well, but that never made its way into the spec
  • It became part of the spec in 1.1.
    • Unfortunately, it was not noted as a breaking change. (Maybe because ~"it had always been supposed to be like that"...? 😕)

@gkjohnson

From what I can tell the two original issues (5330, 7411) are issues about handling tile set root scaling correctly, not about internal tile transforms, which are independent.

In how far are the root and the internal transforms "independent"?

A guess is that this might refer to the fact that the current wording in the (3D Tiles 1.1) spec is ... ... hm .... it does not properly capture what is actually happening. It says

The transform property scales the geometricError by the largest scaling factor from the matrix.

Imagine these structures (in pseudocode):

Case A

Only the root has scaling

root :
  transform: scaleBy(6,6,6)
  children: [ {
      geometricError: 10
  } ]

Case B

Only the child has scaling

root :
  children: [ {
      transform: scaleBy(3,3,3)
      geometricError: 10
  } ]

Case C

The root and the child have scaling

root :
  transform: scaleBy(2,2,2)
  children: [ {
      transform: scaleBy(3,3,3)
      geometricError: 10
  } ]

And the question (for implementors/clients) is: What is the runtime(!) geometric error of the child?

The specification only refers to the transform of the tile (!). This means that

  • In case A, the runtime(!) geometric error of the child would be 10 (wrong, should be 60)
  • In case B, the runtime(!) geometric error of the child would be 30
  • In case C, the runtime(!) geometric error of the child would be 30 (wrong, should be 60)

The point is: The specification only refers to the transform. This does not capture the transforms from the parent.

The spec also says

The transformation from each tile’s local coordinate system to the tileset’s global coordinate system is computed by a top-down traversal of the tileset and by post-multiplying a child’s transform with its parent’s transform ...

This transform does not have a name, but let's simply call it the 'global tile transform' for now. The sentence about the geometric error, which currently reads

The transform property scales the geometricError by the largest scaling factor from the matrix.

should therefore be:

The geometricError is scaled by the largest scaling factor from the global tile transform

(This is what is currently implemented in CesiumJS, and the only thing that makes sense, with additional details discussed below)
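To make the distinction concrete, here is a minimal sketch of the 'global tile transform' interpretation, reproducing cases A–C above. The function name is mine, and uniform scalar scales stand in for full transform matrices purely for illustration:

```python
def runtime_geometric_errors(tile, parent_scale=1.0):
    """Top-down traversal: the 'global tile transform' scale is the product
    of the tile's own scale and the scales of all its ancestors."""
    global_scale = parent_scale * tile.get("scale", 1.0)
    errors = []
    if "geometricError" in tile:
        errors.append(tile["geometricError"] * global_scale)
    for child in tile.get("children", []):
        errors.extend(runtime_geometric_errors(child, global_scale))
    return errors

# Case A: only the root scales      -> child error 10 * 6 = 60
case_a = {"scale": 6.0, "children": [{"geometricError": 10.0}]}
# Case B: only the child scales     -> child error 10 * 3 = 30
case_b = {"children": [{"scale": 3.0, "geometricError": 10.0}]}
# Case C: root and child both scale -> child error 10 * (2 * 3) = 60
case_c = {"scale": 2.0, "children": [{"scale": 3.0, "geometricError": 10.0}]}
```

Reading only the tile's own transform instead would yield 10, 30, and 30 for the three cases, which is the inconsistency described above.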


First is that "geometricError" is now apparently not necessarily specified in meters and instead subject to an arbitrary scaling potentially resulting in inconsistent scales across the tile set.

Does this "inconsistency" refer to the previous example, and the fact that the definition only refers to the transform?

Or conversely: Do you think that this inconsistency would be resolved if the spec referred to the global tile transform instead?


It's no longer clear what frame or unit "geometricError" is really specified in.

This is generally true, and has always bugged me. The claim that the geometric error is given in 'meters' does not make sense for things like point clouds, or a plain unit square that is stored once with a 128x128 pixel texture and once, as a higher LOD/child, with a 256x256 pixel texture. I'd also like to give it a more specific meaning that is defined more clearly and strictly. But the given examples already show that it could be difficult to cover the vast variety of data structures that can appear in 3D Tiles and that may affect the "visual fidelity".


Second is that the "largest scaling factor" is not necessarily clear when something like a skew transform is specified, which doesn't seem to be disallowed in the spec.

This was also mentioned in the corresponding PR. One could argue that "skew transforms are rare in 3D Tiles", but ... that's certainly not sufficient for a specification. The fact that the (single-value) geometricError tries to capture something that refers to a 3D object already makes this difficult.

One definition that could make (or rather "have made") sense could be: "The geometricError is scaled by the determinant of the global tile transform (i.e. by the volume of the unit cube under that transformation)". But this is just a gut feeling. The definition cannot easily be changed, because it would change the scaling factor wildly compared to the current definition. For a uniform scale of (2,2,2), the factor is currently 2, but would become 8 by this definition. This would cause unpredictable behavior due to possible disagreements between producers and consumers of tilesets about the "correct meaning" of the geometric error...

(EDIT: Maybe "the cube root of the determinant" could do it? Just brainstorming...) (EDIT2: We have not yet talked about a scaling factor of (-2,-2,-2), which could cause a negative geometric error right now...)
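Just to put numbers to that brainstorming (a quick check, not a proposal): for diagonal scale matrices, the "largest factor" and the cube root of the determinant agree for uniform scales, diverge for non-uniform ones, and the determinant indeed goes negative for a scale of (-2,-2,-2):

```python
import math

def det_diag(sx, sy, sz):
    # Determinant of the diagonal scale matrix diag(sx, sy, sz).
    return sx * sy * sz

for s in [(2, 2, 2), (1, 1, 8), (-2, -2, -2)]:
    d = det_diag(*s)
    largest = max(abs(c) for c in s)           # "largest scaling factor"
    cbrt = math.copysign(abs(d) ** (1.0 / 3.0), d)  # signed cube root
    print(s, "largest:", largest, "det:", d, "cbrt(det):", round(cbrt, 6))
```

For (1, 1, 8) the largest factor is 8 while the cube root of the determinant is 2, so the two definitions would lead to very different refinement behavior for non-uniform scales.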


An aside: There are many more questions related to the geometric error (I'll skip them here - it's already a wall of text). I once considered that it could make more sense to replace it by a simplification value, where 0.0 means "The highest detail that is available", and 1.0 means "The most simplified version that should ever be visible". But that could only go into some "3D Tiles 2.0", and would still need to be fleshed out, to make sure that the producing and consuming side have a common understanding of that.

javagl avatar Jun 12 '25 11:06 javagl

In how far are the root and the internal transforms "independent"?

To be clear, when I say "root" I'm referring to the root transform in a target client, not the root as embedded in the 3d tiles json format. There's a meaningful distinction between the data and transforms captured in a specification and how a client application renders that and exposes control over it to a user. Once loaded, a 3d tiles tile set is just a 3d model with built-in LoDs which can have additional transforms applied to it on top of those embedded in the file (just like glTF, etc). These additional transforms are what I'm referring to when I mention "root transforms"; they allow the user to rotate, scale, translate, etc. the loaded tile set geometry as a whole. Eg "user loads tile set A, then scales it by a factor of 10 in the application". This factor of 10, of course, needs to be accounted for when calculating the tiles to load, but that does not mean the embedded geometric error values cannot be defined independently of the tile transforms.

This transform does not have a name, but let's simply call it 'global tile transform' for now. Then the sentence about the geometric error should therefore be

I agree that the spec is also under-specified here, as well.

Does this "inconsistency" refer to the previous example, and the fact that the definition only refers to the transform?

Or conversely: Do you think that this inconsistency would be resolved if the spec referred to the global tile transform instead?

It's referring to cases B & C. In both of these cases two tiles (the root and the child) will have different scales, meaning the geometricError values in the file can no longer be considered to be of the same scale. Fundamentally it's not clear to me how geometric error can be defined as "being in meters" but then also multiplied by an arbitrary scale value, the final result of which should be in meters (or replace meters with whatever notional metric an application is using). It was consistent and clear before this scaling clause was added to the spec.

This is generally true, and always bugged me. The claim that the geometric error is given in 'meters' does not make sense for things like point clouds

I can understand that there could be a better, more inclusive way to describe geometric error, but looking at this through the lens of photogrammetry, which it seems like 3d tiles was originally designed to consider, meters is a sensible and intuitive way of thinking about this value. Before, I could think of geometric error as "the distance from the furthest point in this post-transformed tile to the original surface". Now I don't know how to think about it.

(EDIT2: We have not yet talked about a scaling factor of (-2,-2,-2), which could cause a negative geometric error right now...)

Yes this also occurred to me.

There are many more questions related to the geometric error (I'll skip them here - it's already a wall of text)

Happy to have a separate conversation about this! I have some thoughts, as well, but I'll save it.


It seems like there are a lot of follow on implications that haven't been fully considered, which I understand the happens with these things. While I would prefer that this line be reverted, if it absolutely can't be done then I think some changes need to be made to the rest of the spec to make it consistent and understandable again. Specifically the "geometric error" value can no longer be referred to as being in meters (or any consistent scale or unit), it should be noted that the global tile transform is applied to the error in the "geometric error" section, and some new method for intuiting about what "geometric error" really means should be formulated.

As a single data point, I'd never considered that geometric error should have the transform applied - likely in part because accounting for 3d scales in a single value isn't obvious, and it's referred to very consistently as being in a single scale throughout the spec.

gkjohnson avatar Jun 12 '25 12:06 gkjohnson

The distinction between the transforms and what you referred to as "root" is clear now. (In CesiumJS parlance, this "root transform" is usually referred to as the modelMatrix/"model matrix", to differentiate it from the root.transform in the JSON)

Eg "user loads tile set A, then scales it by a factor of 10 in the application". This factor of 10, of course, needs to be accounted for when calculating the tiles to load

I'm not sure whether this is evident, given the vagueness of the term and the resulting degrees of freedom for the implementors. But intuitively, when the geometric error is translated into a "screen space error", then the scaling factor of 10 makes the object larger on the screen, so somewhere, this factor has to come in. (The point is: When this is the scaling from the runtime application, i.e. from the modelMatrix, then the application could choose to simply scale the screen space error with this factor - but let's consider this as an implementation detail for now).
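The "implementation detail" mentioned here works because a perspective SSE is linear in the error: scaling the input geometric error by a factor k, or scaling the resulting screen space error by k, gives the same value (formula and names are illustrative):

```python
import math

def sse(error, distance, screen_h_px, fov_y_rad):
    # Illustrative perspective SSE: linear in `error`.
    return error * screen_h_px / (2.0 * distance * math.tan(fov_y_rad / 2.0))

k = 10.0  # runtime model-matrix scale applied by the application
a = sse(5.0 * k, 100.0, 1080, math.radians(60))  # scale the input error
b = k * sse(5.0, 100.0, 1080, math.radians(60))  # scale the resulting SSE
# a and b are equal (up to floating point), so either place works
```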

Fundamentally it's not clear to me how geometric error can be defined as "being in meters" but then also multiplied by an arbitrary scale value, the final result of which should be in meters (or replace meters with some notional metric a application is using). It was consistent and clear before this scaling clause was added to the spec.

For some details, I have to take a step back and wait for others to chime in (I have not been involved in the early versions of the spec). The scaling in general (as it was added in 1.1) sounded reasonable for me, given the issues that you linked to, but I did not have all details on the radar back then.

Before I could think of geometric error as "the distance from the furthest point in this post-transformed tile to the original surface".

This is certainly the meaning that people had in mind, as also suggested in the 3D Tiles Reference Card:

[Image: geometric error illustration from the 3D Tiles Reference Card]

The more formal idea behind that was likely that of a Hausdorff distance, which is a metric in R³, and where the unit of meters makes sense. But the concept of "meters" is not sensibly applicable to other forms of simplification. (You could have a 1x1 meter, flat unit square with 2 triangles, or with 20000 triangles, and they probably shouldn't have the same geometricError value...)


While I would prefer that this line be reverted, if it absolutely can't be done then I think some changes need to be made to the rest of the spec to make it consistent and understandable again.

It cannot be changed "just so" (as you noted, it would be a breaking change). Even if, hypothetically, there was a new spec version, we'd like to avoid a case where someone has to do something like if (version=="1.1") doThis(); else doThat();. But regarding the specific points:

Specifically the "geometric error" value can no longer be referred to as being in meters (or any consistent scale or unit),

It might be possible to clarify that by roughly saying that ~"the value should be in meters after it was scaled". But that's just a first shot (a "patch" for the spec text, trying not to introduce breaking changes). This would have to be elaborated, and it might be that it does not make sense in all cases (non-uniform/shear/negative scaling...).

it should be noted that the global tile transform is applied to the error in the "geometric error" section,

This is in line with the clarification that I suggested above (of changing transform to 'global tile transform'). Whether or not that is possible or would be a ~"normative change" has to be discussed.

and some new method for intuiting about what "geometric error" really means should be formulated.

I'd also like to be more specific here, but given the open discussion points, I could hardly describe it more formally than as "a value that governs the refinement process", which is not enough.

In any case, I'll consider opening an issue with some further thoughts about these aspects.

javagl avatar Jun 12 '25 13:06 javagl

I have to take a step back and wait for others to chime in

Agreed. I'll clarify a few final points.

This factor of 10, of course, needs to be accounted for when calculating the tiles to load

I'm not sure whether this is evident, given the vagueness of the term and the resulting degrees of freedom for the implementors

Agreed that this is up to the implementation. But if an application allows for scaling and does not reasonably account for it in the error calculations at all then users will likely complain 😁

You could have a 1x1 meter, flat unit square with 2 triangles, or with 20000 triangles, and they probably shouldn't have the same geometricError value...

This I understand but I think it's an orthogonal topic (which I'm happy to discuss separately). The point is that geometric error was defined in a consistent scale, whatever that is or should be. For the sake of our discussion it is meters because that's what is currently stated in the spec.

It can not be changed "just so" (as you noted, it would be a breaking change)

As I see it there are three reasonable interpretations of the spec if you have different scales per tile since there are different readings of geometric error throughout the spec now:

  1. No scaling is applied since "geometricError" is explicitly specified to be in meters.
  2. Only local tile transforms are applied to geometric error.
  3. Multiplied "global" tile transforms are applied to geometric error.

No matter what the decided change is this could be breaking for some case.

gkjohnson avatar Jun 12 '25 13:06 gkjohnson

Hi! I'm a maintainer of py3dtiles and giro3d which uses Nasa's 3Dtilesrenderer under the hood. I happened to bump into this issue on py3dtiles exactly today so I'm chiming in :-)

What I can say is that py3dtiles currently produces a "global" geometricError that shouldn't be scaled by the viewer. When we initially implemented that, it was at the time when the spec (still a draft, maybe?) said that geometricError shouldn't be scaled. It seemed logical at the time, partly because it was not self-evident how it should be scaled (something that appears in this issue as well).

I'd prefer it to stay that way, partly because the simpler and easily understandable a spec is, the easier and more correct the implementations are. Scaling a geometricError would always feel weird and it'd make producing tiles more difficult.

For me, geometric error looks like a notion that only makes sense globally in a tileset (it's the relative value of geometric error between parent and child tiles that has the most meaning).

We'd adapt to the final decision taken of course (and there might be things that I've missed that could make a good case for the scaling), but that would indeed be a breaking change for us.

autra avatar Jun 13 '25 18:06 autra

Determining "the best" or even "a good" geometric error certainly is something that involves lots of engineering efforts (and experiments and experience). One can get it right for one data type (say, photogrammetry), but whatever strategy is used there, it may be hard to translate that to others (like point clouds). And eventually, there's the chance that whether an approach works well or not also depends on the viewer/client.

I could probably try to dive into the py3dtiles code, to see how it is generally computed. But when you say

produces a "global" geometricError, that shouldn't be scaled by the viewer.

I'm wondering about two things:

  • Do you generate tile transform matrices that actually do involve scaling? (That's not self-evident. It may very well be that the input geometry is just sliced-and-diced into pieces, and the tiles do not use a transform at all, or only for translation/rotation (geo placement))
  • Do you generate tile transform matrices where different tile transforms within a tileset(!) use different scaling factors? (I could imagine that this is even more rare...)

When the tile transforms do not involve scaling, then the process is obviously not affected by any change here.

When the tile transforms do involve scaling, then it becomes tricky. Depending on which scaling factors they use, and how the clients are interpreting the current spec text (which requires scaling), there already may be an ambiguity and cases that can cause undesired results.

The main ambiguity is whether the clients are using the transform of a single tile, or the 'global tile transform' that involves the transforms of all ancestors. (As discussed earlier: I think that it should be the 'global tile transform', but that remains to be confirmed)


(I initially posted this with different numbers, but they didn't make sense. I tried to fix it; hopefully it makes sense now)

Just trying to get an idea: There could be a tileset structure like the following:

  • root tile with scaling 2 and geometric error 10
    • child tile with scaling 3 and geometric error 5 (+ grandchildren, omitted here)
    • child tile with scaling 100 and geometric error 5 (+ grandchildren, omitted here)

For clients that do not scale, these values make sense: The child geometric error is smaller than that of the parent. At some point, the refinement kicks in, and loads the children. All good.

But when clients do scale, then there are more cases.

1. They are scaling only with the transform of the respective tile:

  • root tile with scaling 2 and geometric error 10, becomes 20
    • child tile with scaling 3, and geometric error 5, becomes 15
    • child tile with scaling 100, and geometric error 5, becomes 500

That's wrong: The geometric error of one child would be larger than that of the parent, which would usually break the refinement process.

2. They are scaling with the 'global tile transform':

(Using different values, trying to illustrate the point:)

  • root tile with scaling 2 and geometric error 1000, becomes 2000
    • child tile with scaling 3, and geometric error 5, becomes 30
    • child tile with scaling 100, and geometric error 5, becomes 1000

Even though the root has a larger geometric error than the children, the fact that geometric errors of these children are so vastly different could still cause unexpected behavior: The one that has the geometric error of 1000 will be refined much earlier (into the grandchildren) than that with the geometric error of 30.


So... no conclusion for now 😬 I'm just trying to think through the options, in the hope of figuring out what might be the best option for going forward here.

javagl avatar Jun 13 '25 18:06 javagl

Do you generate tile transform matrices that actually do involve scaling? (That's not self-evident. It may very well be that the input geometry is just sliced-and-diced into pieces, and the tiles do not use a transform at all, or only for translation/rotation (geo placement))

yes.

Do you generate tile transform matrices where different tile transforms within a tileset(!) use different scaling factors? (I could imagine that this is even more rare...)

No, the scale (and actually any transform) is only on the root tile. I think we do that to gain precision on point clouds that have a large geometric extent.

And I confirm we do currently generate geometricError assuming client won't apply any scaling to it. In other words, py3dtiles always generates geometricError in global units.

Again, I think it's the most practical, because it's the easiest and fastest to compute client side.

To further give context, the viewer I use (giro3d), actually uses @gkjohnson's project under the hood, so won't apply any scaling either.

autra avatar Jul 17 '25 13:07 autra

(This is still on my TODO list, and (despite being pushed down) not yet forgotten)

As detailed above, the intention was to apply the scaling factor to the geometric error. The fact that this should already have been the case in 3D Tiles 1.0 (but wasn't called out explicitly) makes it a bit more difficult.

The current intention is, roughly

  • Add a note to the change log, pointing this out. This will roughly say that

    1. the geometric error should be scaled with the global tile transform, and
    2. that this should also have been done like that in 3D Tiles 1.0.

    Yeah, the latter is not ideal, and only an attempt to mitigate the potential for inconsistencies...

  • Add an implementation note in the specification of 1.1, saying that

    1. the unit of the geometric error should be 'meters, after the scaling was applied'
    2. the scaling factor should be the absolute value of the largest scaling factor

(The last point is to be confirmed with @lilleyse , but ... not using the absolute would allow negative geometric errors, one way or another, and we certainly don't want to do that...)
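A small observation on the "absolute value" point (just a sketch; how implementations actually extract the scale factors varies): if the factors are read as column lengths of the upper-left 3x3, they are inherently non-negative, so a mirroring scale of (-2,-2,-2) already yields 2. Only implementations that read signed diagonal entries (or similar) need the explicit absolute value:

```python
import math

# The upper-left 3x3 of a mirroring transform, as three column vectors.
mirror = [(-2.0, 0.0, 0.0), (0.0, -2.0, 0.0), (0.0, 0.0, -2.0)]

column_norms = [math.sqrt(sum(c * c for c in col)) for col in mirror]
largest_by_norm = max(column_norms)                        # 2.0, never negative
largest_by_diagonal = max(mirror[i][i] for i in range(3))  # -2.0, needs abs()
```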

@kring From quickly skimming over the cesium-native code, it looks like this already does scale the geometric error. The case of negative scaling factors may have to be taken into account there...

javagl avatar Jul 17 '25 14:07 javagl

I'm still in a slight disagreement with this choice. I'd still prefer to not scale, and I still think the earlier version of the spec should have been honored. Also, not scaling would avoid any issue with negative geometricError. Why do you think it's better to scale it @javagl?

That being said, the most important for me is to have a clear interpretation of the spec, whatever the final decision is 👍🏻

autra avatar Jul 17 '25 14:07 autra

I'm confused - why do we not seem to be considering removing the line from the specification? It hasn't been able to be stated why the addition is beneficial and, in fact, we've only discussed in this thread why its addition has made the spec more unclear and difficult to interpret. There is nothing that this line has enabled that couldn't be done previously. It's hard not to see this as Cesium's implementation decisions leaking into the spec definition.

To take a step back - the reality is that this spec was used for 6 or 7 years as a 0.0 and 1.0 version without this line and never caused an issue because the spec was internally consistent. The line was then added as a breaking change in a non-major release without any proper notification or justification. And, again, it only serves to make the spec more difficult to interpret. Reading through the above issues again, there is no apparent rationale behind the change. The rest of the specification was clearly written without it in mind.

To reiterate: for any unfamiliar reader of the spec the definition of "geometricError" cannot be clear right now. "GeometricError" is defined as being in meters at least three times throughout the spec, which, again, is fundamentally incompatible with the notion that the value is scaled by the transform matrices. The fact is that the spec is "broken" in its current state and it can be "unbroken" either by changing the definition of "geometric error" throughout the document or by removing this line. I clearly prefer the latter since it's more intuitive and doesn't require changing the definition of "geometric error", which has been consistent for 10 years (!!). The solutions proposed here feel like they're bending over backwards to include a line that isn't needed. I just ask that someone please justify why the line deserves to be in the document and what benefit it's bringing.

gkjohnson avatar Jul 18 '25 08:07 gkjohnson

Why do you think it's better to scale it @javagl?

(I don't think that - details below)

It hasn't been able to be stated why the addition is beneficial

I understand the confusion (and ... let's call it 'frustration'). The only "justification" for that change that I found was in the issue where this initially came up (linked above, https://github.com/CesiumGS/cesium/issues/5330 ). The description is a bit vague, saying that "bad things happen", and guesses about the reasons for that. That's from before 3D Tiles 1.0, and long before I was involved in 3D Tiles. It only came up again during the finalization of 3D Tiles 1.1 (with what I think to be one of my first comments asking about some clarification). But at this point, the discussion of whether the scaling should be applied at all already appeared to be settled, with the answer being "Yes, it should be applied".


A broader interlude:

Regardless of whether the scaling is applied or not, the geometricError is one of my pet peeves. I already pointed out some of the issues with this concept in my previous comments here.

Or to that specific point:

To reiterate: for any unfamiliar reader of the spec the definition of "geometricError" cannot be clear right now.

I think it may be even less clear for readers who are more familiar with some of the concepts. It is exactly those users who will raise the questions that are really hard to answer: "Hey, what is the 'error in meters' for a simplified point cloud, or for a simplified texture?", or the questions that really go into the details, namely exactly whether that error should be affected by scaling factors, and if so: how(!).

The tl;dr of my concerns could be: The geometricError claims to represent a simple concept for LOD selection, but reality is not simple. And I should probably open an issue about that, just to lay out all the other reasons for why it is not simple. (The question about the scaling is only a tiny detail of all that...)


Back to the specific point of "scaling it" vs "not scaling it":

I cannot say what is "better", because there is no univocal definition of this term. I can only try to think through certain application cases, identify pros and cons, and then consider the trade-offs. In many cases, these trade-offs will be between 1. simplicity and 2. genericity, and between implementation efforts for 1. producers and 2. consumers.

One aspect that has been mentioned here is the implementation complexity for consumers, and that's a valid and important point: Extracting the scale factor of a tile is more effort than just not doing this (and often, simpler is better). For the producers, I could imagine that having the option to apply a scaling factor to 'everything' by just setting the root transform simplifies things a lot. Roughly: Imagine you receive some CAD data where the geometry is stored in millimeters. You have a sphere, approximate that with a cube, and compute the geometric error as the distance between the cube corner and the sphere. Fine. Eventually, you apply a root tile transform with a uniform scaling of 0.001, to convert the geometry and the geometric error into "meters". Of course, the producer could "bake" all unit conversions into the data. As I said: It's a trade-off for the effort on both sides.
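Putting numbers to that CAD example (the geometry values are made up): a sphere of radius 1000 mm enclosed in its bounding cube has a corner-to-sphere distance of 1000·(√3 − 1) ≈ 732 mm, and a root transform with uniform scale 0.001 then converts both the geometry and that error into meters:

```python
import math

radius_mm = 1000.0
# Distance from a corner of the circumscribing cube (half-edge = radius)
# to the sphere surface: r * sqrt(3) - r.
error_mm = radius_mm * (math.sqrt(3.0) - 1.0)  # roughly 732 mm
root_scale = 0.001                              # millimeters -> meters
error_m = error_mm * root_scale                 # roughly 0.732 m
```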

In terms of genericity vs. simplicity: Some of the comments above already tried to go through some "What if...?" scenarios. These roughly aimed at "How versatile is a certain concept, what can be modeled with this concept, exactly, and what are possible limitations?". (The term "limitations" has a negative connotation. It should not. A specification is just a collection of limitations, laying out what can and what can not happen.) For example: the issue at https://github.com/CesiumGS/cesium/issues/5330 also talked about runtime modifications of the tileset, which raises a whole new set of questions. Given that 3D Tiles is always supposed to be in meters, runtime modifications of the transform should usually not involve scaling. When scaling is applied, the question is whether this is done by setting the transform matrix of the root tile of the (in-memory, runtime) representation of the tileset, or whether it is something that is "independent" of the tileset as it is defined in the JSON - namely, something like the modelMatrix in CesiumJS. (Whether or not this should affect the geometricError is something that isn't made clear in the specification either.)


The bottom line:

I just ask that someone please justify why the line deserves to be in the document and what benefit it brings.

I don't see a compelling reason to require the scaling. As I said: Not applying the scaling would shift more responsibility for "picking the right value" to the producer, but that could be OK.

But in any case, it would be difficult to introduce a change in the 1.1 spec that would be 'breaking', even if it aimed at mitigating the problems that are rooted in an undocumented (and maybe even "unnecessary") breaking change between 1.0 and 1.1.

(Ping @lilleyse about further thoughts here)

javagl avatar Jul 18 '25 14:07 javagl

Thanks for bringing this up @shawn0326, and @gkjohnson @autra @javagl for the discussion. I don't have much to add that hasn't already been said.

If we reverted https://github.com/CesiumGS/3d-tiles/pull/571 (which increasingly seems like the right thing to do), we would need to make these changes on the Cesium side:

  • Update the 3D Tiles Location Editor in Cesium ion. When scaling is applied, it would need to be baked into each tile's geometric error in addition to the root tile transform. This would involve updating external tilesets.
  • Update CesiumJS and Cesium Native to not scale geometric error by the global tile transform, only by the modelMatrix
  • Update Cesium tilers when --scale option is used

Unfortunately, this breaks assets that depend on the old behavior. But maybe this isn't super common.

lilleyse avatar Jul 18 '25 16:07 lilleyse

Unfortunately, this breaks assets that depend on the old behavior. But maybe this isn't super common.

It's hard to estimate how common it is that tilers generate data in a way that explicitly and deliberately only works (or "makes sense") when the scaling is taken into account. But the discussion here reminded me of a related point: the specification says that ...

Generally, the root tile will have the largest geometric error, and each successive level of children will have a smaller geometric error than its parent, with leaf tiles having a geometric error of or close to 0.

This does not say explicitly whether it refers to the "error after scaling", but that's the only thing that would make sense here.

More generally: we'll have to check all appearances of 'geometric error' in the specs, and probably clarify that it usually refers to the "error after scaling".

The reason for pointing that out: the 3d-tiles-validator currently does not take the scaling into account. Most of the time, this does not matter: the validator only checks whether the geometric error is non-negative. But there is one place where it does matter, namely exactly the part quoted above. This was originally even counted as an error, and even though it was later changed to be a warning, it will still count this as an issue:

https://github.com/CesiumGS/3d-tiles-validator/blob/f2af4f07889bac90167c64686098a24db55a8a35/src/validation/TilesetTraversingValidator.ts#L429

(So depending on the decisions here, this might have to be adjusted, to use the scaled geometric error)
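If the decision ends up being that the scaled error is what the quoted statement refers to, the adjusted validator check could compare values after applying each tile's accumulated transform scale, roughly like this sketch (the tile shape and helper names are assumptions for illustration, not the validator's actual API):

```javascript
// Sketch of the parent/child geometric-error monotonicity check, comparing
// errors after applying the accumulated transform scale of each tile.

function maxScaleFactor(matrix) {
  // Largest column-vector length of the upper-left 3x3 of a column-major 4x4,
  // matching the "largest scaling factor from the matrix" wording.
  const len = (x, y, z) => Math.hypot(x, y, z);
  return Math.max(
    len(matrix[0], matrix[1], matrix[2]),
    len(matrix[4], matrix[5], matrix[6]),
    len(matrix[8], matrix[9], matrix[10])
  );
}

function checkMonotonicity(parent, child, parentAccumulatedScale) {
  // Each tile.transform (if present) multiplies the accumulated scale.
  const childScale =
    parentAccumulatedScale * (child.transform ? maxScaleFactor(child.transform) : 1);
  const parentGe = parent.geometricError * parentAccumulatedScale;
  const childGe = child.geometricError * childScale;
  return childGe <= parentGe; // false would trigger the validator warning
}
```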

javagl avatar Jul 19 '25 09:07 javagl

@javagl

I think it may be even less clear for readers who are more familiar with some concepts.

I understand there are concerns about the definition of "geometric error", and it's been mentioned earlier in the thread, but as you suggest this is a separate discussion and one I'm happy to participate in with a separate issue.

The description is a bit vague, saying that "bad things happen", and guesses about the reasons for that. ... The issue at CesiumGS/cesium#5330 also talked about runtime modifications of the tileset, which raises a new bunch of questions. Given that 3D Tiles is always supposed to be in meters, runtime modifications of the transform should usually not involve scaling.

The referenced issue only mentions scaling the root, for which the behavior should be very clear. If root scaling isn't accounted for, it could result in the issues described (content scaled large but disappearing when it should be visible, or displaying at a lower resolution). The glTF spec also specifies that all world-space distances are in meters (ref 1, ref 2), but no one is confused about what should happen when you scale a glTF model at runtime. When a glTF model is scaled by 2, or by a factor of 100 because the loading application requires 1 unit = 1 cm, then the lighting calculations need to account for this scale (e.g. embedded light intensity needs to be adjusted). This is understood behavior if you want to correctly replicate the intended look of the model at a different scale. Considering this a point of issue seems to be reaching a bit. If a 3D Tiles set is loaded into an application operating in something like centimeters, or is otherwise scaled, then of course this needs to be accounted for in tile traversal.

But in any case, it would be difficult to introduce a change in the 1.1. spec that would be 'breaking', even if this aimed at mitigating the problems that are rooted in an undocumented (and maybe even "unnecessary") breaking change between 1.0 and 1.1.

There is no avoiding a breaking change at this point. Either the definition and wording around the value in "geometricError" is changed throughout the spec so it is no longer in "meters" or this line is removed. Either way some portion of the spec after one of these changes will be incompatible with the current version regardless of which way you were interpreting it previously.

This does not say explicitly whether it refers to the "error after scaling", but that's the only thing that would make sense here.

This line was written without scaling in mind, when the specification explicitly stated that geometric error is not scaled by the transform. So yes: this line refers to the explicit value in the "geometricError" field, which was always in world units when this line was written.

@lilleyse

Unfortunately, this breaks assets that depend on the old behavior. But maybe this isn't super common.

Relating to the previous point on breaking changes - it's unfortunate but inevitable that some assets break no matter which change is made. A couple of ways I can think of to mitigate this: runtimes could check the version of the tile set (thank you for including a version field in the format 🎉) and choose whatever behavior the runtime feels is most appropriate for its user base if the version falls within the range of this ambiguous specification. And / or a per-tileset flag could be added to allow a user to force one behavior or the other if they're aware of the issue. It's not perfect, but being able to rely on the version number is at least a step better than having to try to infer behavior from arbitrary fields, as is done with TMS.
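As a sketch of that mitigation (the option name and the chosen default are hypothetical, not an agreed-upon behavior):

```javascript
// Sketch: pick a default scaling behavior per spec version, with a
// per-tileset user override. shouldScaleGeometricError is an invented name.

function shouldScaleGeometricError(tilesetJson, userOverride) {
  if (userOverride !== undefined) return userOverride;
  // Versions in the ambiguous range default to whatever the runtime
  // considers most appropriate; here, arbitrarily: scaling on for 1.1 only.
  return tilesetJson.asset?.version === "1.1";
}
```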

It's too late for existing tile sets, but it might be smart to add an optional "tileset.asset.generator" field to the tile set schema like glTF does (and possibly an explicit generator version number) so these kinds of generator-specific asset issues can be more easily addressed in the future if need be. E.g. if we knew an asset was generated with "Cesium Tool v2.3", and the tool used local-space geometric error before version 3.0, then this inconsistency between assets would be less of an issue to deal with, though still undesirable.

gkjohnson avatar Jul 19 '25 23:07 gkjohnson

The referenced issue only mentions scaling the root, for which the behavior should be very clear.

It specifically mentions the root, yes. And I don't want to make the discussion harder than it already is. But I think that it is crucially important to be absolutely clear about the vocabulary, to avoid further quirks or inconsistencies like the one that we're currently dealing with. In an earlier comment, you said

To be clear when I say "root" I'm referring to the root transform in a target client, not the root as embedded in the 3d tiles json format.

But from the context, I'm about 99% sure that the "root" in the linked issue referred to the actual root tile that is stored in the tileset JSON!

So just to confirm: Based on that quoted comment, I assume that when you say

If root scaling isn't accounted for then it could result in the issues described

then you do not refer to the root from the tileset JSON, but to "the thing above that, in the client application" (e.g. the modelMatrix in CesiumJS), is that correct?

Assuming that this is all correct: Yes, of course, definitely! When the client does something with the loaded tileset (like scaling it with the modelMatrix), then the client is responsible for doing this in a way that does not "break" its own expectations or assumptions.

Conversely, when an application allows editing the actual contents of a tileset JSON file (like the "Tileset Location Editor" does), then things become tricky. Users can edit the root.transform with that. But in order to apply this modification consistently, this application would, in theory, also have to adjust the geometricError value of each tile in the tileset JSON accordingly.
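For the uniform-scale case, that "baking" could be as simple as the following sketch (plain tileset JSON structure assumed; external tilesets and non-uniform scales are deliberately left out, since those are exactly the open questions):

```javascript
// Sketch of "baking" a uniform root scale into every tile's geometricError,
// as an editor would have to do if the spec no longer scales the error
// implicitly. Iterative traversal over the plain tileset JSON structure.

function bakeScaleIntoGeometricError(tileset, scale) {
  tileset.geometricError *= scale;
  const stack = [tileset.root];
  while (stack.length > 0) {
    const tile = stack.pop();
    tile.geometricError *= scale;
    // External tilesets (content pointing at another tileset.json) would
    // need the same treatment, which is the open question raised above.
    for (const child of tile.children ?? []) stack.push(child);
  }
}
```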

There is no avoiding a breaking change at this point.

I agree. The goal is now to iron out the kinks in a way that does not have too severe undesirable side effects, and as precisely and clearly as possible, to solve this once and for all.


EDIT:

It's too late for existing tile sets but it might be smart to add an optional "tileset.asset.generator" field to the tile set schema

I agree, and already suggested that. I think it would be undesirable to use something like this for any form of ~"generator-version specific runtime behavior". But I think that it is important to have this field for many reasons.

javagl avatar Jul 20 '25 10:07 javagl

But from the context, I'm about 99% sure that the "root" in the linked issue referred to the actual root tile that is stored in the tileset JSON!

This is a Cesium-specific issue and an implementation detail. I also don't agree that the runtime nodes should be considered the "same node" once loaded just because the in-file hierarchy maps well to the runtime model. I've made it clear throughout the thread that an application should load a file and display it as intended in the file, and any runtime modifications done to the content need to be properly accounted for. If an application allows for overwriting the scale value defined by the loaded file (e.g. setting the initial scale from the in-file defined 2.0 to 1.0) and nothing is done to account for that, then it's a bug in the application if the intent is to accurately reflect the contents of the file. The 3DTilesRendererJS project chooses to provide a parent node above the root for the user to transform, but providing a node derived from the root in the file with the same transform is just as valid, as long as it's handled correctly.

Conversely, when an application allows editing the actual contents of a tileset JSON file (like the "Tileset Location Editor" does), then things become tricky

Please be specific. Hand-wavy phrases like "things become tricky" imply that the problem isn't understood, and they stall the discussion. This is something Cesium needs to figure out how to handle themselves, but it's not tricky - these are all the same operations we're expecting to perform at runtime anyway, so of course they can be run offline, where the results can be even more accurate. In fact barely anything changes outside the uniform scale case.

Yes, if a uniform scale is applied to the root, then all geometric error values in the file need to be scaled accordingly, assuming geometric error is in world space - this had to happen at runtime anyway under the "scaled by transform" model (or scaled to whatever frame the operations are happening in). But if a non-uniform scale is applied, then in both cases all the geometric error values need to be calculated from scratch, since that scale can change the value of geometric error in world space regardless of how it's calculated (max, average, etc.). Alternatively you can naively scale the geometric error in the file by the largest dimension, as it's been suggested should happen in the runtime applications, though Cesium's wording for the sse option says "maximum" screen space error, so the most correct thing to do is to recalculate it from scratch either way when scales are adjusted in a non-uniform way.
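A tiny worked example of why the naive "largest factor" scaling is only a conservative bound under a non-uniform scale (the error vector here is invented for illustration):

```javascript
// Contrast the naive "scale the stored scalar error by the largest factor"
// approach with recomputing the error under a non-uniform scale. The error
// vector is a made-up example of the displacement simplification introduced.

function scaleNonUniform(v, sx, sy, sz) {
  return [v[0] * sx, v[1] * sy, v[2] * sz];
}

const errorVec = [0, 3, 4];           // original error magnitude: 5
const [sx, sy, sz] = [10, 1, 1];      // non-uniform scale, largest factor 10

// Naive: multiply the stored scalar error by the largest factor.
const naive = Math.hypot(...errorVec) * Math.max(sx, sy, sz); // 50

// Recomputed: the actual displacement after applying the scale, which here
// lies entirely along the unscaled axes, so the error is unchanged.
const actual = Math.hypot(...scaleNonUniform(errorVec, sx, sy, sz)); // 5
```

The naive value never underestimates, so it is safe as a bound, but it can be far from the true maximum error.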

I feel some of these questions have devolved into minor, in-the-weeds topics, and that it's fallen on me to answer a lot of them to keep this moving. Some of the issues raised feel self-evident if thought through a bit, and I'd appreciate it if further issues raised come at least with an attempt to answer them. At this point I'd like to understand what the process is for getting the specification fixed, because I'm concerned that this won't go anywhere otherwise.


I think it would be undesirable to use something like this for any form of ~"generator-version specific runtime behavior"

It's of course undesirable, but that doesn't make it unnecessary, as evidenced by this issue. Cesium is already trying to do this with TMS with fewer guarantees. I'm not sure why it hasn't been added already, but I've made a PR to get it included.

gkjohnson avatar Jul 26 '25 00:07 gkjohnson

I'm not sure how to "be specific" without "devolving into minor and in-the-weeds topics". I could write a whole list of points that have to be considered here, where the problem space is certainly not well understood, and where it's difficult to find a solution that works for all cases (that's what I summarized as "tricky"). If in doubt, we could discuss this via mail. I just thought it might be worth mentioning that the spec change here seems to be the result of an issue that was related to the Tileset Location Editor, even though there are not many further details about the reasoning behind the change.

But as you said: This is an issue for Cesium, and Cesium has to sort this out. None of that is relevant for other renderers.

The possible spec change is highly relevant, though. And not only for renderers, but also for producers of tilesets.

The steps for updating the spec are, roughly:

  • Revert the change from https://github.com/CesiumGS/3d-tiles/pull/571 . This involves carefully checking whether any other place of the spec has afterwards been updated (or newly written) and where the assumption that the geometric error is scaled comes into play. We have to avoid any other inconsistencies or contradicting statements here.
  • Updates: (some from the comment above)
    • Update the three.js based renderer, CesiumJS and cesium-native to not scale the geometric error
      • This may, in the most shallow form, literally take a few seconds, by commenting out the line // ge *= scale;. Each implementation can think about further precautions. It's probably a bit of a stretch, but at least in a side-track, I'd think through the viability of something like if (parentGe > childGe && scaledParentGe < scaledChildGe) log("May be wrong...");...
    • Update all known tileset generators in cases where they explicitly assign the GE with the assumption that it will be scaled...
      • This should be rare, but it may be hard to identify these places
    • Update the 3D Tiles Location Editor
    • This is a challenging one. Right now, it can just set the root.transform directly. In the future, this will not be possible. When the "scaling" slider is dragged from 1.0 to 2.0, it will have to traverse the whole tileset(!), on every change event(!), and adjust the geometric errors (and whether or not these geometric errors are then properly picked up by CesiumJS is anything but clear - CesiumJS is not designed for such "runtime modifications"...). In any case, it will not be possible to do this "in real-time"/"interactively". How the change will/should affect external tilesets is not clear either. When updating the 'owning' tileset, ion can write out the new tileset.json. When the part that belongs to an external.json is modified, it is not clear whether this change should (or even can) be written into that file.
  • Somehow communicate this change, prominently.
    • This involves some ALL CAPS statements in the changelog of the spec. But it should also be prominently pointed out in the changelog of CesiumJS and the tilers. There is no mechanism of "broadcasting" this change to existing producers or consumers of tilesets, and inform them that they have to update their code. Such a mechanism is only implicit, via the last point:
  • Consider a new 'hotfix' release of the spec, maybe "3D Tiles 1.1.1". This could either be only on GitHub, or even on OGC. (The latter would trigger a whole new chain of formal procedures and events...)
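The "May be wrong..." precaution mentioned in the update steps above could be sketched like this (names invented; each implementation would decide where such a check fits and what to do when it fires):

```javascript
// Sketch: flag tiles whose parent/child geometric-error ordering flips once
// the transform scale is applied, since such data likely assumed the
// "scaled" behavior and may refine incorrectly without it.

function warnIfOrderingFlips(parentGe, childGe, parentScale, childScale) {
  const flips = parentGe > childGe && parentGe * parentScale < childGe * childScale;
  if (flips) {
    console.warn("Geometric error ordering flips under scaling; data may assume scaled errors.");
  }
  return flips;
}
```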

@lilleyse Not sure if I missed something here.

javagl avatar Jul 27 '25 10:07 javagl

@gkjohnson this fix will most likely be included in 3D Tiles 2.0, but no timeline on that yet.

lilleyse avatar Jul 28 '25 13:07 lilleyse