
Attribute-aware error metrics for simplification

Open fstrugar opened this issue 5 years ago • 14 comments

Hi! I'm playing with the https://developer.nvidia.com/orca/amazon-lumberyard-bistro dataset and meshoptimizer, and I've noticed this particular failure case related to the way the corners were authored.

For example, here is the original chair mesh (screenshot omitted).

Notice the rounded corners with shared vertices. They survive the first pass of meshopt_simplify, down to half the number of triangles, just fine (screenshot omitted).

However, once the triangles between two sides meeting at 90 degrees get folded away and the sides start sharing vertices, the vertex normals can no longer be correct (screenshot omitted).

What would be a solution to (automatically) preventing this?

I was thinking of adding custom skip code in the 'pickEdgeCollapses' loop when the angle between vertex normals is above a certain threshold, but I'm sure there's a better/simpler solution, perhaps already there? :)

(Instead of preventing the collapse, we could also allow it but duplicate the vertices so the normals aren't shared?)
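For reference, the kind of skip check described above could look roughly like this. `collapseAllowedByNormals` is a hypothetical helper, not part of meshoptimizer, and the threshold would have to be tuned by the caller:

```cpp
#include <cassert>
#include <cmath>

// Hypothetical pre-collapse check: reject an edge collapse when the vertex
// normals at its endpoints diverge by more than a threshold angle.
// Both normals are assumed to be unit length.
bool collapseAllowedByNormals(const float na[3], const float nb[3], float maxAngleDeg)
{
    float dot = na[0] * nb[0] + na[1] * nb[1] + na[2] * nb[2];

    // clamp to guard acos against rounding slightly outside [-1, 1]
    if (dot > 1.f) dot = 1.f;
    if (dot < -1.f) dot = -1.f;

    float angleDeg = std::acos(dot) * 180.f / 3.14159265358979f;
    return angleDeg <= maxAngleDeg;
}
```

A hard cutoff like this tends to be all-or-nothing, which is part of why the discussion below moves toward folding the normal delta into the error metric instead.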

Thanks for the great library!!

fstrugar avatar Jun 23 '20 14:06 fstrugar

Yeah, so there are a few ways to fix this.

One is to discard and recompute normals post-simplification, possibly splitting vertices when the crease angle is too sharp. This works around the problem in a way, but of course it's not very convenient.
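The recompute-after-simplification workaround can be sketched without any library support. This is a minimal area-weighted version; the crease-angle vertex splitting mentioned above would be an additional pass on top of it:

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Sketch: recompute smooth vertex normals from the simplified triangle list
// by accumulating face normals per vertex, then normalizing. The cross
// product's length is proportional to triangle area, which gives the
// area weighting for free.
void recomputeNormals(const std::vector<float>& positions,      // xyz per vertex
                      const std::vector<unsigned int>& indices, // triangle list
                      std::vector<float>& normals)              // xyz per vertex, output
{
    normals.assign(positions.size(), 0.f);

    for (size_t i = 0; i + 2 < indices.size(); i += 3)
    {
        const float* a = &positions[indices[i + 0] * 3];
        const float* b = &positions[indices[i + 1] * 3];
        const float* c = &positions[indices[i + 2] * 3];

        float e1[3] = {b[0] - a[0], b[1] - a[1], b[2] - a[2]};
        float e2[3] = {c[0] - a[0], c[1] - a[1], c[2] - a[2]};

        float n[3] = {e1[1] * e2[2] - e1[2] * e2[1],
                      e1[2] * e2[0] - e1[0] * e2[2],
                      e1[0] * e2[1] - e1[1] * e2[0]};

        for (int k = 0; k < 3; ++k)
            for (int j = 0; j < 3; ++j)
                normals[indices[i + k] * 3 + j] += n[j];
    }

    for (size_t v = 0; v + 2 < normals.size(); v += 3)
    {
        float len = std::sqrt(normals[v] * normals[v] + normals[v + 1] * normals[v + 1] + normals[v + 2] * normals[v + 2]);
        if (len > 0.f)
            for (int j = 0; j < 3; ++j)
                normals[v + j] /= len;
    }
}
```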

Another one is to factor the normal delta into the simplification as an extra error. This can be done by comparing the normals alongside the edge that's considered for collapse, or by introducing the normal into the quadric weight. It's on my list to experiment more with this; there's a simplify-attr branch in this repository from my last attempt, but while working on it at the time it became clear that this isn't very simple, so I decided to take a break and think about it more.
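As a rough illustration of "normal delta as extra error", a per-collapse cost could combine the two terms with a caller-tuned weight. This is a toy formulation, not the quadric-based metric in the simplify-attr branch; the weight tuning difficulty is exactly what the rest of this thread discusses:

```cpp
#include <cassert>

// Hypothetical combined metric: extend the geometric (quadric) error with a
// penalty proportional to how much the collapse changes the unit normal.
// normalWeight is an assumed caller-tuned constant.
float combinedCollapseError(float geometricError, const float nFrom[3], const float nTo[3], float normalWeight)
{
    float dot = nFrom[0] * nTo[0] + nFrom[1] * nTo[1] + nFrom[2] * nTo[2];
    float normalDelta = 1.f - dot; // 0 when aligned, 2 when opposite
    return geometricError + normalWeight * normalDelta;
}
```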

This isn't implemented right now, though. It's definitely worth addressing, but I'm not sure what the best solution is: ideally it's not just the normals that need to be taken into account, and balancing ease of use, performance, and quality here is tricky...

zeux avatar Jun 25 '20 03:06 zeux

I also encountered this problem with the mesh (screenshot omitted) in https://github.com/zeux/meshoptimizer/issues/206#issuecomment-748478183

fire avatar Dec 24 '20 20:12 fire

@zeux Would you be able to look at this?

Thanks for your amazing work on meshoptimizer.

fire avatar Dec 29 '20 18:12 fire

I believe this part of the comment above accurately reflects the plan here:

Another one is to factor the normal delta into the simplification as an extra error. This can be done by comparing the normals alongside the edge that's considered for collapse, or by introducing the normal into the quadric weight. It's on my list to experiment more with this; there's a simplify-attr branch in this repository from my last attempt, but while working on it at the time it became clear that this isn't very simple, so I decided to take a break and think about it more.

Since this issue is still open, you can assume I'm going to look into this at some point in the future; when exactly that will be I can't say, as this requires some further research on how to best integrate the attribute metrics with the geometry metrics in a way that is reasonably easy to tune once, instead of having to tweak weights per model.

zeux avatar Dec 29 '20 21:12 zeux

Factor the normal-delta into the simplification as an extra error by comparing the normals alongside the edge considered for collapse.

The metric can be done by introducing normal into the quadric weight.

Since comparing attributes is not simple, would there be any other approaches?

I wanted to look into this, but I'm a bit lost.

Edited:

I tried using your attribute branch and didn't see any major problems.

https://github.com/fire/meshoptimizer/tree/simplify-normal-attribute

fire avatar Apr 08 '21 22:04 fire

I tried using your attribute branch and didn't see any major problems.

Yeah, it needs more work to be production-ready with respect to the metric; I think the branch predates some geometric improvements, and it also needs some interface and optimization work. FWIW I plan to resume this in the next few weeks.

zeux avatar Apr 09 '21 00:04 zeux

Is there a better way to define normals being close enough? It seems to block optimization of any curved surface. Only flat planes get optimized.

Not sure how to allow the first pass of decimation in the chair example and then block the ones that fail.

I wish there was a way to optimize the indices with the normals on the second try.

My thought is to use quad remeshing or isotropic remeshing; that would be a lot of work, but it gives the optimizer more room to work.

Notes:

  • https://github.com/avaxman/Directional
  • https://github.com/gradientspace/geometry3cpp

fire avatar Apr 09 '21 15:04 fire

It seems to block optimizations of any curved surface.

That's because the metric needs work, I believe; the code in that branch right now is very challenging to tune properly, which is part of why this hasn't been integrated yet. I'm not aware of existing research that's more promising than the general approach used there, but since that code isn't production-ready it can have all sorts of issues, and it likely requires taking a path that hasn't been explicitly documented in academia (at least that was the case for the geometric error, where the approach meshoptimizer uses is inspired by prior research but doesn't follow it precisely).

Remeshing is orthogonal to simplification - it can definitely make topology-aware simplification easier, but it doesn't solve the problem by itself; you still need attribute awareness within the simplifier to avoid the kind of attribute distortion shown in this thread.

zeux avatar Apr 09 '21 16:04 zeux

I'll do some literature searches for vertex normal merge, collapse, and flip metrics.

If you have any keywords I can search that'll help too.

Edited:

I'll list some promising papers:

https://dl.acm.org/doi/pdf/10.1145/2425836.2425911

Edited:

I'm going to use a 6-element truncated 3x3 orientation matrix to store the normal. This uses 6 attributes. It seems to work OK.
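One possible reading of the 6-element encoding (this is an assumption about the intent, not the code in the linked branch): build an orthonormal basis whose third axis is the normal, store the other two axes as 6 floats, and recover the normal as their cross product:

```cpp
#include <cassert>
#include <cmath>

// Illustrative sketch of a "truncated 3x3 orientation matrix" encoding.
// Input normal is assumed to be unit length.
void encodeNormalAsBasis(const float n[3], float out6[6])
{
    // pick a helper axis that is not parallel to n
    float h[3] = {1.f, 0.f, 0.f};
    if (std::fabs(n[0]) > 0.9f) { h[0] = 0.f; h[1] = 1.f; }

    // tangent = normalize(cross(h, n))
    float t[3] = {h[1] * n[2] - h[2] * n[1], h[2] * n[0] - h[0] * n[2], h[0] * n[1] - h[1] * n[0]};
    float tl = std::sqrt(t[0] * t[0] + t[1] * t[1] + t[2] * t[2]);
    for (int i = 0; i < 3; ++i) t[i] /= tl;

    // bitangent = cross(n, t); n and t are unit and orthogonal, so b is unit too
    float b[3] = {n[1] * t[2] - n[2] * t[1], n[2] * t[0] - n[0] * t[2], n[0] * t[1] - n[1] * t[0]};

    for (int i = 0; i < 3; ++i) { out6[i] = t[i]; out6[3 + i] = b[i]; }
}

// recover normal = cross(tangent, bitangent)
void decodeNormalFromBasis(const float in6[6], float n[3])
{
    const float* t = in6;
    const float* b = in6 + 3;
    n[0] = t[1] * b[2] - t[2] * b[1];
    n[1] = t[2] * b[0] - t[0] * b[2];
    n[2] = t[0] * b[1] - t[1] * b[0];
}
```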

fire avatar Apr 09 '21 17:04 fire

https://github.com/godotengine/godot/pull/47764

@zeux

Can you take a moment to see if this is legitimate? The Godot Engine contributors had concerns about applying patches on top of meshoptimizer that aren't merged upstream.

I wanted some motion on this topic.

Thanks!

fire avatar May 22 '21 15:05 fire

Considering the title of this issue:

I have voxel meshes which can contain encoded texture splatting parameters in extra attributes (repurposing color and UV) in additional vertex arrays.

Problem: removing vertices in that scenario directly reduces quality even if geometry is preserved. Simplification only seems to care about vertex positions, which means there should be more information to give meshoptimizer, or some way to customize the comparison between vertices. Tangent problem: my meshes use multiple streams (structure of arrays), but the current API seems to only take one.

I'm wondering if simplification is actually suited to that situation; otherwise it doesn't sound actually... simple (the kind of data I'm storing is packed sets of indices and weights).

Does this match the current issue or should I open another?

Zylann avatar Jul 04 '21 15:07 Zylann

Yes that’s the same problem as highlighted in this issue. Attribute aware simplification will be exposed as a separate function with separate attribute stream inputs.
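For the structure-of-arrays case above, the separate streams would need to be gathered into whatever layout the eventual API expects. A generic interleaving sketch (the `Stream` struct and function name are hypothetical, not a meshoptimizer interface):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// One per-vertex attribute stream, e.g. normals (3 components) or UVs (2).
struct Stream
{
    const float* data; // components floats per vertex
    size_t components;
};

// Gather several structure-of-arrays streams into one interleaved buffer,
// laid out per vertex as [stream0..., stream1..., ...].
std::vector<float> interleaveAttributes(const std::vector<Stream>& streams, size_t vertex_count)
{
    size_t total = 0;
    for (const Stream& s : streams)
        total += s.components;

    std::vector<float> result(vertex_count * total);

    for (size_t v = 0; v < vertex_count; ++v)
    {
        size_t offset = 0;
        for (const Stream& s : streams)
        {
            for (size_t c = 0; c < s.components; ++c)
                result[v * total + offset + c] = s.data[v * s.components + c];
            offset += s.components;
        }
    }

    return result;
}
```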

zeux avatar Jul 04 '21 16:07 zeux

With risk of stating the obvious, my suggestion would be to:

  1. Have attributes aware simplification as suggested
  2. On top of this, it makes perfect sense to expose user-defined thresholds for each of the control parameters - for example, normal crease angle, minimal distance for position welds, color and UV differentiation, etc.

Adi-Amazon avatar Feb 08 '22 16:02 Adi-Amazon

Copying the comment from #524 on some future work involved here; the issue will stay open as the algorithm improves further:

  • The attribute metric is not perfect - it's functioning correctly and is numerically stable, but it misses certain obvious visual errors. I have some ideas on how to improve this but it requires significant math modeling work.
  • The attribute quadrics are not properly aggregated across discontinuities. This is the case for Godot's fork as well.
  • The resulting error, as well as error limit, include the attribute error. Godot's fork adjusts output error to only track distance, but keeps error limit as is. I might instead add a second output error parameter, we'll see. This also requires tracking both errors, which increases the collapse list structure if done naively.
  • The attribute and geometry errors are hard to balance. There are some ideas I'd like to try around this, but right now very careful weight tuning is required for good results, and the weights strongly depend on the type of attribute involved.
  • In presence of attributes, some automatic optimizations like vertex welding are possible that would significantly improve the quality for some topology-constrained meshes.
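The vertex welding mentioned in the last bullet can be illustrated with a position-only remap; a real implementation would use a quantizing hash map instead of this O(n^2) scan, and this sketch is not the welding logic meshoptimizer would ship:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Map every vertex to the first vertex sharing (approximately) the same
// position, so that topologically split vertices no longer constrain the
// simplifier. Illustrative only: quadratic scan, positions as packed xyz.
std::vector<unsigned int> weldByPosition(const std::vector<float>& positions, float epsilon)
{
    size_t vertex_count = positions.size() / 3;
    std::vector<unsigned int> remap(vertex_count);

    for (size_t i = 0; i < vertex_count; ++i)
    {
        remap[i] = (unsigned int)i;

        for (size_t j = 0; j < i; ++j)
        {
            float dx = positions[i * 3 + 0] - positions[j * 3 + 0];
            float dy = positions[i * 3 + 1] - positions[j * 3 + 1];
            float dz = positions[i * 3 + 2] - positions[j * 3 + 2];

            if (dx * dx + dy * dy + dz * dz <= epsilon * epsilon)
            {
                remap[i] = remap[j];
                break;
            }
        }
    }

    return remap;
}
```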

zeux avatar Nov 02 '23 20:11 zeux