
QH6227 impossible to triangulate a dataset

Open Jean-Romain opened this issue 7 years ago • 12 comments

Hi,

I found a dataset I'm not able to triangulate. There is nothing tricky in this dataset, such as co-circular points, but I can't apply the delaunayn function. The dataset is attached.

X = read.table("points.txt")
X = as.matrix(X)

dn = geometry::delaunayn(X)

QH6227 qhull precision error: Only 4 facets remain. Can not merge another pair. The input is too degenerate or the convexity constraints are too strong.
Error in geometry::delaunayn(X) : Received error code 1 from qhull.

Since it is a qhull precision error, I tried some options. The only one that enables the function to run without crashing is Qz (which is not appropriate here). It does not crash, but as reported in #11 it returns a very wrong triangulation.

dn = geometry::delaunayn(X, options = "Qz")

plot(X, asp = 1)
geometry::trimesh(dn, X, add = T)

Thank you for your help.

points.txt

Jean-Romain avatar Jun 24 '17 18:06 Jean-Romain

I think this is a precision issue in QHull. It has cropped up elsewhere: https://github.com/scipy/scipy/issues/4974, https://github.com/scipy/scipy/issues/6484

I have discovered a workaround for this case, which is to normalise the points around (0, 0):

X = read.table("points.txt")
X = as.matrix(X)

# Gives error QH6227
# dn = geometry::delaunayn(X)

# No error, but bad triangulation
# dn = geometry::delaunayn(X, options = "Qz")

# Normalise points around (0, 0)
X1 = cbind(X[,1]- mean(X[,1]),
           X[,2]- mean(X[,2]))

## Seems to give good triangulation
dn1 = geometry::delaunayn(X1)

plot(X1, asp = 1)
geometry::trimesh(dn1, X1, add = T)

I suppose I could implement this workaround in the R package and file a bug report with @qhull.

davidcsterratt avatar Jun 26 '17 09:06 davidcsterratt

The QbB option also seems to fix the problem.

dn2 = geometry::delaunayn(X, options="QbB")
plot(X, asp = 1, main="options=\"QbB\"")
geometry::trimesh(dn2, X, add = T)

I think it might be safe in general - but I need to think more before making it a default.

davidcsterratt avatar Jun 26 '17 11:06 davidcsterratt

QbB and the mean subtraction don't give identical results, but they are very similar and probably "good enough".

X = read.table("points.txt")
X = as.matrix(X)

par(mfcol=c(2,2))

## Gives error QH6227
## dn = geometry::delaunayn(X)

## No error, but bad triangulation
dn = geometry::delaunayn(X, options = "Qz")
plot(X, asp = 1, main="options=\"Qz\"")
geometry::trimesh(dn, X, add = T)

## Normalise points around (0, 0)
X1 = cbind(X[,1]- mean(X[,1]),
           X[,2]- mean(X[,2]))

## Seems to give good triangulation
dn1 = geometry::delaunayn(X1)

plot(X1, asp = 1, main="mean subtracted")
geometry::trimesh(dn1, X1, add = T)

## The QbB option also seems to work
dn2 = geometry::delaunayn(X, options="QbB")
plot(X, asp = 1, main="options=\"QbB\"")
geometry::trimesh(dn2, X, add = T)

## Overlay
plot(X, asp = 1, main="mean subtracted; options=\"QbB\" (in red)")
geometry::trimesh(dn1, X, add = T)
geometry::trimesh(dn2, X, add = T, col="red")

order.triangulation <- function(x) {
  x <- t(apply(x, 1, sort))
  return(x[order(x[,1], x[,2], x[,3]),])
}

message("Are the mean substracted and QbB triangulations identical?")
print(all(order.triangulation(dn1) == order.triangulation(dn2)))

davidcsterratt avatar Jun 26 '17 11:06 davidcsterratt

Thank you for your answer. I missed the option QbB. It's basically what I was looking for.

I think it might be safe in general - but I need to think more before making it a default.

It sounds safe, but since I have the same issue I would like to be sure it's always safe. Please let me know if you have more information on the subject.

QbB and the mean subtraction don't give identical results, but they are very similar and probably "good enough".

How is that possible? The Delaunay triangulation is expected to be unique (except for co-circular points), so a translation or a scaling of the point cloud should not change the result. Shouldn't that be the case?

Regards

Jean-Romain avatar Jun 26 '17 15:06 Jean-Romain

A translation should definitely not make a difference. I think that QbB might stretch by different amounts in different directions, which could account for the difference. However, I've checked this by applying that scaling myself before feeding the points to delaunayn without the "QbB" option, and the scaled and the translated versions give the same results... I will take this up with Brad Barber.
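
A rough sketch of that check (not necessarily the exact code used), reusing X, dn1 and the order.triangulation() helper from the earlier comment:

## Scale manually to [-0.5, 0.5] in each dimension, mimicking QbB,
## then triangulate without any special options
rng <- apply(X, 2, range)
Xs  <- sweep(sweep(X, 2, colMeans(rng)), 2, rng[2, ] - rng[1, ], "/")
dn3 <- geometry::delaunayn(Xs)

## Compare with the mean-subtracted triangulation dn1
all(order.triangulation(dn1) == order.triangulation(dn3))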

davidcsterratt avatar Jun 26 '17 17:06 davidcsterratt

Hi,

I have a similar problem using convhulln. Before finding this issue, I had managed to solve it by using Qbb.

I also tried with QbB as mentioned here but I get quite different results and I am not sure which one I should use anymore...

As you said, I didn't expect scaling to have an effect on the result. Unless the scaling factor is indeed different in different directions. But if it changes the shape of the point cloud, doesn't that make it unsuitable for some applications? I think it does for me at least.

Did you get the chance to discuss / think about this issue a bit more?

Thank you for your time!

(PS: here is my dataset if you want to poke around) convhull_data.txt

Bisaloo avatar Jun 19 '18 12:06 Bisaloo

Both datasets are degenerate when represented in double-precision floating point. This is best seen by exploring the data with qhull directly (its input format is the dimension, the number of points, then the coordinates ...).
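
As a rough illustration (not from the thread), the qhull input file can be written from R like this; points-qhull.txt is just an illustrative file name:

X <- as.matrix(read.table("points.txt"))

## qhull's plain-text format: dimension, number of points, then coordinates
writeLines(c(ncol(X), nrow(X), apply(X, 1, paste, collapse = " ")),
           "points-qhull.txt")

## then on the command line:  qhull < points-qhull.txt d Qs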

Starting with Jean-Romain's point set and using Qs to search for the best initial simplex (4 input sites)

qhull <../working/points-davidcsterratt-geometry-issue-12.txt d

QH6114 qhull precision error: initial simplex is not convex. Distance=-0.00024

The maximum round off error for computing distances is 0.024.

If I surround Jean-Romain's point set with a bounding box (±40) and use 'Qs' to search for the best initial simplex, Qhull reports a precision error, as it ought to.

Add a bounding box to the point set

2 750 plus rbox B40 O695930 D2 c; rbox B40 O5078350 D2 c
695910 5078310
695910 5078390
695990 5078310
695990 5078390
695934.87 5078340.92
...

options 'd Qs'

qhull <../working/points-davidcsterratt-geometry-issue-12.txt d Qs

The input to qhull appears to be less than 3 dimensional, or a computation has overflowed.

Qhull could not construct a clearly convex simplex from points:

  • p427(v3): 7e+005 5.1e+006 2.6e+013
  • p1(v2): 7e+005 5.1e+006 2.6e+013
  • p2(v1): 7e+005 5.1e+006 2.6e+013
  • p0(v0): 7e+005 5.1e+006 2.6e+013

It reports a nearly flat bounding box for the projected convex hull:

The min and max coordinates for each dimension are:
  0:  6.959e+005  6.96e+005   difference= 80
  1:  5.078e+006  5.078e+006  difference= 80
  2:  2.627e+013  2.627e+013  difference= 9.239e+008

If option 'Qz' is added (a point "at-infinity"), the initial simplex can be created, but all of the original input sites are nearly incident and are dropped from the results:

qhull <../working/points-davidcsterratt-geometry-issue-12.txt d Qz

Number of input sites and at-infinity: 5
Total number of nearly incident points: 746
Number of Delaunay regions: 1
Number of non-simplicial Delaunay regions: 1

If the size of the bounding box is increased, option 'Qz' is not needed, but the original input sites remain nearly incident.

New bounding box points

695010 5077310
695010 5079390
696990 5077310
696990 5079390

qhull <../working/points-davidcsterratt-geometry-issue-12.txt d Qz

Delaunay triangulation by the convex hull of 750 points in 3-d:

Number of input sites: 5
Total number of nearly incident points: 745
Number of Delaunay regions: 4

cbbarber avatar Jun 26 '18 02:06 cbbarber

In the comments above, Qhull reported the maximum error due to roundoff

round off error for computing distances is 0.024

This is substantially more than the reported distances of the center point (avg of coordinates) to each facet of the initial simplex

center point 6.959e+005 5.078e+006 2.627e+013

facet p1 p2 p0    distance= -7.8e-005
facet p427 p2 p0  distance= -0.00016
facet p427 p1 p0  distance= -0.00016
facet p427 p1 p2  distance= -0.0073


Bisaloo's data is likewise degenerate. Even when transformed to the unit bounding box, the cosine of the min. angle between facets is 1.0000000000000000. The input sites form a flat:

qhull <../working/convhull_data-davidcsterratt-geometry-issue-12.txt Qs Qbb
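
A quick way to see the flatness from R (a sketch, assuming convhull_data.txt reads as a numeric matrix):

Y <- as.matrix(read.table("convhull_data.txt"))

## singular values of the centred data; a tiny smallest value relative to the
## largest means the input sites lie (nearly) in a lower-dimensional flat
s <- svd(scale(Y, center = TRUE, scale = FALSE))$d
s / max(s)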


The general problem is that there are not enough significant digits to construct a convex hull or Delaunay triangulation.

When the data is transformed to gain more digits (as described in this thread), the transformation itself introduces roundoff error. That's why different triangulations are produced depending on how the data is transformed.


Suggestion -- Use a high-precision package to transform the data. Then no precision is lost due to round-off. But you should be careful about artificial results due to the newly created bits that are not in the original data. Consider introducing random bits instead of '0' as is normally done. If the results remain stable over multiple runs, they reflect the original data.
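
A minimal sketch of this suggestion, assuming the Rmpfr package as the high-precision package (the helper name is illustrative):

library(Rmpfr)

X <- as.matrix(read.table("points.txt"))

## lift each column to 128-bit floats (exact for doubles), centre it there,
## and round back to double only once at the end
centre_hp <- function(v) {
  vh <- mpfr(v, precBits = 128)
  asNumeric(vh - sum(vh) / length(vh))
}

X1 <- apply(X, 2, centre_hp)
dn <- geometry::delaunayn(X1)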

Suggestion -- A bounding box is useful for Delaunay triangulation of degenerate data and for visualizing unbounded regions. It should be a small multiple of the diameter of the input sites. You could generate a more accurate bounding box from the convex hull of the input sites.

With a bounding box, Qhull can merge facets until a clearly-convex result is produced. These merged facets reflect the underlying precision of the input data.
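
A rough sketch of the bounding-box suggestion for Jean-Romain's 2-d data (the box size is an illustrative choice of "a small multiple of the diameter"):

X    <- as.matrix(read.table("points.txt"))
ctr  <- colMeans(X)
half <- 2 * max(apply(X, 2, function(v) diff(range(v))))   # ~2x the extent
box  <- rbind(ctr + c(-half, -half), ctr + c(-half,  half),
              ctr + c( half, -half), ctr + c( half,  half))

dnb <- geometry::delaunayn(rbind(box, X))

## keep only triangles made of original points (the box corners are rows 1-4),
## then shift the indices back to rows of X
dn  <- dnb[apply(dnb, 1, function(tr) all(tr > 4)), , drop = FALSE] - 4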

I hope this helps. Precision issues are not easy. That's the reason for Qhull. It tries to do the best it can with the available data.

cbbarber avatar Jun 26 '18 02:06 cbbarber

As reported in scipy/scipy#6484, Qhull 2019.1 will fix this problem. It fails on qhull 2015.2 due to qh_checkflipped reporting a flipped facet to qh_initialhull, which incorrectly reverses the facet orientation. So when the first points are added, every new facet is flipped, leading to the reported error.

qh_checkflipped: facet f1 is flipped, distance= -3.98348228101e-009 during p-1
qh_initialhull: initial orientation incorrect. Correct all facets

It works now, but not because of these fixes. Instead, Qhull 2019.0.1 contains a 'maybe_falsenarrow' test in qh_maxsimplex for a substantial decrease in the determinant (4.7e2 to 9.4e-6). This triggers a search of all points (i.e., the same as 'Qs'):

[QH0017]qh_maxsimplex: searching all points for 4-th initial vertex, better than p476 det 9.4e-06 or prevdet 4.7e+02

  • geom2_r.c/qh_maxsimplex: add ratio test for false narrow hull with search for prevdet [J. Romain]

Delaunay triangulation by the convex hull of 746 points in 3-d:

Number of input sites: 722
Total number of deleted points due to merging: 4
Total number of nearly incident points: 20
Number of Delaunay regions: 921
Number of non-simplicial Delaunay regions: 377

Statistics for: davidcsterratt issue12 | qhull d Qbb Qa

Number of points processed: 726
Number of hyperplanes created: 3552
Number of facets in hull: 951
Number of distance tests for qhull: 26907
Number of distance tests for merging: 34733
Number of distance tests for checking: 13858
Number of merged facets: 1045
CPU seconds to compute hull (after input): 0.015
Maximum distance of point above facet: 4.1e-08 (1.0x)
Maximum distance of vertex below facet: -7e-08 (1.7x)

cbbarber avatar Jun 18 '19 02:06 cbbarber

Qhull 2019.1 continues to fail for Bisaloo's data. Even if qh_maxsimplex searches for the best initial simplex ('Qs'), qh_initialhull reports that qh.interior_point is barely more than qh.DISTround (4e-16) away from two facets. The input set is rather flat (p-2 is qh.interior_point):

qh_distplane: -5.315572201153307e-16 from p-2 to f2
qh_distplane: -2.016148099165394e-12 from p-2 to f3
qh_distplane: -3.26932410180461e-13 from p-2 to f4
qh_distplane: -2.016071798437505e-12 from p-2 to f5
qh_distplane: -5.31611430223955e-16 from p-2 to f1

You could add a point at right angles to the flat and then ignore the point when reporting the convex hull. It would give you one side of the convex hull of the original point set.
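
A rough R sketch of this idea (the names and the offset are illustrative; it assumes convhull_data.txt reads as an n x 3 numeric matrix):

Y   <- as.matrix(read.table("convhull_data.txt"))
ctr <- colMeans(Y)

## direction of least variance = normal to the (near-)flat
nrm <- svd(scale(Y, center = TRUE, scale = FALSE))$v[, ncol(Y)]

## add a point well off the flat, take the hull, then drop every facet that
## uses the added point: what remains is one side of the hull of Y
off <- ctr + nrm * max(apply(Y, 2, function(v) diff(range(v))))
ch  <- geometry::convhulln(rbind(Y, off))
ch1 <- ch[apply(ch, 1, function(f) !((nrow(Y) + 1) %in% f)), , drop = FALSE]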

cbbarber avatar Jun 18 '19 03:06 cbbarber

Hi,

I found a dataset I'm not able to triangulate. There is nothing tricky in this dataset, such as co-circular points, but I can't apply the delaunayn function. The dataset is attached.

Works flawlessly with my new package RCGAL.

library(RCGAL)

X <- as.matrix(read.table("points.txt"))

d2 <- delaunay(X)

plotDelaunay2D(d2)

[delaunay2D plot]

stla avatar Jan 13 '22 17:01 stla

This is a precision issue due to Qhull computing the Delaunay triangulation by projecting the input to a paraboloid. Your points lie in a narrow range [50783..., 6959...] that is shifted far from the origin. The projection squares the coordinates, so it is O(n^2) in their magnitude and decreases the number of significant digits that are available.
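
A back-of-the-envelope check of this in R, using a typical (rounded) coordinate pair from points.txt:

x <- 695950; y <- 5078350
lift <- x^2 + y^2                 # paraboloid coordinate, ~2.63e13
.Machine$double.eps * lift        # ~6e-3: absolute resolution left at that magnitude

Since the lifted coordinate only varies by ~1e9 across the dataset, distances of the order of 1e-4 (as in the initial-simplex output above) are swamped by this roundoff.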

Add Qhull option 'Qbb' when performing a Delaunay triangulation. It scales the projected coordinate and should avoid this issue.
http://www.qhull.org/html/qh-optq.htm#Qbb

Alternatively shift the points to the origin.
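
A minimal sketch of both fixes from R, assuming geometry::delaunayn passes the option string through to qhull unchanged:

X <- as.matrix(read.table("points.txt"))

## (1) scale the projected (paraboloid) coordinate
dn_qbb <- geometry::delaunayn(X, options = "Qbb")

## (2) or shift the points to the origin first
dn_shift <- geometry::delaunayn(sweep(X, 2, colMeans(X)))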

Option 'Qbb' is automatically set for the 'qdelaunay' program http://www.qhull.org/html/qdelaun.htm

cbbarber avatar Jan 17 '22 04:01 cbbarber