
CANUPO Plugin crashing CloudCompare

Open reppiz opened this issue 3 years ago • 14 comments

Describe the bug When trying to use the CANUPO plugin to classify a dataset (roughly 58 million points), CloudCompare crashes every time with the following output in the terminal:

CloudCompare: /home/userName/CloudCompare/plugins/core/Standard/qCanupo/src/ccPointDescriptor.cpp:100: virtual bool DimensionalityScaleParamsComputer::computeScaleParams(CCCoreLib::ReferenceCloud&, double, float*, bool&): Assertion `totalVariance != 0' failed. Aborted (core dumped)

Expected behaviour Expected the plugin to classify the cloud without crashing.

Additional context This was on a freshly compiled and installed version of CloudCompare, built by following the repo's CI workflow for Ubuntu.

Your environment

  • CC Version: v2.12 beta [64-bit]
  • OS & Version: Ubuntu 21.10 x86_64
  • Kernel: 5.15.12-051512-generic
  • Graphics card: Nvidia Quadro M2000M
  • Qt Version (if compiling): Qt 5

reppiz avatar Jan 05 '22 19:01 reppiz

Hi, quick question: do you reproduce the crash with a smaller cloud?

And does the fact that it crashes on 'asserts' mean that you are running it in debug mode?

dgirardeau avatar Jan 05 '22 21:01 dgirardeau

Yes, I can reproduce it on a smaller cloud. But let me elaborate, because I may be doing something wrong.

My original cloud is a 2.0 GB .las file (109 million points). I segmented out a section (58 million points), which is what I've been trying to classify. However, I noticed that the segmented cloud is also 2.0 GB in size. How can they both be 2.0 GB? Did I do something wrong?

How can I tell whether I'm running in debug mode or not?

reppiz avatar Jan 05 '22 21:01 reppiz

Debug: have you compiled CloudCompare yourself? (in which case it's in the compilation option)

Otherwise, when you say the segmented cloud is 2 GB in size, you mean the file, I guess. Is that after applying any processing, or just after the segmentation?

I would be interested to get a small cloud and the parameters you used to reproduce the issue on my side if that's possible.

dgirardeau avatar Jan 05 '22 21:01 dgirardeau

Yes, I compiled it myself. I used the --recursive option to clone the GitHub repo as shown in BUILD.md, and then I followed the CI workflow for the Ubuntu build on GitHub. But I don't recall seeing a debug option in there. Is it -DBUILD_TESTING=ON?

Yes, exactly. I opened the original 2.0 GB cloud, then segmented out a section and saved it as a CloudCompare binary entity; that file was also 2.0 GB but contained half the points. This confused me a bit.

I would be happy to give you anything you need. Do you have a Discord or something? I really need some help with this. I will even pay you for your help.

reppiz avatar Jan 05 '22 21:01 reppiz

Sadly I'm not an expert in Linux compilation (maybe @tmontaigu could help?), but you should be able to set the CMAKE_BUILD_TYPE variable to 'Release' if it isn't already: cmake -DCMAKE_BUILD_TYPE=Release ...

For the file size issue, maybe CC chose a different LAS point format than the original one (with many more fields per point)? Hard to tell.

And for the CANUPO issue, if you can find a small cloud that causes it, send it to me along with the parameters you used (typically a screenshot of the dialog); that would help me debug the issue. You can send it to my address daniel.girardeau [at] gmail.com. However, I may not be able to work on it right away, because I'm doing this in my free time (and I can't be paid for this kind of activity ;) ).

dgirardeau avatar Jan 05 '22 21:01 dgirardeau

Right on. Let me do one thing at a time so we can isolate what's actually happening. I'm going to uninstall my current build and rebuild with -DCMAKE_BUILD_TYPE=Release. I will report back on whether that fixed anything. If it doesn't, I will email you a small point cloud with a screenshot of my settings.

I really appreciate the feedback.

reppiz avatar Jan 05 '22 21:01 reppiz

@dgirardeau Okay, so I actually got it to classify successfully without crashing. However, I can't determine whether this was because I rebuilt with -DCMAKE_BUILD_TYPE=Release or because I used an entirely different and much smaller (480 MB) point cloud. Maybe both.

However, this raises another question, although it's off topic for this thread: can I do anything with the descriptors once classification is done? For example, if I'm classifying trees and ground, can I turn off the trees?

reppiz avatar Jan 06 '22 13:01 reppiz

In Release builds, assertions (assert(...)) are removed by default, so you cannot trigger them

tmontaigu avatar Jan 06 '22 14:01 tmontaigu

Yes, but that doesn't mean the issue is gone. Can you retry in debug mode (with the small cloud), and send it to me (with the parameters) if you reproduce the issue?

And regarding your question: normally you are supposed to create a new classifier that differentiates the trees from the rest (and you should apply the classifiers in cascade).

dgirardeau avatar Jan 06 '22 18:01 dgirardeau

Sure. I will do a debug build with the same plugins but use a smaller cloud. It's possible that I was originally just using a cloud that was too big for my laptop LOL.

reppiz avatar Jan 06 '22 19:01 reppiz

Tried a smaller cloud in debug mode and it worked. So either my computer can't handle a larger cloud, which would surprise me considering I have an i7 and 32 GB of RAM, or I was doing something wrong by trying to classify a segmented cloud instead of the original one.

reppiz avatar Jan 07 '22 13:01 reppiz

Ok, interesting. Still, the fact that you hit an 'assert' means that it was unexpected... Looking at the code, to achieve a total variance of 0, it means that all eigenvalues are 0. Which probably means that all neighbor points are the same. Maybe it would be interesting to run the 'Tools > Other > Remove duplicate points' tool on the large cloud and see if some points are duplicated?

Anyway, I've slightly changed the code so that it no longer asserts (the error will be handled more gracefully).

dgirardeau avatar Jan 08 '22 10:01 dgirardeau

Awesome, I appreciate that. I will clone the new code and compile a new release build. Then I will run that tool on the BIG cloud first and see if I still get an assertion error.

Back to the classifying question above: what I was trying to ask was, once I have a classified dataset, is there a way to remove points based on their classification? For example, if I classify ground and trees, can I remove the points classified as "Trees"?

reppiz avatar Jan 10 '22 13:01 reppiz

You should use the 'Edit > Scalar fields > Filter by value' tool (see the wiki for more information if needed).

dgirardeau avatar Jan 10 '22 20:01 dgirardeau