
Error when testing main_SemanticKITTI.py

Open xkhnhms opened this issue 3 years ago • 5 comments

xkhnhms avatar Jun 08 '21 07:06 xkhnhms

The detailed error report is as follows:

```
File "main_SemanticKITTI.py", line 85, in spatially_regular_gen
  selected_pc, selected_labels, selected_idx = self.crop_pc(pc, labels, tree, pick_idx)
File "main_SemanticKITTI.py", line 125, in crop_pc
  select_idx = search_tree.query(center_point, k=cfg.num_points)[1][0]
File "sklearn/neighbors/_binary_tree.pxi", line 1342, in sklearn.neighbors._kd_tree.BinaryTree.query
ValueError: k must be less than or equal to the number of training points
  [[{{node PyFunc}} = PyFunc[Tin=[DT_INT64], Tout=[DT_FLOAT, DT_INT32, DT_INT32, DT_INT32], token="pyfunc_7"](arg0)]]
  [[{{node IteratorGetNext}} = IteratorGetNext[output_shapes=[[?,?,3], [?,?,3], [?,?,3], [?,?,3], <unknown>, ..., <unknown>, [?,?,3], [?,?], [?,?], [?,?]], output_types=[DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_INT32, ..., DT_INT32, DT_FLOAT, DT_INT32, DT_INT32, DT_INT32], _device="/job:localhost/replica:0/task:0/device:CPU:0"](IteratorV2)]]
```

I have solved it, thank you very much.

xkhnhms avatar Jun 08 '21 07:06 xkhnhms

Hey, I got the same problem! Can you tell me how you solved it? Is it related to the computer's performance?

Adirosenthal540 avatar Jun 10 '21 10:06 Adirosenthal540

> Hey, I got the same problem! Can you tell me how you solved it? Is it related to the computer's performance?

It mainly depends on the number of points in each point cloud of the dataset you are testing. When a cloud contains relatively few points, the number of input points has to be reduced, because the source program subsamples each cloud to a fixed size.

For example: helper_tool.py -> ConfigSemanticKITTI -> num_points = ? (reduce it, but as little as possible)
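To illustrate why the error appears, here is a minimal NumPy sketch (not the repository's code; `query_knn` is a brute-force stand-in for `search_tree.query`). Querying more neighbors than the cloud contains raises the same `ValueError`; clamping `k` to the cloud size avoids it.

```python
import numpy as np

def query_knn(points, center, k):
    """Brute-force kNN stand-in for sklearn's KDTree.query."""
    if k > len(points):
        raise ValueError("k must be less than or equal to the number of training points")
    dist = np.linalg.norm(points - center, axis=1)
    return np.argsort(dist)[:k]

pc = np.random.rand(100, 3)   # a sparse cloud with only 100 points
num_points = 8192             # cfg.num_points, larger than the cloud

k = min(num_points, len(pc))  # clamp k before querying
idx = query_knn(pc, pc[0], k)
print(len(idx))               # 100
```

With the clamp, sparse clouds simply return all of their points; without it, `query_knn(pc, pc[0], 8192)` reproduces the traceback's `ValueError`.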

xkhnhms avatar Jun 10 '21 10:06 xkhnhms

To add to this issue: a similar error (`ValueError: k must be less than or equal to the number of training points`) occurs when testing on point cloud data that is still in its original (global) coordinate system. The error appeared when the coordinates of the points were very large (X ≈ 10,000, Y ≈ 10,000, Z ≈ 1,000). To solve this, center your point cloud around the origin so that the center of its bounding box lies near (0, 0, 0).
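The suggested preprocessing can be sketched as follows (a minimal example, not part of the RandLA-Net code base): subtract the center of the axis-aligned bounding box so the cloud is centered at the origin.

```python
import numpy as np

# A toy cloud in global coordinates with very large values.
pc = np.array([[10000.0, 10000.0, 1000.0],
               [10010.0,  9990.0, 1005.0],
               [ 9995.0, 10005.0,  995.0]])

# Center of the axis-aligned bounding box.
bbox_center = (pc.min(axis=0) + pc.max(axis=0)) / 2.0

# Shift the cloud so the bounding-box center sits at (0, 0, 0).
pc_centered = pc - bbox_center

print(pc_centered.min(axis=0) + pc_centered.max(axis=0))  # [0. 0. 0.]
```

After this shift the coordinates stay small, which sidesteps the precision issues seen with global coordinates.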

lukasmattheuwsen avatar Nov 23 '21 09:11 lukasmattheuwsen

num_points = 8192 worked for the Toronto-3D dataset.

Connect2Aditya avatar Oct 02 '22 13:10 Connect2Aditya