votenet
Question about how to calculate the tilt angle for a new RGB-D image, as in the SUN RGB-D dataset?
Hi, I noticed that in the training process for the SUN RGB-D dataset, the point cloud's axes are aligned with the gravity direction. If I have a new RGB-D image taken with a Kinect V2 camera, I would also need to know the tilt angle before sending it into VoteNet. The tilt angle calculation is not covered in this repo. Do you have any suggestions on how to calculate it? Are there any ready-made tools available?
Any suggestions would be helpful. Thanks very much!
I want to implement a similar strategy for ImVoteNet. Did you find a solution for this?
I found the following in the SUN RGB-D paper (Section 2.5, "Ground truth annotation"):
"For each RGB-D image, we obtain LabelMe-style 2D polygon annotations, 3D bounding box annotations for objects, and 3D polygon annotations for room layouts. For 3D annotation, the point clouds are first rotated to align with the gravity direction using an automatic algorithm. We estimate the normal direction for each 3D point with the 25 closest 3D points. Then we accumulate a histogram on a 3D half-sphere and pick the maximal count from it to obtain the first axis. For the second axis, we pick the maximal count from the directions orthogonal to the first axis. In this way, we obtain the rotation matrix to rotate the point cloud to align with the gravity direction. We manually adjust the rotation when the algorithm fails."
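For what it's worth, here is a rough NumPy/SciPy sketch of the procedure quoted above, not the authors' actual tool. The function names (`estimate_normals`, `dominant_direction`, `estimate_gravity_rotation`) are made up, the half-sphere histogram is binned over azimuth/elevation with 64 bins per axis, and the 0.1 orthogonality threshold for the second axis is an arbitrary choice, since the paper does not specify these details.

```python
import numpy as np
from scipy.spatial import cKDTree


def estimate_normals(points, k=25):
    """Per-point normals from the k nearest neighbours (PCA of each local patch)."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    normals = np.empty_like(points)
    for i, nbrs in enumerate(idx):
        patch = points[nbrs] - points[nbrs].mean(axis=0)
        # The right-singular vector with the smallest singular value is the
        # direction of least variance, i.e. the local surface normal.
        _, _, vt = np.linalg.svd(patch, full_matrices=False)
        normals[i] = vt[-1]
    return normals


def dominant_direction(dirs, n_bins=64, orthogonal_to=None):
    """Histogram unit directions on a half-sphere and return the densest bin centre."""
    # Fold onto a half-sphere so that n and -n vote for the same bin.
    dirs = dirs * np.sign(dirs[:, 2:3] + 1e-12)
    if orthogonal_to is not None:
        # Keep only directions roughly orthogonal to the first axis (threshold is arbitrary).
        dirs = dirs[np.abs(dirs @ orthogonal_to) < 0.1]
    azimuth = np.arctan2(dirs[:, 1], dirs[:, 0])
    elevation = np.arccos(np.clip(dirs[:, 2], -1.0, 1.0))
    hist, az_edges, el_edges = np.histogram2d(azimuth, elevation, bins=n_bins)
    i, j = np.unravel_index(hist.argmax(), hist.shape)
    az = 0.5 * (az_edges[i] + az_edges[i + 1])
    el = 0.5 * (el_edges[j] + el_edges[j + 1])
    return np.array([np.cos(az) * np.sin(el), np.sin(az) * np.sin(el), np.cos(el)])


def estimate_gravity_rotation(points):
    """Rotation matrix that maps the estimated 'up' axis of `points` onto +z."""
    normals = estimate_normals(points)
    up = dominant_direction(normals)                        # first axis (gravity)
    second = dominant_direction(normals, orthogonal_to=up)  # second axis
    second -= (second @ up) * up                            # re-orthogonalise
    second /= np.linalg.norm(second)
    third = np.cross(up, second)
    # Rows of R are the new basis, so R @ p expresses p in the upright frame.
    return np.stack([second, third, up], axis=0)
```

If I understand the pipeline correctly, the returned matrix plays the same role as the per-image tilt rotation (`Rtilt`) that the SUN RGB-D metadata provides; you would apply it as `points_upright = points @ R.T` before feeding the cloud to the network. Two caveats: the half-sphere folding leaves a sign ambiguity, so you may need to flip the estimated up axis depending on your camera convention, and the Python loop over neighbourhoods is slow, so subsampling the cloud first is advisable.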
I have the same problem.