RegTR
Extension of algorithm
Hello, is this algorithm suitable for underwater point cloud map registration? If so, how should the dataset be processed? Is the dataset unlabeled (i.e. for unsupervised learning)?
Hi. I don’t have experience with underwater point clouds so I am unable to advise. The work is a supervised learning algorithm and requires ground truth registrations for training.
Thanks a lot! I still have a couple of questions:
1. The ground truth (label) is the true transformation R and t between the two point clouds, right?
2. Is it the same dataset as the one used in the Predator paper?
Looking forward to your reply.
1. Yes, the ground truth label is the pose (rotation + translation), although internally it is converted into the ground-truth corresponding point locations for training, since that is what the network outputs.
2. Yes, the dataset is the same as Predator's.
Zi Jian
OK, thanks. If I want to use this algorithm on an unlabeled dataset, how can I obtain the ground-truth labels? Do you have any suggestions? Looking forward to your reply.
Hi, does "although internally it is converted into the ground-truth corresponding point locations for training, since that is what the network outputs" refer to the positional encodings?
Regarding ground truth for unlabeled datasets: this is tricky and depends on your problem. Some possible solutions are to rely on external sensors, e.g. mocap systems, or to use semi-manual/manual registration.
Regarding the ground-truth corresponding locations: what I meant was that the training loss is based on the point coordinates, not on the rotation/translation.
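To make the point-coordinate target concrete, here is a minimal NumPy sketch (a hypothetical illustration, not RegTR's actual training code) of how a ground-truth pose could be converted into ground-truth corresponding point locations and compared against predicted coordinates:

```python
import numpy as np

def gt_corresponding_points(src_points, rot, trans):
    # Apply the ground-truth rotation/translation to the source points;
    # the transformed coordinates act as regression targets for the
    # predicted correspondences.
    # src_points: (N, 3), rot: (3, 3), trans: (3,)
    return src_points @ rot.T + trans

def coordinate_loss(pred_points, gt_points):
    # Simple L1 loss on predicted vs. ground-truth coordinates.
    return np.abs(pred_points - gt_points).mean()
```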
Thank you very much for your patient answer! Is this algorithm suitable for the KITTI odometry dataset, or only for point cloud datasets used in SLAM loop closure detection? Looking forward to your reply.
Lidar point clouds like those from KITTI should be fine. After the paper was published, I briefly tried it on KITTI point cloud matching and a KITTI-trained model worked well.
I'm not sure what you mean by odometry vs. SLAM loop closure, but for odometry, where you already have a good initial guess, local algorithms like ICP tend to work well and give precise results. RegTR will work but might not be the best use of the algorithm.
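For illustration, here is a minimal sketch of such a local ICP refinement using Open3D, assuming hypothetical file names and an odometry-based initial guess:

```python
import numpy as np
import open3d as o3d

# Hypothetical file names; replace with your own consecutive scans.
src = o3d.io.read_point_cloud("scan_000.pcd")
tgt = o3d.io.read_point_cloud("scan_001.pcd")

init_guess = np.eye(4)  # replace with the odometry-based initial transform

# Point-to-point ICP; the correspondence distance should be tuned for your data.
result = o3d.pipelines.registration.registration_icp(
    src, tgt,
    max_correspondence_distance=0.5,
    init=init_guess,
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(),
)
print(result.transformation)  # refined 4x4 transform
```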
What I mean is that I want to use this algorithm to register two overlapping point cloud maps. We already know the rotation and translation matrices of our dataset as well as the overlap rate of the two point clouds, and we would like to use this algorithm for point cloud map registration. Do you think it is feasible?
If you have the rotation and translation matrices of your dataset, you should be able to train RegTR on it. As mentioned, the algorithm should work on lidar point clouds like those in KITTI (although some parameter tuning might be necessary).
Zi Jian
Thank you very much for your patient answer. I have another question: is the dataset used in the paper expressed in a global coordinate system or in the carrier (sensor) coordinate system? Looking forward to your reply.
Not 100% sure, but I suspect the coordinates of the points are w.r.t. the camera, i.e. the origin is where the camera is. You can consult the dataset's paper for a definite answer.
Thanks a lot! Given the rotation matrix and translation matrix, I want to visualize the registration of two overlapping frames of point clouds. How do I write this code?
For visualizing point clouds, look at my demo code (VTK based), or you can use the visualization functions provided in Open3D.
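Along those lines, here is a minimal Open3D sketch, assuming hypothetical file names and placeholder R/t (substitute your own data):

```python
import copy
import numpy as np
import open3d as o3d

# Hypothetical file names; replace with your own point cloud pair.
src = o3d.io.read_point_cloud("cloud_src.ply")
tgt = o3d.io.read_point_cloud("cloud_tgt.ply")

R = np.eye(3)    # known 3x3 rotation (placeholder)
t = np.zeros(3)  # known translation (placeholder)

# Build the 4x4 transform and apply it to a copy of the source cloud.
T = np.eye(4)
T[:3, :3] = R
T[:3, 3] = t
src_aligned = copy.deepcopy(src).transform(T)

# Colour the clouds differently so the overlap is easy to see.
src_aligned.paint_uniform_color([1.0, 0.7, 0.0])
tgt.paint_uniform_color([0.0, 0.65, 0.9])
o3d.visualization.draw_geometries([src_aligned, tgt])
```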
Sorry, I may not have expressed my question clearly. I want to visualize two overlapping point cloud frames, but without doing the registration with the trained model. Can two point clouds (with a certain overlap rate) be aligned when the transformation matrix between them is already known?
Sorry I don’t quite understand. If you already have the transformation matrix between the point clouds, aren’t they already aligned?
Zi Jian
I'm sorry, I just wanted to verify that the dataset can be aligned with the provided transformation matrix. Now I understand! However, I have a question: why does the test dataset (test-3DMatch_info.pkl and test-3DLoMatch_info.pkl) still have transformation matrices? Isn't it only the training dataset (train_info.pkl) and the validation dataset (val_info.pkl) that have labels (transformation matrices)?
The transformation matrices are used for evaluation.
Zi Jian
I don't know if my understanding is correct: the transformation matrices in the training and validation datasets are used for training and evaluation, while the transformation matrices in the test dataset are only used for evaluation, i.e. to evaluate the quality of the registration results between the two point clouds. And the evaluation result of the registration is reflected in the loss function.
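For reference, a hedged sketch of how a predicted transform could be scored against the ground-truth one (relative rotation/translation errors; RegTR's own evaluation scripts may compute these somewhat differently):

```python
import numpy as np

def registration_errors(T_pred, T_gt):
    # Split the 4x4 transforms into rotation and translation parts.
    R_pred, t_pred = T_pred[:3, :3], T_pred[:3, 3]
    R_gt, t_gt = T_gt[:3, :3], T_gt[:3, 3]

    # Relative rotation error (degrees) from the trace of R_pred^T R_gt.
    cos_angle = np.clip((np.trace(R_pred.T @ R_gt) - 1.0) / 2.0, -1.0, 1.0)
    rre = np.degrees(np.arccos(cos_angle))

    # Relative translation error: Euclidean distance between translations.
    rte = np.linalg.norm(t_pred - t_gt)
    return rre, rte
```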
@yewzijian Can you share what parameter file you used for training the model on KITTI?
Sure, will look for it and post it here by the end of the week.
Hi @aniket-gupta1, see the attached file for my KITTI parameters. I didn't spend much time tuning them, though, so you might be able to find better ones. kitti.yaml.zip
@yewzijian Thank you!