Inquiry about running NTU-VIRAL dataset with config_ntu.yaml
Thank you for your excellent open-source lidar odometry.
I evaluated TRAJ-LO on the NTU-VIRAL dataset with `./trajlo ../data/config_ntu.yaml`, but could not get ATE results consistent with the paper published in RA-L. The ATE was obtained with the official evaluation Python script from the NTU-VIRAL dataset. Could you give me some advice on how to reproduce the ATE results reported in your paper?
Moreover, `./trajlo ../data/config_ntu.yaml` seems to work for single-LiDAR odometry, but not for multi-LiDAR odometry. How can I run TRAJ-LO with multiple LiDARs?
Looking forward to your reply.
First comment, just here to watch the fun.
*grabs popcorn
@shenhm516 In SLICT I wrote a mergelidar node. Please check it out here:
https://github.com/brytsknguyen/slict/blob/master/src/MergeLidar.cpp
Note that the output will have the point timestamps in nanoseconds relative to the message header timestamp.
An example of how to run the merged lidar with FAST-LIO can be found here:
https://mcdviral.github.io/SLAMTutorial.html#fast_lio
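As a minimal sketch (not code from SLICT or Traj-LO) of what consuming that output looks like, assuming ROS1 and that the merged cloud exposes the relative stamp in an Ouster-style `uint32_t t` field (check the actual field name and type the node publishes):

```cpp
#include <cstdint>

#include <sensor_msgs/PointCloud2.h>
#include <sensor_msgs/point_cloud2_iterator.h>

// Callback for the merged cloud: per-point stamps are nanoseconds
// relative to the message header stamp, so add them to header.stamp.
void mergedCloudCallback(const sensor_msgs::PointCloud2::ConstPtr& msg) {
  const double scan_start = msg->header.stamp.toSec();  // scan reference time [s]
  sensor_msgs::PointCloud2ConstIterator<std::uint32_t> it_t(*msg, "t");
  for (; it_t != it_t.end(); ++it_t) {
    // Absolute acquisition time of this point, in seconds.
    const double t_abs = scan_start + 1e-9 * static_cast<double>(*it_t);
    (void)t_abs;  // hand the point and t_abs to the odometry's input buffer here
  }
}
```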
Cool!
I have merged two lidar point clouds successfully with the slict_sensorsync node open-sourced in SLICT.
Thank you very much @brytsknguyen .
Also, don't forget to account for the offset between the GT Leica Prism frame and the sensor frame.
Hi @snakehaihai,
Thank you for the reminder. The offset between the Leica Prism frame and the sensor frame seems to already be accounted for in the evaluation Python script.
Please let me know if I have misunderstood anything.
That's right, thx.
@shenhm516 Sorry for the late reply. I spent the whole of June on my graduation road trip in Xinjiang, China, so I didn't respond to this issue immediately. By the way, thanks for the valuable advice from @brytsknguyen and @snakehaihai.
The released code was refactored compared to the original version published in the paper, but the final ATE results should be consistent without significant differences. I used the Python package EVO for the evaluation.
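(For reference, once both the estimate and the ground truth are exported in TUM format, a command along the lines of `evo_ape tum gt_prism.txt trajlo_poses.txt -a` computes the ATE; the file names here are only placeholders.)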
The issue may arise from the frame used for evaluation: LiDAR, body (IMU), or Leica Prism. As @snakehaihai mentioned, you should compensate for the offset between the GT Leica Prism frame and the sensor frame. However, one additional step is needed in our LiDAR-only method: the estimated trajectory in Traj-LO is expressed in the LiDAR frame, while the T_Body_Prism in the config file relates the Prism frame to the IMU (body) frame. Therefore, you should first transform the trajectory into the IMU frame and then compensate for the Leica Prism offset, as shown in the following code:
https://github.com/kevin2431/Traj-LO/blob/ba273d36d4b69a557eedd19ba80e1c90ab30039d/src/core/odometry.cpp#L162-L163
I have added a pose-saving function in the new commit; you can test it on the NTU-VIRAL dataset. As for the multi-LiDAR odometry, @brytsknguyen has provided a way to merge LiDARs, and I will try my best to update this module in the following commit.
src/core/odometry.cpp, lines 162 to 163 (ba273d3):
```cpp
Sophus::SE3d pose_body = config_.T_body_lidar * p.second * config_.T_body_lidar.inverse();
Sophus::SE3d pose_gt = pose_body * config_.T_body_gt;
```
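For a concrete picture of the pose-saving step, here is a minimal sketch (not the function actually added in the commit; the function name and argument layout are hypothetical) that applies the same two-step transform as the lines above and writes the trajectory in TUM format:

```cpp
#include <fstream>
#include <iomanip>
#include <string>
#include <utility>
#include <vector>

#include <sophus/se3.hpp>

// Write prism-frame poses as TUM lines: timestamp x y z qx qy qz qw.
void savePrismTrajectoryTUM(
    const std::string& path,
    const std::vector<std::pair<double, Sophus::SE3d>>& traj_lidar,  // (stamp, T_world_lidar)
    const Sophus::SE3d& T_body_lidar,    // LiDAR -> body (IMU) extrinsic from the config
    const Sophus::SE3d& T_body_prism) {  // body (IMU) -> prism extrinsic (T_Body_Prism)
  std::ofstream ofs(path);
  ofs << std::fixed << std::setprecision(9);
  for (const auto& [stamp, T_world_lidar] : traj_lidar) {
    // Step 1: express the LiDAR-frame pose in the body (IMU) frame.
    const Sophus::SE3d T_world_body = T_body_lidar * T_world_lidar * T_body_lidar.inverse();
    // Step 2: compensate for the Leica Prism offset.
    const Sophus::SE3d T_world_prism = T_world_body * T_body_prism;
    const Eigen::Vector3d p = T_world_prism.translation();
    const Eigen::Quaterniond q = T_world_prism.unit_quaternion();
    ofs << stamp << ' ' << p.x() << ' ' << p.y() << ' ' << p.z() << ' '
        << q.x() << ' ' << q.y() << ' ' << q.z() << ' ' << q.w() << '\n';
  }
}
```

Writing the prism-frame poses directly means the file can be compared against the Leica ground truth without any further frame conversion.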
Hi @kevin2431, thank you for the update.
The extrinsics between the LiDAR, body, and Leica Prism frames are now taken into account.
The re-evaluated performance of Traj-LO on the NTU-VIRAL dataset is attached below. Traj-LO demonstrates excellent accuracy on all sequences except tnp and spms. I suspect that different sequences may require different parameters to achieve the results reported in Table II of your paper.
@shenhm516 The SPMS scenario presents an extremely low-feature environment, where the UAV operates in a semi-open airspace. In principle, this setting should introduce significant challenges for perception and localization due to the lack of features.
As for the TNP site, there are two distinct places, and I forget which one we released: one is located in a carpark, and the other is indoors. The carpark environment suffers from flat-surface degeneracy, offering minimal geometric constraints for reliable localization. The indoor TNP setup, on the other hand, is dominated by strong reflections, which degrade sensor reliability and multi-modal fusion.
Both configurations pose distinct and representative challenges for robust UAV perception and navigation.
@snakehaihai Thanks for the information. As you mentioned, the spms and tnp scenarios are challenging for single-LiDAR odometry. Some recent works address the degeneracy problem by incorporating multi-LiDAR observations into the estimation process.
Whoa, it's one of our own!
Whoa, Brother Hai!!!