
Converting the Bonn RGB-D Dynamic Dataset to ROS2 Bag Files for RTAB-Map Evaluation

yongyun-iastate opened this issue 3 months ago • 3 comments

Hello, I would like to test the Bonn RGB-D Dynamic Dataset with rtabmap_ros. Could I ask if there is a way to convert the RGB and depth PNG files into a rosbag, similar to the TUM dataset? I know that rtabmap can be used in standalone mode, but I am testing it together with another module in ROS.

I tried converting the dataset to a ROS1 bag and then converting the ROS1 bag to a ROS2 bag, just like how the TUM dataset is converted from ROS1 to ROS2. However, when I tested and evaluated the ATE, the error was significantly large, which is strange because the camera is not moving much. I assume I may have converted /tf incorrectly from the dataset to ROS.

yongyun-iastate avatar Sep 10 '25 02:09 yongyun-iastate

Here is a way to convert at least the images to a rosbag with the current version. I'll make another follow-up post with some changes needed in the library to also include the ground truth.

Summary: the idea is to use rtabmap-dataRecorder to convert the directory of images into a database, then convert the database to a rosbag by replaying it with the rosrun rtabmap_util data_player node.

Details:

  1. We have to pre-synchronize the RGB and depth images. To do so, use this associate.py script. Put it inside the dataset folder and run the following (it should create two folders called rgb_sync and depth_sync):
$ python3 associate.py rgb.txt depth.txt
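For intuition, the association step pairs each RGB frame with the depth frame whose timestamp is closest, within a tolerance. Below is a minimal sketch of that idea (not the actual associate.py; the max_diff of 20 ms and the greedy matching are illustrative assumptions):

```python
# Sketch of TUM-style RGB/depth association: match each RGB timestamp
# with the closest unused depth timestamp within a tolerance.

def read_stamps(lines):
    """Parse 'timestamp filename' lines, skipping comments."""
    out = []
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        stamp, path = line.split(maxsplit=1)
        out.append((float(stamp), path))
    return out

def associate(rgb, depth, max_diff=0.02):
    """Greedy nearest-timestamp matching between two stamped lists."""
    pairs = []
    used = set()
    for t_rgb, f_rgb in rgb:
        best = None
        for i, (t_d, f_d) in enumerate(depth):
            if i in used:
                continue
            diff = abs(t_rgb - t_d)
            if diff <= max_diff and (best is None or diff < best[0]):
                best = (diff, i, t_d, f_d)
        if best is not None:
            used.add(best[1])
            pairs.append((t_rgb, f_rgb, best[2], best[3]))
    return pairs

rgb = read_stamps(["1548266469.1 rgb/0001.png", "1548266469.2 rgb/0002.png"])
depth = read_stamps(["1548266469.11 depth/0001.png", "1548266469.25 depth/0002.png"])
print(associate(rgb, depth))
```

Here the second RGB frame finds no depth frame within 20 ms, so only one pair is kept; the real script writes the matched pairs out as the rgb_sync/depth_sync folders.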
  2. Open the rtabmap standalone app, open Preferences, reset all settings to default, then move to the Source tab. Keep the input source type as RGB-D, then scroll down to set these parameters:
Image

The first parameter is the camera calibration file rgbd_boon_calib.yaml, based on the intrinsics provided by the dataset:

%YAML:1.0
---
camera_name: rgbd_boon
image_width: 640
image_height: 480
camera_matrix:
   rows: 3
   cols: 3
   data: [ 5.4282284100000004e+02, 0., 3.1559352000000001e+02, 0.,
       5.4257686999999999e+02, 2.3775609800000001e+02, 0., 0., 1. ]
distortion_coefficients:
   rows: 1
   cols: 5
   data: [ 3.9903000000000001e-02, -9.9343000000000001e-02,
       -7.2999999999999996e-04, -1.4400000000000000e-04, 0. ]
distortion_model: plumb_bob
local_transform:
   rows: 3
   cols: 4
   data: [ 0., 0., 1., 0., -1., 0., 0., 0., 0., -1., 0., 0. ]

Then set the paths to the rgb_sync and depth_sync folders generated in step 1.
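A note on the local_transform in the calibration file: assuming it encodes the usual rotation (plus zero translation) from the camera optical frame (x right, y down, z forward) into the ROS body frame (x forward, y left, z up), a quick pure-Python check confirms the axis mapping:

```python
# Rotation part of the 3x4 local_transform above (translation is zero).
R = [[0, 0, 1],
     [-1, 0, 0],
     [0, -1, 0]]

def apply(R, v):
    """Multiply a 3x3 rotation matrix by a 3-vector."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

# Optical z (forward) should become body x (forward):
print(apply(R, [0, 0, 1]))  # [1, 0, 0]
# Optical x (right) should become body -y (i.e. right):
print(apply(R, [1, 0, 0]))  # [0, -1, 0]
# Optical y (down) should become body -z (i.e. down):
print(apply(R, [0, 1, 0]))  # [0, 0, -1]
```

If this transform were wrong or omitted, the resulting TF tree (and any ground-truth comparison) would be rotated, which is one way to end up with a surprisingly large ATE.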

  3. To keep the right timestamps when converting to the rosbag later, check this box: Image

  4. Close the Preferences dialog and press ctrl-s to save the config.

  5. Run rtabmap-dataRecorder:

$ rtabmap-dataRecorder ~/.rtabmap/rtabmap.ini output.db
Image
  6. Replay the database under ROS and record a new rosbag:
$ roscore
$ rosparam set use_sim_time true
$ rosbag record /tf /depth_registered/image /depth_registered/camera_info /rgb/image /rgb/camera_info
$ rosrun rtabmap_util data_player _database:=output.db --clock
  7. Testing the bag with rtabmap_ros:
$ roslaunch rtabmap_launch rtabmap.launch \
   args:="-d" \
   rgb_topic:=/rgb/image \
   camera_info_topic:=/rgb/camera_info \
   depth_topic:=/depth_registered/image \
   frame_id:=base_link \
   use_sim_time:=true

# Adjust name to bag generated in the previous step:
$ rosbag play --clock 2025-09-20-23-56-09.bag
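Since the original question was about a surprisingly large ATE, it may also help to sanity-check the evaluation itself. Below is a minimal sketch of an un-aligned ATE RMSE over already time-associated positions (dedicated tools such as the evo package additionally align the estimated trajectory to the ground truth before computing the error, so their numbers will differ):

```python
import math

def ate_rmse(gt, est):
    """Root-mean-square translational error between time-associated
    ground-truth and estimated positions (no alignment, for brevity)."""
    assert len(gt) == len(est)
    sq = [sum((g - e) ** 2 for g, e in zip(p_gt, p_est))
          for p_gt, p_est in zip(gt, est)]
    return math.sqrt(sum(sq) / len(sq))

# Toy trajectories (x, y, z), hypothetical values for illustration:
gt  = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
est = [(0.0, 0.1, 0.0), (1.0, -0.1, 0.0), (2.0, 0.1, 0.0)]
print(ate_rmse(gt, est))  # ≈ 0.1
```

If an un-aligned error is huge while the aligned one is small, the usual culprit is a frame-convention mismatch (e.g. the /tf conversion mentioned in the question) rather than the SLAM estimate itself.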

matlabbe avatar Sep 21 '25 00:09 matlabbe

To include the ground truth in the rosbag as a TF frame, you will need to build/install rtabmap with this commit, and rtabmap_ros with this commit.

  1. Following the config set in the previous post, add these settings: Image

  2. Follow steps 4 to 7 of the previous post to generate the rosbag.

  3. Testing in rviz:

roscore
rosparam set use_sim_time true
rviz
# We have to link the base_link frame to the ground truth base frame, 
# to be able to visualize the point clouds in world frame:
rosrun tf static_transform_publisher 0 0 0 0 0 0 base_link_gt base_link 100
rosbag play --clock 2025-09-21-00-52-56.bag

Image

  4. Testing with rtabmap_ros:
$ roslaunch rtabmap_launch rtabmap.launch \
   args:="-d" \
   rgb_topic:=/rgb/image \
   camera_info_topic:=/rgb/camera_info \
   depth_topic:=/depth_registered/image \
   frame_id:=base_link \
   use_sim_time:=true \
   ground_truth_frame_id:=world \
   ground_truth_base_frame_id:=base_link_gt

$ rosbag play --clock 2025-09-21-00-52-56.bag
Image

ROS2

~~Unfortunately, the data_player node doesn't exist yet under ROS2 to be able to convert to a ROS2 rosbag directly. I'll check if I can get this node ported soon.~~ See post below.

matlabbe avatar Sep 21 '25 01:09 matlabbe

ROS2 Example

Following up on the instructions above, I ported the data_player node to ROS2 in this commit.

  1. Follow the same steps as above to get the output.db file.

  2. Convert RTAB-Map's db file to a ros2 bag directly:

$ ros2 bag record /tf /depth/image /depth/camera_info /rgb/image /rgb/camera_info --use-sim-time
$ ros2 run rtabmap_util data_player --clock --ros-args -p database:=output.db -p ignore_odom:=true
  3. Testing in rviz2:
$ ros2 run rviz2 rviz2 --ros-args -p use_sim_time:=true
# We have to link the base_link frame to the ground truth base frame, 
# to be able to visualize the point clouds in world frame:
$ ros2 run tf2_ros static_transform_publisher --frame-id base_link_gt --child-frame-id base_link
$ ros2 bag play rosbag2_2025_09_21-16_52_40 --clock
  4. Testing with RTAB-Map:
$ ros2 launch rtabmap_launch rtabmap.launch.py \
   args:="-d" \
   rgb_topic:=/rgb/image \
   camera_info_topic:=/rgb/camera_info \
   depth_topic:=/depth/image \
   frame_id:=base_link \
   use_sim_time:=true \
   ground_truth_frame_id:=world \
   ground_truth_base_frame_id:=base_link_gt

$ ros2 bag play rosbag2_2025_09_21-16_52_40 --clock

matlabbe avatar Sep 22 '25 00:09 matlabbe