rtabmap_ros

Remote mapping issue on ROS 2 Humble, D435i

Open DonghanChen opened this issue 8 months ago • 10 comments

Hi, I have successfully implemented mapping and navigation on a single machine, but I encountered some difficulties when starting rtabmap remotely.

My environment: Ubuntu 22.04, ROS 2 Humble, D435i, standalone IMU. Launched nodes:

  1. publish_static_transforms_node: Used to publish static transforms between the sensors and the base.

  2. Realsense2 Camera Node: ros2 launch realsense2_camera rs_launch_nav.py align_depth:=true enable_gyro:=false enable_accel:=false rgb_camera.color_profile:=640x480x30 depth_module.depth_profile:=640x480x30 clip_distance:=5.0 pointcloud.enable:=true align_depth.enable:=true enable_rgbd:=true enable_sync:=true camera_name:=camera_chest usb_port_id:=1-4.3

  3. IMU Filter: ros2 launch imu_filter_madgwick imu_filter.launch.py _use_mag:=false _publish_tf:=false _world_frame:="enu" /imu/data_raw:=/imu /imu/data:=/rtabmap/imu

  4. Fusion of rtabmap’s VO with IMU (Re-publishing): ros2 launch publish_static_transforms ekf_launch.py odom1:=visual_odometry

  5. rtabmap Launch: (Note: I am not clear on how the parameters for approx_sync, rgbd_sync, approx_rgbd_sync, and compressed should be distributed.) ros2 launch rtabmap_launch rtabmap.launch.py rtabmap_args:="--delete_db_on_start" rgb_topic:=/camera/camera_chest/color/image_raw depth_topic:=/camera/camera_chest/aligned_depth_to_color/image_raw camera_info_topic:=/camera/camera_chest/aligned_depth_to_color/camera_info frame_id:=base_link use_sim_time:=false approx_sync:=true qos:=2 rviz:=false queue_size:=100 topic_queue_size:=200 wait_imu_to_init:=false subscribe_scan_cloud:=false rgbd_sync:=true approx_rgbd_sync:=true compressed:=true localization:=false visual_odometry:=true icp_odometry:=false odom_topic:=visual_odometry publish_tf_odom:=false use_action_for_goal:=false imu_topic:=/rtabmap/imu odom_guess_frame_id:=odom odom_frame_id:=odom

Question regarding node launching: My understanding is that the sensor nodes should run on the robot, while rtabmap.launch.py and rviz2 should run on the laptop (for remote visualization). However, I am uncertain where it is more appropriate to launch the static transform publisher, robot_localization, and IMU filter nodes.

In my attempt, only the RealSense and IMU nodes run on the robot; everything else runs on my laptop:

ros2 launch rtabmap_launch rtabmap.launch.py rtabmap_args:="--delete_db_on_start" rgb_topic:=/camera/camera_chest/color/image_raw depth_topic:=/camera/camera_chest/aligned_depth_to_color/image_raw camera_info_topic:=/camera/camera_chest/aligned_depth_to_color/camera_info frame_id:=base_link use_sim_time:=false approx_sync:=true qos:=2 rviz:=false queue_size:=100 topic_queue_size:=200 wait_imu_to_init:=false subscribe_scan_cloud:=false rgbd_sync:=true approx_rgbd_sync:=true compressed:=true localization:=false visual_odometry:=true icp_odometry:=false odom_topic:=visual_odometry publish_tf_odom:=false use_action_for_goal:=false imu_topic:=/rtabmap/imu odom_guess_frame_id:=odom odom_frame_id:=odom [INFO] [launch]: All log files can be found below /home/lenovo/.ros/log/2025-04-10-17-58-23-061216-lenovo-ThinkStation-P3-Ultra-51117 [INFO] [launch]: Default logging verbosity is set to INFO [INFO] [rgbd_sync-3]: process started with pid [51144] [INFO] [rgbd_odometry-4]: process started with pid [51146] [INFO] [rtabmap-5]: process started with pid [51148] [INFO] [rtabmap_viz-6]: process started with pid [51150] [INFO] [republish-1]: process started with pid [51140] [INFO] [republish-2]: process started with pid [51142] [rgbd_sync-3] [INFO] [1744279104.756214527] [rtabmap.rgbd_sync]: rgbd_sync: approx_sync = true [rgbd_sync-3] [INFO] [1744279104.757219206] [rtabmap.rgbd_sync]: rgbd_sync: approx_sync_max_interval = 0.000000 [rgbd_sync-3] [INFO] [1744279104.757367682] [rtabmap.rgbd_sync]: rgbd_sync: topic_queue_size = 200 [rgbd_sync-3] [INFO] [1744279104.757434267] [rtabmap.rgbd_sync]: rgbd_sync: sync_queue_size = 100 [rgbd_sync-3] [INFO] [1744279104.757506660] [rtabmap.rgbd_sync]: rgbd_sync: qos = 2 [rgbd_sync-3] [INFO] [1744279104.757888031] [rtabmap.rgbd_sync]: rgbd_sync: qos_camera_info = 2 [rgbd_sync-3] [INFO] [1744279104.758155023] [rtabmap.rgbd_sync]: rgbd_sync: depth_scale = 1.000000 [rgbd_sync-3] [INFO] [1744279104.758261636] [rtabmap.rgbd_sync]: rgbd_sync: decimation = 1 [rgbd_sync-3] [INFO] [1744279104.758338638] [rtabmap.rgbd_sync]: rgbd_sync: compressed_rate = 0.000000 [rgbd_odometry-4] [INFO] [1744279104.880252405] [rtabmap.rgbd_odometry]: Odometry: frame_id = base_link [rgbd_odometry-4] [INFO] [1744279104.881094965] [rtabmap.rgbd_odometry]: Odometry: odom_frame_id = visual_odometry [rgbd_odometry-4] [INFO] [1744279104.881198133] [rtabmap.rgbd_odometry]: Odometry: publish_tf = false [rgbd_odometry-4] [INFO] [1744279104.881256047] [rtabmap.rgbd_odometry]: Odometry: wait_for_transform = 0.200000 [rgbd_odometry-4] [INFO] [1744279104.881362217] [rtabmap.rgbd_odometry]: Odometry: log_to_rosout_level = 4 [rgbd_odometry-4] [INFO] [1744279104.881520055] [rtabmap.rgbd_odometry]: Odometry: initial_pose = xyz=0.000000,0.000000,0.000000 rpy=0.000000,-0.000000,0.000000 [rgbd_odometry-4] [INFO] [1744279104.881587785] [rtabmap.rgbd_odometry]: Odometry: ground_truth_frame_id = [rgbd_odometry-4] [INFO] [1744279104.881635930] [rtabmap.rgbd_odometry]: Odometry: ground_truth_base_frame_id = [rgbd_odometry-4] [INFO] [1744279104.881686382] [rtabmap.rgbd_odometry]: Odometry: config_path = [rgbd_odometry-4] [INFO] [1744279104.881731326] [rtabmap.rgbd_odometry]: Odometry: publish_null_when_lost = true [rgbd_odometry-4] [INFO] [1744279104.881787118] [rtabmap.rgbd_odometry]: Odometry: publish_compressed_sensor_data = false [rgbd_odometry-4] [INFO] [1744279104.881845543] [rtabmap.rgbd_odometry]: Odometry: guess_frame_id = odom [rgbd_odometry-4] [INFO] [1744279104.881907921] [rtabmap.rgbd_odometry]: Odometry: 
guess_min_translation = 0.000000 [rgbd_odometry-4] [INFO] [1744279104.881972710] [rtabmap.rgbd_odometry]: Odometry: guess_min_rotation = 0.000000 [rgbd_odometry-4] [INFO] [1744279104.882025429] [rtabmap.rgbd_odometry]: Odometry: guess_min_time = 0.000000 [rgbd_odometry-4] [INFO] [1744279104.882089544] [rtabmap.rgbd_odometry]: Odometry: expected_update_rate = 0.000000 Hz [rgbd_odometry-4] [INFO] [1744279104.882143130] [rtabmap.rgbd_odometry]: Odometry: max_update_rate = 0.000000 Hz [rgbd_odometry-4] [INFO] [1744279104.882199582] [rtabmap.rgbd_odometry]: Odometry: min_update_rate = 0.000000 Hz [rgbd_odometry-4] [INFO] [1744279104.882246724] [rtabmap.rgbd_odometry]: Odometry: wait_imu_to_init = false [rgbd_odometry-4] [INFO] [1744279104.882292678] [rtabmap.rgbd_odometry]: Odometry: always_check_imu_tf = true [rgbd_odometry-4] [INFO] [1744279104.882334878] [rtabmap.rgbd_odometry]: Odometry: sensor_data_compression_format = .jpg [rgbd_odometry-4] [INFO] [1744279104.882379981] [rtabmap.rgbd_odometry]: Odometry: sensor_data_parallel_compression = true [rgbd_odometry-4] [INFO] [1744279104.882789279] [rtabmap.rgbd_odometry]: Odometry: stereoParams_=0 visParams_=1 icpParams_=0 [rgbd_sync-3] [INFO] [1744279104.893024555] [rtabmap.rgbd_sync]: [rgbd_sync-3] rgbd_sync subscribed to (approx sync): [rgbd_sync-3] /camera/camera_chest/color/image_raw_relay, [rgbd_sync-3] /camera/camera_chest/aligned_depth_to_color/image_raw_relay, [rgbd_sync-3] /camera/camera_chest/aligned_depth_to_color/camera_info [rtabmap-5] [INFO] [1744279104.961960427] [rtabmap.rtabmap]: rtabmap(maps): latch = true [rtabmap-5] [INFO] [1744279104.962918242] [rtabmap.rtabmap]: rtabmap(maps): map_filter_radius = 0.000000 [rtabmap-5] [INFO] [1744279104.963473825] [rtabmap.rtabmap]: rtabmap(maps): map_filter_angle = 30.000000 [rtabmap-5] [INFO] [1744279104.963630749] [rtabmap.rtabmap]: rtabmap(maps): map_cleanup = true [rtabmap-5] [INFO] [1744279104.963666541] [rtabmap.rtabmap]: rtabmap(maps): map_always_update = false [rtabmap-5] [INFO] [1744279104.963702440] [rtabmap.rtabmap]: rtabmap(maps): map_empty_ray_tracing = true [rtabmap-5] [INFO] [1744279104.963744690] [rtabmap.rtabmap]: rtabmap(maps): cloud_output_voxelized = true [rtabmap-5] [INFO] [1744279104.963775225] [rtabmap.rtabmap]: rtabmap(maps): cloud_subtract_filtering = false [rtabmap-5] [INFO] [1744279104.963806468] [rtabmap.rtabmap]: rtabmap(maps): cloud_subtract_filtering_min_neighbors = 2 [rtabmap-5] [INFO] [1744279104.964263705] [rtabmap.rtabmap]: rtabmap(maps): octomap_tree_depth = 16 [rtabmap-5] [INFO] [1744279105.101830067] [rtabmap.rtabmap]: rtabmap: frame_id = base_link [rtabmap-5] [INFO] [1744279105.101996724] [rtabmap.rtabmap]: rtabmap: odom_frame_id = odom [rtabmap-5] [INFO] [1744279105.102037289] [rtabmap.rtabmap]: rtabmap: map_frame_id = map [rtabmap-5] [INFO] [1744279105.102135168] [rtabmap.rtabmap]: rtabmap: log_to_rosout_level = 4 [rtabmap-5] [INFO] [1744279105.102169244] [rtabmap.rtabmap]: rtabmap: initial_pose = [rtabmap-5] [INFO] [1744279105.102200419] [rtabmap.rtabmap]: rtabmap: use_action_for_goal = false [rtabmap-5] [INFO] [1744279105.102246555] [rtabmap.rtabmap]: rtabmap: tf_delay = 0.050000 [rtabmap-5] [INFO] [1744279105.102297507] [rtabmap.rtabmap]: rtabmap: tf_tolerance = 0.100000 [rtabmap-5] [INFO] [1744279105.102330103] [rtabmap.rtabmap]: rtabmap: odom_sensor_sync = false [rtabmap-5] [INFO] [1744279105.102361267] [rtabmap.rtabmap]: rtabmap: pub_loc_pose_only_when_localizing = false [rtabmap-5] [INFO] [1744279105.102391324] [rtabmap.rtabmap]: rtabmap: 
wait_for_transform = 0.200000 [rtabmap-5] [INFO] [1744279105.102447923] [rtabmap.rtabmap]: rtabmap: gen_scan = false [rtabmap-5] [INFO] [1744279105.102479333] [rtabmap.rtabmap]: rtabmap: gen_depth = false [rgbd_odometry-4] [INFO] [1744279105.128738119] [rtabmap.rgbd_odometry]: RGBDOdometry: approx_sync = true [rgbd_odometry-4] [INFO] [1744279105.128911985] [rtabmap.rgbd_odometry]: RGBDOdometry: approx_sync_max_interval = 0.000000 [rgbd_odometry-4] [INFO] [1744279105.128993451] [rtabmap.rgbd_odometry]: RGBDOdometry: topic_queue_size = 200 [rgbd_odometry-4] [INFO] [1744279105.129036879] [rtabmap.rgbd_odometry]: RGBDOdometry: sync_queue_size = 100 [rgbd_odometry-4] [INFO] [1744279105.129078968] [rtabmap.rgbd_odometry]: RGBDOdometry: qos = 2 [rgbd_odometry-4] [INFO] [1744279105.129124285] [rtabmap.rgbd_odometry]: RGBDOdometry: qos_camera_info = 2 [rgbd_odometry-4] [INFO] [1744279105.129174003] [rtabmap.rgbd_odometry]: RGBDOdometry: subscribe_rgbd = true [rgbd_odometry-4] [INFO] [1744279105.129218450] [rtabmap.rgbd_odometry]: RGBDOdometry: rgbd_cameras = 1 [rgbd_odometry-4] [INFO] [1744279105.129282844] [rtabmap.rgbd_odometry]: RGBDOdometry: keep_color = false [rgbd_odometry-4] [INFO] [1744279105.134172702] [rtabmap.rgbd_odometry]: [rgbd_odometry-4] rgbd_odometry subscribed to: [rgbd_odometry-4] /rtabmap/rgbd_image [rtabmap-5] [INFO] [1744279105.177352766] [rtabmap.rtabmap]: Setting RTAB-Map parameter "Grid/MaxGroundHeight"="-1.0" [rtabmap-5] [INFO] [1744279105.177736515] [rtabmap.rtabmap]: Setting RTAB-Map parameter "Grid/MaxObstacleHeight"="0.9" [rtabmap-5] [INFO] [1744279105.193968613] [rtabmap.rtabmap]: Setting RTAB-Map parameter "Mem/IncrementalMemory"="true" [rtabmap-5] [INFO] [1744279105.194227027] [rtabmap.rtabmap]: Setting RTAB-Map parameter "Mem/InitWMWithAllNodes"="false" [rtabmap-5] [INFO] [1744279105.208179929] [rtabmap.rtabmap]: Setting RTAB-Map parameter "Reg/Force3DoF"="false" [rtabmap-5] [INFO] [1744279105.229759716] [rtabmap.rtabmap]: RTAB-Map detection rate = 1.000000 Hz [rtabmap-5] [INFO] [1744279105.230399584] [rtabmap.rtabmap]: rtabmap: Deleted database "/home/lenovo/.ros/rtabmap.db" (--delete_db_on_start or -d are set). [rtabmap-5] [INFO] [1744279105.230485525] [rtabmap.rtabmap]: rtabmap: Using database from "/home/lenovo/.ros/rtabmap.db" (0 MB). [rtabmap-5] [INFO] [1744279105.487230257] [rtabmap.rtabmap]: rtabmap: Database version = "0.21.9". [rtabmap-5] [INFO] [1744279105.487427223] [rtabmap.rtabmap]: rtabmap: SLAM mode (Mem/IncrementalMemory=true) [rtabmap_viz-6] [INFO] [1744279105.575713025] [rtabmap.rtabmap_viz]: rtabmap_viz: Using configuration from "/home/lenovo/.ros/rtabmapGUI.ini" [rtabmap-5] [INFO] [1744279105.576567228] [rtabmap.rtabmap]: Setup callbacks [rtabmap-5] [WARN] [1744279105.576863296] [rtabmap.rtabmap]: rtabmap: Parameters subscribe_depth and subscribe_rgbd cannot be true at the same time. Parameter subscribe_depth is set to false. 
[rtabmap-5] [INFO] [1744279105.577154782] [rtabmap.rtabmap]: rtabmap: subscribe_depth = false [rtabmap-5] [INFO] [1744279105.577201665] [rtabmap.rtabmap]: rtabmap: subscribe_rgb = false [rtabmap-5] [INFO] [1744279105.577229329] [rtabmap.rtabmap]: rtabmap: subscribe_stereo = false [rtabmap-5] [INFO] [1744279105.577256035] [rtabmap.rtabmap]: rtabmap: subscribe_rgbd = true (rgbd_cameras=1) [rtabmap-5] [INFO] [1744279105.577285097] [rtabmap.rtabmap]: rtabmap: subscribe_sensor_data = false [rtabmap-5] [INFO] [1744279105.577317865] [rtabmap.rtabmap]: rtabmap: subscribe_odom_info = true [rtabmap-5] [INFO] [1744279105.577356787] [rtabmap.rtabmap]: rtabmap: subscribe_user_data = false [rtabmap-5] [INFO] [1744279105.577399027] [rtabmap.rtabmap]: rtabmap: subscribe_scan = false [rtabmap-5] [INFO] [1744279105.577427870] [rtabmap.rtabmap]: rtabmap: subscribe_scan_cloud = false [rtabmap-5] [INFO] [1744279105.577598711] [rtabmap.rtabmap]: rtabmap: subscribe_scan_descriptor = false [rtabmap-5] [INFO] [1744279105.577641069] [rtabmap.rtabmap]: rtabmap: topic_queue_size = 200 [rtabmap-5] [INFO] [1744279105.577678517] [rtabmap.rtabmap]: rtabmap: sync_queue_size = 100 [rtabmap-5] [INFO] [1744279105.577703721] [rtabmap.rtabmap]: rtabmap: qos_image = 2 [rtabmap-5] [INFO] [1744279105.577731381] [rtabmap.rtabmap]: rtabmap: qos_camera_info = 2 [rtabmap-5] [INFO] [1744279105.577759390] [rtabmap.rtabmap]: rtabmap: qos_scan = 2 [rtabmap-5] [INFO] [1744279105.577790550] [rtabmap.rtabmap]: rtabmap: qos_odom = 2 [rtabmap-5] [INFO] [1744279105.577815531] [rtabmap.rtabmap]: rtabmap: qos_user_data = 2 [rtabmap-5] [INFO] [1744279105.577840570] [rtabmap.rtabmap]: rtabmap: approx_sync = true [rtabmap-5] [INFO] [1744279105.577923744] [rtabmap.rtabmap]: Setup rgbd callback [rtabmap-5] [INFO] [1744279105.593823898] [rtabmap.rtabmap]: [rtabmap-5] rtabmap subscribed to (approx sync): [rtabmap-5] /rtabmap/rgbd_image
[rtabmap-5] /rtabmap/odom_info [rtabmap_viz-6] libpng warning: iCCP: known incorrect sRGB profile [rtabmap_viz-6] libpng warning: iCCP: known incorrect sRGB profile [rtabmap_viz-6] libpng warning: iCCP: known incorrect sRGB profile [rtabmap_viz-6] [WARN] [1744279107.568766559] [rcl.logging_rosout]: Publisher already registered for provided node name. If this is due to multiple nodes with the same name then all logs for that logger name will go out over the existing publisher. As soon as any node with that name is destructed it will unregister the publisher, preventing any further logs for that name from being published on the rosout topic. [rtabmap_viz-6] [INFO] [1744279107.580981712] [rtabmap.rtabmap_viz]: Reading parameters from the ROS server... [rtabmap_viz-6] [INFO] [1744279107.753420904] [rtabmap.rtabmap_viz]: Parameters read = 383 [rtabmap_viz-6] [INFO] [1744279107.753623452] [rtabmap.rtabmap_viz]: Parameters successfully read. [rtabmap_viz-6] [WARN] [1744279108.475869849] [rtabmap.rtabmap_viz]: rtabmap: Parameters subscribe_depth and subscribe_rgbd cannot be true at the same time. Parameter subscribe_depth is set to false. [rtabmap_viz-6] [INFO] [1744279108.476356402] [rtabmap.rtabmap_viz]: rtabmap_viz: subscribe_depth = false [rtabmap_viz-6] [INFO] [1744279108.476401643] [rtabmap.rtabmap_viz]: rtabmap_viz: subscribe_rgb = false [rtabmap_viz-6] [INFO] [1744279108.476432728] [rtabmap.rtabmap_viz]: rtabmap_viz: subscribe_stereo = false [rtabmap_viz-6] [INFO] [1744279108.476464582] [rtabmap.rtabmap_viz]: rtabmap_viz: subscribe_rgbd = true (rgbd_cameras=1) [rtabmap_viz-6] [INFO] [1744279108.476526841] [rtabmap.rtabmap_viz]: rtabmap_viz: subscribe_sensor_data = false [rtabmap_viz-6] [INFO] [1744279108.476557065] [rtabmap.rtabmap_viz]: rtabmap_viz: subscribe_odom_info = true [rtabmap_viz-6] [INFO] [1744279108.476577657] [rtabmap.rtabmap_viz]: rtabmap_viz: subscribe_user_data = false [rtabmap_viz-6] [INFO] [1744279108.476589759] [rtabmap.rtabmap_viz]: rtabmap_viz: subscribe_scan = false [rtabmap_viz-6] [INFO] [1744279108.476606225] [rtabmap.rtabmap_viz]: rtabmap_viz: subscribe_scan_cloud = false [rtabmap_viz-6] [INFO] [1744279108.476618344] [rtabmap.rtabmap_viz]: rtabmap_viz: subscribe_scan_descriptor = false [rtabmap_viz-6] [INFO] [1744279108.476632597] [rtabmap.rtabmap_viz]: rtabmap_viz: topic_queue_size = 200 [rtabmap_viz-6] [INFO] [1744279108.476651685] [rtabmap.rtabmap_viz]: rtabmap_viz: sync_queue_size = 100 [rtabmap_viz-6] [INFO] [1744279108.476667688] [rtabmap.rtabmap_viz]: rtabmap_viz: qos_image = 2 [rtabmap_viz-6] [INFO] [1744279108.476684534] [rtabmap.rtabmap_viz]: rtabmap_viz: qos_camera_info = 2 [rtabmap_viz-6] [INFO] [1744279108.476698332] [rtabmap.rtabmap_viz]: rtabmap_viz: qos_scan = 2 [rtabmap_viz-6] [INFO] [1744279108.476711314] [rtabmap.rtabmap_viz]: rtabmap_viz: qos_odom = 2 [rtabmap_viz-6] [INFO] [1744279108.476732873] [rtabmap.rtabmap_viz]: rtabmap_viz: qos_user_data = 2 [rtabmap_viz-6] [INFO] [1744279108.476748553] [rtabmap.rtabmap_viz]: rtabmap_viz: approx_sync = true [rtabmap_viz-6] [INFO] [1744279108.476824183] [rtabmap.rtabmap_viz]: Setup rgbd callback [rtabmap_viz-6] [INFO] [1744279108.491751217] [rtabmap.rtabmap_viz]: [rtabmap_viz-6] rtabmap_viz subscribed to (approx sync): [rtabmap_viz-6] /rtabmap/rgbd_image
[rtabmap_viz-6] /rtabmap/odom_info [rtabmap_viz-6] [INFO] [1744279108.500613215] [rtabmap.rtabmap_viz]: rtabmap_viz started. [rgbd_sync-3] [WARN] [1744279109.903039058] [rtabmap.rgbd_sync]: rgbd_sync: Did not receive data since 5 seconds! Make sure the input topics are published ("$ rostopic hz my_topic") and the timestamps in their header are set. Ajusting topic_queue_size (200) and sync_queue_size (100) can also help for better synchronization if framerates and/or delays are different. [rgbd_sync-3] rgbd_sync subscribed to (approx sync): [rgbd_sync-3] /camera/camera_chest/color/image_raw_relay, [rgbd_sync-3] /camera/camera_chest/aligned_depth_to_color/image_raw_relay, [rgbd_sync-3] /camera/camera_chest/aligned_depth_to_color/camera_info [rgbd_odometry-4] [WARN] [1744279110.143494648] [rtabmap.rgbd_odometry]: rgbd_odometry: Did not receive data since 5 seconds! Make sure the input topics are published ("$ rostopic hz my_topic") and the timestamps in their header are set. [rgbd_odometry-4] rgbd_odometry subscribed to: [rgbd_odometry-4] /rtabmap/rgbd_image [rtabmap-5] [WARN] [1744279110.603173261] [rtabmap.rtabmap]: rtabmap: Did not receive data since 5 seconds! Make sure the input topics are published ("$ ros2 topic hz my_topic") and the timestamps in their header are set. If topics are coming from different computers, make sure the clocks of the computers are synchronized ("ntpdate"). Ajusting topic_queue_size (200) and sync_queue_size (100) can also help for better synchronization if framerates and/or delays are different. If topics are not published at the same rate, you could increase "sync_queue_size" and/or "topic_queue_size" parameters (current=100 and 200 respectively). [rtabmap-5] rtabmap subscribed to (approx sync): [rtabmap-5] /rtabmap/rgbd_image
[rtabmap-5] /rtabmap/odom_info [rtabmap_viz-6] [WARN] [1744279113.500380021] [rtabmap.rtabmap_viz]: rtabmap_viz: Did not receive data since 5 seconds! Make sure the input topics are published ("$ ros2 topic hz my_topic") and the timestamps in their header are set. If topics are coming from different computers, make sure the clocks of the computers are synchronized ("ntpdate"). Ajusting topic_queue_size (200) and sync_queue_size (100) can also help for better synchronization if framerates and/or delays are different. If topics are not published at the same rate, you could increase "sync_queue_size" and/or "topic_queue_size" parameters (current=100 and 200 respectively). [rtabmap_viz-6] rtabmap_viz subscribed to (approx sync): [rtabmap_viz-6] /rtabmap/rgbd_image
[rtabmap_viz-6] /rtabmap/odom_info

rtabmap_viz shows nothing.

Image

Yesterday, when I tried, I used the parameter compressed:=false (I don't remember how I set the sync-related parameters). rtabmap_viz and rviz worked properly, but only for a few seconds (displaying the 3D point cloud, RGB, and depth images). After that, they stopped updating, and the logs showed warnings such as "Did not receive data since 5 seconds". I assumed the issue was due to the images not being compressed for remote operation. Today, I set compressed:=true and tried to adjust the sync-related parameters, but I haven't been able to get it to run properly even once.

Topic rates measured with ros2 topic hz:

ros2 topic hz /camera/camera_chest/aligned_depth_to_color/image_raw -> ~17 Hz
ros2 topic hz /camera/camera_chest/aligned_depth_to_color/image_raw/compressed -> ~23 Hz
ros2 topic hz /camera/camera_chest/aligned_depth_to_color/image_raw_relay -> ~24 Hz
ros2 topic hz /camera/camera_chest/aligned_depth_to_color/image_raw/compressedDepth -> ~24 Hz
ros2 topic hz /rtabmap/rgbd_image -> no messages
ros2 topic hz /rtabmap/rgbd_image/compressed -> no messages

Any guidance would be greatly appreciated.

DonghanChen, Apr 10 '25 10:04

Hi, I just reproduced the situation from yesterday, using compressed:=false and setting approx_sync:=true, rgbd_sync:=true, and approx_rgbd_sync:=true.

As shown in the figures, warnings are reported most of the time, and only occasionally does it run properly for one or two frames.

Image
Image

Additionally:

ros2 topic hz /rtabmap/rgbd_image
WARNING: topic [/rtabmap/rgbd_image] does not appear to be published yet
average rate: 0.108   min: 0.027s  max: 64.098s  std dev: 22.40321s  window: 7
average rate: 0.147   min: 0.027s  max: 64.098s  std dev: 17.68351s  window: 14
average rate: 0.159   min: 0.027s  max: 64.098s  std dev: 16.05701s  window: 20

ros2 topic hz /rtabmap/rgbd_image/compressed
average rate: 0.228   min: 0.043s  max: 30.300s  std dev: 10.57953s  window: 7
average rate: 0.212   min: 0.043s  max: 30.409s  std dev: 10.92865s  window: 13
average rate: 0.206   min: 0.043s  max: 30.409s  std dev: 11.05138s  window: 19
average rate: 0.220   min: 0.043s  max: 30.409s  std dev: 10.75164s  window: 27

DonghanChen, Apr 10 '25 10:04

Are you doing remote mapping just because you want to visualize remotely, not because the robot doesn't have enough power to do the mapping itself? If that is the case, launch all nodes on the robot except rtabmap_viz. You can start rtabmap_viz on the laptop like this:

ros2 run rtabmap_viz rtabmap_viz --ros-args -r __ns:=/rtabmap

This sets rtabmap_viz to the namespace in which the rtabmap node is started, so that topics and parameters are connected by default.

From the warning messages, it seems rgbd_odometry doesn't receive /rtabmap/rgbd_image for some time, maybe a network latency/bandwidth issue. Were your ros2 topic hz checks run on the robot or on the laptop? And at the same time as the rgbd_odometry warnings?

matlabbe, Apr 13 '25 19:04

Hi, thank you very much for your suggestions—they solved my visualization and mapping issues.

I've encountered a new issue with localization. Following other issues, I am using the robot_localization package for localization. This works well on my wheeled robot because its chassis provides wheel encoder odometry: by fusing visual/ICP odometry with the wheel encoder odometry, I can achieve stable and robust localization.

However, this approach cannot be applied to my humanoid robot since it is bipedal and lacks wheel encoders—it only has an IMU. When I fuse the filtered IMU data (using the imu_filter_madgwick package) with visual odometry (VO), the robot’s localization continuously drifts. I am quite sure that the drift is caused by the IMU because when I publish only VO, there is no drift; or if I set the IMU’s ax parameter to false, the drift occurs only in the X direction.

Here is my ekf.yaml config:

ekf_filter_node:
  ros__parameters:
    frequency: 30.0  # publishing frequency
    sensor_timeout: 0.2
    two_d_mode: false   # set to true if operating exclusively on a plane, otherwise false
    map_frame: map
    odom_frame: odom_nav
    base_link_frame: base_link
    world_frame: odom_nav  # often 'odom' for EKF
    publish_tf: true   # allow robot_localization to publish the TF from odom to base_link

    # publish the fused odometry
    publish_filtered_odometry: true
    publish_filtered_transform: true

    # # subscribe to wheel odometry
    # odom0: "wheel_odom"  # topic name for wheel odometry
    # odom0_config: 
    #   [true,  true,  true,  # x, y, z
    #    true, true, true,   # roll, pitch, yaw
    #    true, true, true,  # vx, vy, vz
    #    false, false, false,  # vroll, vpitch, vyaw
    #    false, false, false]  # ax, ay, az

    # odom0_queue_size: 1
    # odom0_nodelay: false
    # odom0_differential: false
    # odom0_relative: false

    # subscribe to visual odometry
    odom1: "visual_odometry"  # topic name for visual odometry, adjust according to your setup
    odom1_config: 
      [true,  true,  true,  # x, y, z
       false, false, false,   # roll, pitch, yaw
       true, true, true,  # vx, vy, vz
       true, true, true,  # vroll, vpitch, vyaw
       false, false, false]  # ax, ay, az

    odom1_queue_size: 2
    odom1_nodelay: false
    odom1_differential: false
    odom1_relative: false
    odom1_pose_rejection_threshold: 0.8
    odom1_twist_rejection_threshold: 0.8

    # use_control: true
    # stamped_control: false
    # control_config: [false, false, false,  # x, y, z
    #                   false, false, true,  # roll, pitch, yaw
    #                   true,  false,  false]   # vx, vy, vz

     # ===============  IMU Input (imu0)  ===============
    imu0: /rtabmap/imu       # <-- enter your IMU topic here
    imu0_config: [false, false, false,  # [x, y, z,
                  true,  true,  true,   #  roll, pitch, yaw,
                  false, false, false,  #  vx, vy, vz,
                  false,  false,  true,   #  vroll, vpitch, vyaw,
                  true, false, false]     #  ax, ay, az]

    imu0_differential: false
    imu0_relative: false
    imu0_queue_size: 5
    imu0_pose_rejection_threshold: 0.2
    imu0_twist_rejection_threshold: 0.2
    imu0_linear_acceleration_rejection_threshold: 0.8
    imu0_remove_gravitational_acceleration: true
    # if gravitational acceleration is not removed, you need to handle it yourself or use gravity-compensated data from the IMU driver

    # ===============  Other General Settings  ===============
    # whether to publish acceleration information
    publish_acceleration: false

    # covariance initialization (optional)
    # process_noise_covariance, initial_estimate_covariance, etc.
    # can also be set according to the actual sensor accuracy

    # QoS settings (ensure that the publisher and subscriber QoS are consistent)
    # odom0_qos:
    #   reliability: reliable
    #   durability: volatile
    odom1_qos:
      reliability: reliable
      durability: volatile
    imu0_qos:
      reliability: reliable
      durability: volatile

    # other parameters
    # odom0_pose_rejection_threshold: 5.0
    # odom0_twist_rejection_threshold: 1.0


    process_noise_covariance: [0.05, 0.0,    0.0,    0.0,    0.0,    0.0,    0.0,     0.0,     0.0,    0.0,    0.0,    0.0,    0.0,    0.0,    0.0,
                                   0.0,    0.05, 0.0,    0.0,    0.0,    0.0,    0.0,     0.0,     0.0,    0.0,    0.0,    0.0,    0.0,    0.0,    0.0,
                                   0.0,    0.0,    0.06, 0.0,    0.0,    0.0,    0.0,     0.0,     0.0,    0.0,    0.0,    0.0,    0.0,    0.0,    0.0,
                                   0.0,    0.0,    0.0,    0.03, 0.0,    0.0,    0.0,     0.0,     0.0,    0.0,    0.0,    0.0,    0.0,    0.0,    0.0,
                                   0.0,    0.0,    0.0,    0.0,    0.03, 0.0,    0.0,     0.0,     0.0,    0.0,    0.0,    0.0,    0.0,    0.0,    0.0,
                                   0.0,    0.0,    0.0,    0.0,    0.0,    0.06, 0.0,     0.0,     0.0,    0.0,    0.0,    0.0,    0.0,    0.0,    0.0,
                                   0.0,    0.0,    0.0,    0.0,    0.0,    0.0,    0.1, 0.0,     0.0,    0.0,    0.0,    0.0,    0.0,    0.0,    0.0,
                                   0.0,    0.0,    0.0,    0.0,    0.0,    0.0,    0.0,     0.025, 0.0,    0.0,    0.0,    0.0,    0.0,    0.0,    0.0,
                                   0.0,    0.0,    0.0,    0.0,    0.0,    0.0,    0.0,     0.0,     0.04, 0.0,    0.0,    0.0,    0.0,    0.0,    0.0,
                                   0.0,    0.0,    0.0,    0.0,    0.0,    0.0,    0.0,     0.0,     0.0,    0.01, 0.0,    0.0,    0.0,    0.0,    0.0,
                                   0.0,    0.0,    0.0,    0.0,    0.0,    0.0,    0.0,     0.0,     0.0,    0.0,    0.01, 0.0,    0.0,    0.0,    0.0,
                                   0.0,    0.0,    0.0,    0.0,    0.0,    0.0,    0.0,     0.0,     0.0,    0.0,    0.0,    0.02, 0.0,    0.0,    0.0,
                                   0.0,    0.0,    0.0,    0.0,    0.0,    0.0,    0.0,     0.0,     0.0,    0.0,    0.0,    0.0,    0.01, 0.0,    0.0,
                                   0.0,    0.0,    0.0,    0.0,    0.0,    0.0,    0.0,     0.0,     0.0,    0.0,    0.0,    0.0,    0.0,    0.01, 0.0,
                                   0.0,    0.0,    0.0,    0.0,    0.0,    0.0,    0.0,     0.0,     0.0,    0.0,    0.0,    0.0,    0.0,    0.0,    0.015]

use_control is not used because the chassis control module has not been integrated yet. I tried setting the process_noise_covariance for ax to a very small value (thus reducing the trust in the sensor's ax, if I understood correctly), but the robot still accumulates slight drift in the X direction.

So, should I avoid using IMU data when only an IMU is available and there is no wheel encoder? Also, I am not sure whether the IMU has been calibrated, as I was not present during the robot's installation. However, I suspect that calibration cannot completely eliminate errors; even if the errors are very small, the continuous accumulation in acceleration could eventually lead to divergence in localization. Should I also set ax to false? I set ay to false with the consideration that, in most cases, the robot will not autonomously produce lateral movements. But if I disregard the translational acceleration from the IMU, then what is the point of using it? Does it only contribute to roll, pitch, and yaw?

Additionally, could this issue be related to the difference in publishing frequencies between the IMU and VO? The IMU publishes very fast, whereas the VO publishes at a slower rate.

DonghanChen, Apr 16 '25 12:04

Fusing the IMU's acceleration directly can be tricky and will drift most of the time. The accelerometer is mostly useful for estimating the IMU's absolute orientation in roll and pitch. Since you are already using the Madgwick filter, which fuses the acceleration, I would just send the roll/pitch/yaw to robot_localization.

The X-Y-Z components would then be computed solely from VO.
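
As an illustration only, here is a minimal sketch of how the imu0 block in the ekf.yaml above could look when following this suggestion (fusing only the Madgwick-filtered roll/pitch/yaw and leaving position and velocities to VO); all keys shown already exist in the posted config:

    imu0: /rtabmap/imu
    imu0_config: [false, false, false,   # x, y, z (left to VO)
                  true,  true,  true,    # roll, pitch, yaw (absolute orientation from Madgwick)
                  false, false, false,   # vx, vy, vz
                  false, false, false,   # vroll, vpitch, vyaw
                  false, false, false]   # ax, ay, az (raw accelerations not fused)
    imu0_differential: false
    imu0_relative: false
    imu0_queue_size: 5

Since the Madgwick filter runs without a magnetometer, the reported yaw can drift; setting imu0_differential: true, or fusing only roll/pitch, is a possible variation of this sketch.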

matlabbe, Apr 19 '25 23:04

Hi, thank you very much for your suggestions.

This week I tried disabling acceleration fusion and enabling orientation fusion, but once mapping started, the odometry drifted in orientation. For the VO and IMU inputs, should I set both the orientation and the angular velocity to true? By the way, I'm still using the default F2M odometry strategy. My understanding is that in ekf.yaml, for VO I should set odom1_relative: true, and for the IMU I should set imu0_differential: true. Is that correct?

I have another question about my setup. The robot is a bipedal humanoid, and its sensors are rigidly mounted to base_link. While the robot is walking, it inevitably experiences small roll and pitch motions (they seem minor in my observations). Because of this I set Reg/Force3DoF to false: I worry that limiting the system to 3 DoF might let small tilts introduce pose errors in ICP or visual registration. However, we currently operate only on flat indoor floors. Is Reg/Force3DoF intended to handle body wobble, uneven floors, or both? I'm not sure whether keeping it false is the right choice.

Finally, I’ve been using the default F2M odometry strategy, but rtabmap supports many strategies. Which one would you recommend for my case?

Robot: humanoid, bipedal walking
Sensors: RGB-D camera, LiDAR, IMU, fisheye camera
Environment: small- to medium-scale indoor, flat floors
Computation: ample resources available

Sorry for the many questions, and thank you again for your help!

DonghanChen, Apr 27 '25 08:04

When I use only the visual odometry (VO) from RTAB-Map, the mapping results are quite good. However, once I fuse the IMU data using robot_localization, the pose during the mapping phase starts to drift significantly in orientation.

Then I realized that RTAB-Map also uses the IMU during mapping. In the example launch files, there is an imu_topic argument with the description: "Used with VIO approaches and for SLAM graph optimization (gravity constraints)."

This topic is subscribed by the rgbd_odometry node. However, I couldn't find the relevant part in the source code that shows exactly how the IMU data is used, so I'm not sure what kind of integration is being performed.

That brings me to my question: if RTAB-Map is already using the IMU data, is it redundant or even problematic to fuse the IMU again using robot_localization? Could this be the cause of the orientation drift I'm seeing?

DonghanChen, Apr 29 '25 07:04

The IMU is used by rgbd_odometry to help feature tracking (by setting a guess with the predicted rotation) and also to add gravity constraints in the local bundle adjustment (F2M). So the IMU is used more as an input to VO; it is not really fused with VO (well, maybe a little, as constraints in the local bundle adjustment). Because it is VO first, a very fast rotation or losing track of features can still make the odometry lost, unlike VIO approaches that can estimate the pose for "some time" from the IMU alone while visual features are lost. VIO approaches may also provide a pose at a higher frequency than the camera frame rate. Note that to make rgbd_odometry use the IMU, the wait_imu_to_init parameter should be true.
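
With the launch file used earlier in this thread, this corresponds to passing wait_imu_to_init:=true (it was false in the posted command). As a rough sketch only, the same parameter could also be set through a ROS 2 params file; the node and namespace names below are taken from the log output above, and the file name is just an example:

# e.g. odometry_params.yaml (hypothetical file name)
/rtabmap/rgbd_odometry:
  ros__parameters:
    wait_imu_to_init: true   # wait for an IMU message before initializing odometry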

For the EKF, the IMU's orientation will drift in yaw (the Madgwick filter is running without a magnetometer). When fused with VO, this can make the whole output drift in yaw.

Regarding whether you should enable Reg/Force3DoF or not: I think you should keep it false. If your base_link frame can have some roll and pitch, you should not force 3DoF.

matlabbe, May 02 '25 06:05

Hi Mathieu,

Thanks for your reply.

I've temporarily given up on fusing VO and IMU because it continually results in drift. Since I have access to both RGB-D cameras and LiDAR, I'm currently attempting a more robust localization by fusing VO and ICP. I've managed to fuse these two odometry sources in robot_localization by remapping their topics and disabling TF publishing. Unfortunately, I'm still facing oscillation issues in the fused odometry output.

In issue #1263, regarding a similar question on fusing VO and ICP, your recommendation was to switch to an alternate odometry source when one fails, rather than directly fusing them. Does this imply that directly fusing VO and ICP in robot_localization is not advisable? If fusion can still be a viable option, could you please offer some guidance on how to avoid the oscillation issue when combining multiple odometry inputs?

DonghanChen, May 09 '25 03:05

Hi Mathieu,

Regarding the simultaneous use of LiDAR and RGB-D, I've done some additional research. Based on my understanding, I summarized four potential solutions (I'm not sure if I understood them correctly, and perhaps these approaches can be combined?):

  1. Using the Reg/Strategy parameter: RTAB-Map provides a Reg/Strategy parameter (2=Visual+ICP). Does this mean RTAB-Map itself can directly fuse VO and ICP odometry without the need for an additional robot_localization module? If so, how should I properly configure it? RTAB-Map seems to accept only one odometry topic; if I enable publish_tf, wouldn't simultaneously publishing VO and ICP TF cause conflicts?

  2. Approach mentioned in the RTAB-Map tutorial: According to the Kinect + Odometry + 2D laser tutorial (http://wiki.ros.org/rtabmap_ros/Tutorials/SetupOnYourRobot), it seems that RGB-D and LiDAR can be used simultaneously without the robot_localization module. My understanding of this solution is that RGB-D provides the primary odometry, while LiDAR ICP acts as a supplementary method, especially for loop closures when the robot returns in the opposite direction. Also, LiDAR can provide a more robust OctoMap.

  3. Using robot_localization to fuse RTAB-Map's VO and ICP odometry: As mentioned in my previous comment, this involves fusing VO and ICP odometry within robot_localization and then feeding the resulting combined odometry back to RTAB-Map.

  4. The alternative approach mentioned in issue #1263: You previously suggested using VO or ICP odometry individually, where one acts as a fallback if the other fails.

By the way, since my bipedal robot cannot provide stable wheel odometry, I have to rely on multi-sensor fusion for robust localization.

DonghanChen, May 09 '25 11:05

robot_localization can only fuse the outputs of each odometry sub-system in a loosely-coupled way, meaning that lidar cannot help visual odometry or vice versa, and it makes the strong assumption that the computed covariances are right. In your case, if VO thinks the robot is going left and lidar odometry thinks the robot is going right, both with high confidence, which one should robot_localization trust more? There are odometry approaches that provide a more tightly-coupled fusion, with lidar and vision contributing to the same odometry estimation. You can look on this page for approaches of the "Visual 3D LiDAR" type.

For your specific questions:

  1. In ROS 1, I had an experimental rgbdicp_odometry node that could use Reg/Strategy=2, though it has never been ported to ROS 2. It was, however, more of a lidar odometry with vision helping the motion prediction between scans when it could (so vision could get lost), following this logic where the child is the ICP registration. In the end, if ICP failed, odometry was lost. The same logic applies if you use Reg/Strategy=2 with the rtabmap node for loop closures.
  2. In that example, the only source of odometry was the wheel odometry. The RGB-D and lidar data were used only for loop closure detection and occupancy grid generation. And yes, the lidar can also be used for proximity detection (when moving in the opposite direction of the camera).
  3. Yes. I think for a good fusion you should fuse only the twists of both odometries, or use the differential option in robot_localization (that is something recommended in their documentation, if I remember correctly). However, as in my introduction, if the VO and lidar odometries send velocities in opposite directions, the EKF may oscillate. So make sure at least that the odometries produced by both approaches are relative to the same base frame (see the sketch after this list).
  4. Yes, similarly to 3.
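
For point 3, here is a minimal sketch (not a tested configuration) of what twist-only fusion could look like in robot_localization, reusing the frame names from the ekf.yaml posted above; icp_odometry is a placeholder for whatever topic the lidar odometry actually publishes:

ekf_filter_node:
  ros__parameters:
    frequency: 30.0
    world_frame: odom_nav
    odom_frame: odom_nav
    base_link_frame: base_link
    publish_tf: true

    # visual odometry: fuse velocities only, ignore absolute poses
    odom0: visual_odometry
    odom0_config: [false, false, false,   # x, y, z
                   false, false, false,   # roll, pitch, yaw
                   true,  true,  true,    # vx, vy, vz
                   true,  true,  true,    # vroll, vpitch, vyaw
                   false, false, false]   # ax, ay, az

    # lidar (ICP) odometry: also velocities only (placeholder topic name)
    odom1: icp_odometry
    odom1_config: [false, false, false,
                   false, false, false,
                   true,  true,  true,
                   true,  true,  true,
                   false, false, false]

Both inputs must report their twists relative to the same base frame (here base_link) for this to behave well.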

For a bipedal robot, if the robot doesn't have a kinematic model that can provide some odometry while walking, I would start with a vision-only or lidar-only approach, maybe combined with the IMU, to get a first working solution. These approaches are generally easier to configure and calibrate because they rely on only one sensor (plus the IMU). Combining camera and lidar in the same odometry estimation requires more careful calibration and time synchronization of the sensors.

There is a quadruped example with VO here: https://github.com/introlab/rtabmap_ros/tree/ros2/rtabmap_demos#champ-quadruped-nav2-elevation-map-and-vslam

I don't have a Lidar example for quadruped robot, but it would be similar to the husky one: https://github.com/introlab/rtabmap_ros/tree/ros2/rtabmap_demos#clearpath-husky-nav2-3d-lidar-and-rgb-d-slam

matlabbe, May 17 '25 21:05