
Not able to get any visual odometry on Rtabmap_viz

Open jbodhey opened this issue 1 year ago • 7 comments

I am using Ubuntu 22.04 (ROS 2 Humble) with an Intel RealSense D435i. I want to get the odometry path (trajectory) of the camera. I followed the steps described on GitHub, but I am not able to run any of the examples from the rtabmap_examples folder. For example, when I run realsense_d435i_color.launch.py, I don't see anything in the rtabmap_viz window. I use these commands:

$ ros2 launch realsense2_camera rs_launch.py enable_gyro:=true enable_accel:=true unite_imu_method:=1 enable_sync:=true
$ ros2 launch rtabmap_examples realsense_d435i_color.launch.py

But I get this screen instead: [screenshot from 2024-10-15 23-36-12]

See the terminal warnings:

[rtabmap_viz-3] [WARN] [1729028214.191427533] [rtabmap_viz]: rtabmap_viz: Did not receive data since 5 seconds! Make sure the input topics are published ("$ ros2 topic hz my_topic") and the timestamps in their header are set. If topics are coming from different computers, make sure the clocks of the computers are synchronized ("ntpdate"). Parameter "approx_sync" is false, which means that input topics should have all the exact timestamp for the callback to be called.
[rtabmap_viz-3] rtabmap_viz subscribed to (exact sync):
[rtabmap_viz-3]    /odom \
[rtabmap_viz-3]    /camera/color/image_raw \
[rtabmap_viz-3]    /camera/realigned_depth_to_color/image_raw \
[rtabmap_viz-3]    /camera/color/camera_info \
[rtabmap_viz-3]    /odom_info
[rtabmap-2] [WARN] [1729028216.136727730] [rtabmap]: rtabmap: Did not receive data since 5 seconds! Make sure the input topics are published ("$ ros2 topic hz my_topic") and the timestamps in their header are set. If topics are coming from different computers, make sure the clocks of the computers are synchronized ("ntpdate"). Parameter "approx_sync" is false, which means that input topics should have all the exact timestamp for the callback to be called.
[rtabmap-2] rtabmap subscribed to (exact sync):
[rtabmap-2]    /odom \
[rtabmap-2]    /camera/color/image_raw \
[rtabmap-2]    /camera/realigned_depth_to_color/image_raw \
[rtabmap-2]    /camera/color/camera_info \
[rtabmap-2]    /odom_info

When I run ros2 topic list:

jeet@jeet:~/ros2_ws/src/realsense-ros/realsense2_camera/launch$ ros2 topic list

/camera/aligned_depth_to_color/image_raw
/camera/camera/accel/imu_info
/camera/camera/accel/metadata
/camera/camera/accel/sample
/camera/camera/color/camera_info
/camera/camera/color/image_raw
/camera/camera/color/metadata
/camera/camera/depth/camera_info
/camera/camera/depth/image_rect_raw
/camera/camera/depth/metadata
/camera/camera/extrinsics/depth_to_accel
/camera/camera/extrinsics/depth_to_color
/camera/camera/extrinsics/depth_to_gyro
/camera/camera/gyro/imu_info
/camera/camera/gyro/metadata
/camera/camera/gyro/sample
/camera/camera/imu
/camera/color/camera_info
/camera/color/image_raw
/diagnostics
/goal_pose
/gps/fix
/parameter_events
/rosout
/rtabmap/cloud_ground
/rtabmap/cloud_map
/rtabmap/cloud_obstacles
/rtabmap/elevation_map
/rtabmap/global_path
/rtabmap/global_path_nodes
/rtabmap/global_pose
/rtabmap/goal
/rtabmap/goal_node
/rtabmap/goal_reached
/rtabmap/grid_prob_map
/rtabmap/imu
/rtabmap/info
/rtabmap/initialpose
/rtabmap/labels
/rtabmap/landmark_detection
/rtabmap/landmark_detections
/rtabmap/landmarks
/rtabmap/local_grid_empty
/rtabmap/local_grid_ground
/rtabmap/local_grid_obstacle
/rtabmap/local_path
/rtabmap/local_path_nodes
/rtabmap/localization_pose
/rtabmap/map
/rtabmap/mapData
/rtabmap/mapGraph
/rtabmap/mapOdomCache
/rtabmap/mapPath
/rtabmap/octomap_binary
/rtabmap/octomap_empty_space
/rtabmap/octomap_full
/rtabmap/octomap_global_frontier_space
/rtabmap/octomap_grid
/rtabmap/octomap_ground
/rtabmap/octomap_obstacles
/rtabmap/octomap_occupied_space
/rtabmap/odom
/rtabmap/odom_info
/rtabmap/odom_info_lite
/rtabmap/odom_last_frame
/rtabmap/odom_local_map
/rtabmap/odom_local_scan_map
/rtabmap/odom_rgbd_image
/rtabmap/odom_sensor_data/compressed
/rtabmap/odom_sensor_data/features
/rtabmap/odom_sensor_data/raw
/rtabmap/rtabmap/republish_node_data
/tf
/tf_static
/user_data_async

I suspect the issue is that the camera topics are published under /camera/camera/... while the rtabmap nodes subscribe to /camera/... topics, so the subscriptions never match.

Please let me know why I am not able to see anything in rtabmap_viz, and how I can proceed to get the odometry path of the D435i camera.
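
For reference, this is roughly the remapping I imagine would be needed to point rtabmap at the /camera/camera/... topics listed above. It is only a sketch of my guess and is untested; the topic names are taken from the list above, and a properly aligned depth topic would likely still be needed for the color setup:

from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    # Guess: point the odometry inputs at the /camera/camera/... namespace that
    # the RealSense driver actually publishes (see the topic list above).
    remappings = [
        ('rgb/image',       '/camera/camera/color/image_raw'),
        ('rgb/camera_info', '/camera/camera/color/camera_info'),
        # Taken from the topic list; normally the depth image should be
        # registered/aligned to the color frame before being used here.
        ('depth/image',     '/camera/camera/depth/image_rect_raw')]

    return LaunchDescription([
        Node(
            package='rtabmap_odom', executable='rgbd_odometry', output='screen',
            parameters=[{'frame_id': 'camera_link', 'approx_sync': False}],
            remappings=remappings),
    ])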

jbodhey (Oct 15 '24, 21:10)

Can you try the latest version of that launch file: https://github.com/introlab/rtabmap_ros/blob/ros2/rtabmap_examples/launch/realsense_d435i_color.launch.py ? (Download the file and run ros2 launch realsense_d435i_color.launch.py directly.)

matlabbe (Oct 16 '24, 17:10)

Instead of the color launch file, I tried the stereo and infra ones, and they work better in my case. How can I check the distance travelled by the camera and compare it against the real distance to evaluate accuracy?

jbodhey (Oct 20 '24, 20:10)

@jbodhey see https://github.com/introlab/rtabmap_ros/issues/1221#issuecomment-2439676686

matlabbe (Oct 26 '24, 18:10)

Thanks, I can see the distance travelled now.

I have two queries:

  1. It updates images at a slow rate (about 1 image per second). How can I increase the frame rate?
  2. Where can I increase the number of features extracted per image? It should capture more features in the scene.


Current code:

import os

from ament_index_python.packages import get_package_share_directory

from launch import LaunchDescription
from launch_ros.actions import Node, SetParameter
from launch.actions import IncludeLaunchDescription
from launch.launch_description_sources import PythonLaunchDescriptionSource

def generate_launch_description():
    parameters=[{
          'frame_id':'camera_link',
          'subscribe_depth':True,
          'subscribe_odom_info':True,
          'approx_sync':False,
          'wait_imu_to_init':True}]

    remappings=[
          ('imu', '/imu/data'),
          ('rgb/image', '/camera/camera/infra1/image_rect_raw'),
          ('rgb/camera_info', '/camera/camera/infra1/camera_info'),
          ('depth/image', '/camera/camera/depth/image_rect_raw')]

    return LaunchDescription([

        #Hack to disable IR emitter
        SetParameter(name='depth_module.emitter_enabled', value=0),

        # Launch camera driver
        IncludeLaunchDescription(
            PythonLaunchDescriptionSource([os.path.join(
                get_package_share_directory('realsense2_camera'), 'launch'),
                '/rs_launch.py']),
                launch_arguments={'enable_gyro': 'true',
                                  'enable_accel': 'true',
                                  'unite_imu_method': '1',
                                  'enable_infra1': 'true',
                                  'enable_infra2': 'true',
                                  'enable_sync': 'true'}.items(),
        ),

        Node(
            package='rtabmap_odom', executable='rgbd_odometry', output='screen',
            parameters=parameters,
            remappings=remappings),

        Node(
            package='rtabmap_slam', executable='rtabmap', output='screen',
            parameters=parameters,
            remappings=remappings,
            arguments=['-d']),

        Node(
            package='rtabmap_viz', executable='rtabmap_viz', output='screen',
            parameters=parameters,
            remappings=remappings),
                
        # Compute quaternion of the IMU
        Node(
            package='imu_filter_madgwick', executable='imu_filter_madgwick_node', output='screen',
            parameters=[{'use_mag': False, 
                         'world_frame':'enu', 
                         'publish_tf':False}],
            remappings=[('imu/data_raw', '/camera/camera/imu')]),
        
        # The IMU frame is missing in TF tree, add it:
        Node(
            package='tf2_ros', executable='static_transform_publisher', output='screen',
            arguments=['0', '0', '0', '0', '0', '0', 'camera_gyro_optical_frame', 'camera_imu_optical_frame']),
    ])

jbodhey (Nov 06 '24, 21:11)

You can add 'Rtabmap/DetectionRate': '2' to parameters to increase the rate at which the map is updated (2 Hz in this example).

By default, a maximum of 1000 features is extracted per image. If you want more, add 'Vis/MaxFeatures': '2000' (for example) to parameters.

matlabbe (Nov 12 '24, 04:11)

Yes, I can set the parameters from the rtabmap_viz interface. Is there any way I can set these parameters in the code so that they stay fixed every time?

Also, I can see that rtabmap is currently not using my laptop's dedicated GPU (an NVIDIA RTX 3050). How can I make use of it to get better performance?

jbodhey (Nov 14 '24, 18:11)

Example from my previous post:

parameters=[{
          'frame_id':'camera_link',
          'subscribe_depth':True,
          'subscribe_odom_info':True,
          'approx_sync':False,
          'wait_imu_to_init':True,
          'Rtabmap/DetectionRate': '2',
          'Vis/MaxFeatures': '2000'}]

It should already use your GPU for at least the 3D rendering. With nvidia-smi, we can see rtabmap_viz using it.

For the rtabmap/visual odometry nodes, it depends on whether OpenCV was built with CUDA support and whether the corresponding Gpu parameters are enabled (see the sketch after the parameter list below):

rtabmap --params | grep Gpu
Param: FAST/Gpu = "false"                                  [GPU-FAST: Use GPU version of FAST. This option is enabled only if OpenCV is built with CUDA and GPUs are detected.]
Param: FAST/GpuKeypointsRatio = "0.05"                     [Used with FAST GPU.]
Param: GFTT/Gpu = "false"                                  [GPU-GFTT: Use GPU version of GFTT. This option is enabled only if OpenCV>=3 is built with CUDA and GPUs are detected.]
Param: ORB/Gpu = "false"                                   [GPU-ORB: Use GPU version of ORB. This option is enabled only if OpenCV is built with CUDA and GPUs are detected.]
Param: SIFT/Gpu = "false"                                  [CudaSift: Use GPU version of SIFT. This option is enabled only if RTAB-Map is built with CudaSift dependency and GPUs are detected.]
Param: SURF/GpuKeypointsRatio = "0.01"                     [Used with SURF GPU.]
Param: SURF/GpuVersion = "false"                           [GPU-SURF: Use GPU version of SURF. This option is enabled only if OpenCV is built with CUDA and GPUs are detected.]
Param: Stereo/Gpu = "false"                                [[Stereo/OpticalFlow=true] Enable GPU version of the optical flow approach (only available if OpenCV is built with CUDA).]
Param: Vis/CorFlowGpu = "false"                            [[Vis/CorType=1] Enable GPU version of the optical flow approach (only available if OpenCV is built with CUDA).]
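
As a concrete illustration, if OpenCV is built with CUDA, one of these parameters can be enabled the same way as the RTAB-Map parameters above. This is only a sketch under that assumption (and it assumes the default GFTT-based feature detector is in use); it simply extends the parameters dict from the launch file posted earlier:

parameters=[{
      'frame_id':'camera_link',
      'subscribe_depth':True,
      'subscribe_odom_info':True,
      'approx_sync':False,
      'wait_imu_to_init':True,
      'Rtabmap/DetectionRate': '2',
      'Vis/MaxFeatures': '2000',
      # Only takes effect when OpenCV is built with CUDA and a GPU is detected;
      # RTAB-Map parameters are passed as strings.
      'GFTT/Gpu': 'true'}]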

matlabbe (Nov 16 '24, 23:11)