
Odometry easily lost

Open MrOCW opened this issue 4 years ago • 10 comments

Hi @matlabbe, I am using a D435i with a Jetson to run RTAB-Map on ROS with RViz. I find that odometry is easily lost, even when the view is not featureless.

I get lots of warnings like this whenever the camera faces a new area that hasn't been mapped:

OdometryF2F.cpp:294::computeTransform() Registration failed: "Not enough inliers 0/20 (matches=12) between 0 and 0"

I also get these errors every now and then:

odometry: Could not get transform from odom to camera_link (stamp=1617684985.570663) after 0.200000 seconds ("wait_for_transform_duration"=0.200000)! Error="Lookup would require extrapolation into the future. Requested time 1617684985.570663214 but the latest data is at time 1617684984.665417194, when looking up transform from frame [camera_link] to frame [odom]. canTransform returned after 0.201946 timeout was 0.2."

Rtabmap.cpp:1148::process() RGB-D SLAM mode is enabled, memory is incremental but no odometry is provided. Image 14831 is ignored!

MrOCW avatar Apr 06 '21 05:04 MrOCW

Some events that can cause odometry lost: https://github.com/introlab/rtabmap/wiki/Kinect-mapping#lostodometry

What is the computation time of odometry when it is tracking? You may also want to do visual odometry with IR cameras instead of the RGB camera, you will get better field of view and less motion blur, see http://wiki.ros.org/rtabmap_ros/Tutorials/StereoHandHeldMapping#Bring-up_example_with_RealSense_cameras
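As a side note for anyone following the stereo IR route: the linked tutorial has you disable the D435i's IR projector, since its dot pattern would otherwise be tracked as (moving) features and corrupt stereo odometry. A sketch of the ROS1 command, assuming the default `/camera` namespace from realsense2_camera (the exact parameter path may differ between driver versions):

```
# Turn off the IR dot-pattern projector before running stereo odometry
rosrun dynamic_reconfigure dynparam set /camera/stereo_module emitter_enabled 0
```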

matlabbe avatar Apr 08 '21 12:04 matlabbe


@matlabbe How do I check the computation time of odometry? Is it the update time? The update time is longer than usual when the warnings appear.

Also, does the example you linked still use the depth camera for the D435i?

[ INFO] [1617950348.595065278]: Odom: quality=197, std dev=0.002557m|0.056744rad, update time=0.135430s

[ WARN] (2021-04-09 14:39:08.801) OdometryF2F.cpp:141::computeTransform() Failed to find a transformation with the provided guess (xyz=-0.007323,0.006732,-0.000843 rpy=-0.002262,-0.000847,0.051729), trying again without a guess.
[ WARN] (2021-04-09 14:39:08.964) OdometryF2F.cpp:171::computeTransform() Trial with no guess succeeded.

[ INFO] [1617950348.967375479]: Odom: quality=207, std dev=0.001881m|0.050171rad, update time=0.351689s

MrOCW avatar Apr 09 '21 06:04 MrOCW

Hi,

Yes, the update time in the log lines beginning with "Odom:" is the odometry update time. The stereo example uses the IR cameras of the D435i as a stereo pair; the depth image is not used. It is possible to use the IR stereo topics for the stereo_odometry node but the RGB-D topics for the rtabmap node, by launching the nodes separately.
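A hedged sketch of that split setup as a ROS1 launch fragment. The remapped topic names assume a default realsense2_camera launch with depth aligned to color; adjust them to your setup (stereo_odometry publishes, and rtabmap subscribes to, the default /odom topic, so no extra odom remap is needed here):

```xml
<!-- Visual odometry from the IR stereo pair (wider FOV, less motion blur) -->
<node pkg="rtabmap_ros" type="stereo_odometry" name="stereo_odometry">
   <remap from="left/image_rect"   to="/camera/infra1/image_rect_raw"/>
   <remap from="left/camera_info"  to="/camera/infra1/camera_info"/>
   <remap from="right/image_rect"  to="/camera/infra2/image_rect_raw"/>
   <remap from="right/camera_info" to="/camera/infra2/camera_info"/>
</node>

<!-- Mapping from the RGB-D topics, using the stereo odometry above -->
<node pkg="rtabmap_ros" type="rtabmap" name="rtabmap">
   <remap from="rgb/image"       to="/camera/color/image_raw"/>
   <remap from="rgb/camera_info" to="/camera/color/camera_info"/>
   <remap from="depth/image"     to="/camera/aligned_depth_to_color/image_raw"/>
</node>
```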

cheers, Mathieu

matlabbe avatar May 02 '21 23:05 matlabbe

Hi,

Any update on this issue? I have the same issue, where the map (the occupancy grid, /map topic) gets re-created all the time. By re-created, I mean it disappears and a new map with different dimensions (usually smaller) gets created. On the command line I get this warning:

OdometryF2M.cpp:525::computeTransform() Registration failed: "Not enough inliers 13/20 (matches=118) between -1 and 384"

When the camera is steady, the map does not get re-created. When the camera moves, re-creation happens. My initial thought was that the camera was moving too fast, but re-creation happens even with slow motion.

I took a look at this link https://github.com/introlab/rtabmap/wiki/Kinect-mapping#lostodometry, and the solutions mentioned there are not practical to implement, especially for a robot navigating autonomously:

  • Positioning the camera back to where it last had a map and proper odometry would require extra implementation for the robot to keep track of the last valid map and odom pose (which is unnecessary, since it is specific to rtabmap).
  • I tried the odometry reset option, but again this would be extra implementation just for rtabmap.

So, is there a way to prevent the map from re-creating even if we lose the odometry?

Odometry update time for me:

Odom: quality=303, std dev=0.002696m|0.036256rad, update time=0.117297s

Using: ROS2 Foxy on Ubuntu 20.04.

Regards, Nedim

nedo99 avatar Jun 02 '21 14:06 nedo99

What is your sensor setup? If you are using a D435, please don't use the RGB camera for visual odometry; use the left/right IR cameras instead (they have a wider field of view and lower motion blur).

On the general issue of visual odometry and autonomous navigation: even the best visual odometry approach is often not reliable enough for indoor robot navigation, because some areas simply have no visual features to track (featureless walls, darkness). A fisheye stereo camera helps a lot with robustness, even more so with a visual inertial odometry approach. If you are on a robot, consider also using wheel odometry, possibly combined with a lidar, which can be quite robust.

For the problem of a new map being re-created whenever odometry resets, set publish_null_when_lost to false on the odometry node, with Odom/ResetCountdown set to 1. Example for ROS1:

<node pkg="rtabmap_ros" type="rgbd_odometry" name="rgbd_odometry">
   <param name="publish_null_when_lost" value="false"/>
   <param name="Odom/ResetCountdown" value="1"/>
</node>
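Since the report above is on ROS2 Foxy, the same settings can go in a parameters YAML file passed to the odometry node. A sketch, assuming the node is named rgbd_odometry in your launch file (RTAB-Map's own parameters such as Odom/ResetCountdown are passed as strings):

```yaml
rgbd_odometry:
  ros__parameters:
    publish_null_when_lost: false
    Odom/ResetCountdown: "1"   # re-initialize odometry 1 frame after it is lost
```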

cheers, Mathieu

matlabbe avatar Jun 03 '21 03:06 matlabbe

Hi @matlabbe,

Thank you for your quick and detailed answer. Yes, we are using realsense D435i. I will try with your suggestions, but in the meantime, I have a follow-up question.

Since we have wheel odometry and the odom -> base_link transform present, and frame_id in the rtabmap launch file is set to base_link, shouldn't that mean we are in fact using the odometry coming from the wheels? Or do I need to set some explicit parameter in rtabmap? Also, is there a way to know when odometry is lost via some ROS topic?

Regards, Nedim

nedo99 avatar Jun 04 '21 08:06 nedo99

Hi Nedim,

To know when odometry is lost: the odometry node sends a null pose in its odom topic (the pose quaternion will be invalid: x=0, y=0, z=0, w=0), and the odom_info topic will have its lost field set to true.
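For example, a minimal check for that null-pose convention. This helper is illustrative (its name is mine, not part of rtabmap_ros); in practice you would call it from a nav_msgs/Odometry subscriber callback with the four quaternion components of msg.pose.pose.orientation:

```python
def is_odometry_lost(qx, qy, qz, qw):
    """Return True when the pose quaternion is the all-zero 'null' pose
    that rtabmap's odometry nodes publish when tracking is lost.
    A valid orientation quaternion always has unit norm, so all-zero
    components can never occur during normal tracking."""
    return qx == 0.0 and qy == 0.0 and qz == 0.0 and qw == 0.0

# Normal tracking (identity orientation): not lost
print(is_odometry_lost(0.0, 0.0, 0.0, 1.0))  # False
# Null pose published when odometry is lost
print(is_odometry_lost(0.0, 0.0, 0.0, 0.0))  # True
```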

Not sure which launch file you are using or how you start rtabmap, but rtabmap will use whatever odometry topic it is linked to. If you are using rtabmap.launch, set visual_odometry to false, then set odom_topic to your wheel odometry topic: https://github.com/introlab/rtabmap_ros/blob/f7beb23061654fba970fc8770e37425d43967491/launch/rtabmap.launch#L96-L98
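A sketch of that invocation (the wheel odometry topic name here is a placeholder; substitute whatever your base publishes):

```
roslaunch rtabmap_ros rtabmap.launch \
    visual_odometry:=false \
    odom_topic:=/wheel_odom \
    frame_id:=base_link
```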

If you are using the nodes directly, see this example: http://wiki.ros.org/rtabmap_ros/Tutorials/SetupOnYourRobot#Kinect_.2B-_Odometry, replacing the Kinect with realsense2_camera.

cheers, Mathieu

matlabbe avatar Jun 04 '21 14:06 matlabbe

Hi Mathieu,

Thank you for the input. For me, the map is not being re-created and behavior is much better.

If I have more questions, will post them, but for now, I am good!

Regards, Nedim

nedo99 avatar Jun 07 '21 07:06 nedo99

Hi @matlabbe, the IMU provides orientation data for the camera; can IMU data improve the robustness of odometry? I think the IMU cannot publish the whole pose (translation + orientation), so the odometry may still get lost during rapid movement. Is that correct?

l-j-h-666 avatar May 28 '25 13:05 l-j-h-666

Yes, though in the current implementation we require a minimum number of tracked features between frames to accept the update. What you are looking for is a tightly coupled visual inertial odometry approach. In that case, even if there are no features, it will estimate the orientation from the IMU alone. Note that with the accelerometer it is possible to estimate some translation; see for example "IMU dead reckoning" approaches (e.g., https://github.com/mbrossar/ai-imu-dr).
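To see why translation from the accelerometer alone drifts so badly, here is a toy 1-D dead-reckoning sketch (mine, purely illustrative): position comes from integrating acceleration twice, so even a tiny constant sensor bias turns into error that grows quadratically with time.

```python
def dead_reckon_1d(accels, dt):
    """Naive 1-D IMU dead reckoning: Euler-integrate acceleration twice
    to get position. Returns the position estimate after each sample."""
    v, p = 0.0, 0.0
    positions = []
    for a in accels:
        v += a * dt      # velocity from acceleration
        p += v * dt      # position from velocity
        positions.append(p)
    return positions

# A stationary IMU with a small constant accelerometer bias of 0.05 m/s^2,
# sampled at 100 Hz for 2 seconds:
dt = 0.01
pos = dead_reckon_1d([0.05] * 200, dt)
# True position stays 0, but the estimate grows as roughly 0.5*bias*t^2,
# i.e. about 0.1 m of drift after only 2 seconds.
print(pos[-1])
```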

I know that ARCore and ARKit can estimate translation/orientation when visual features cannot be tracked... for at most 1-2 seconds... but the caveat is that the translation estimate is very poor and drifts a lot more, which causes some weird errors in the final map.

matlabbe avatar May 28 '25 17:05 matlabbe