
Extending RTAB-Map with additional sensor data

Open kurtgeebelen opened this issue 4 years ago • 2 comments

Hi Mathieu,

Not really an issue, but I didn't know where else to post it. We want to extend the graph-SLAM algorithm in RTAB-Map with additional sensor data, such as Ultra-Wideband (UWB) measurements, which give the distance from the robot to a fixed anchor point in the environment. The position of the anchor may or may not be known; if it is not known, its 3D position would have to be estimated as well.

We would do the same for other types of measurements, such as ArUco markers (which I've seen you already support to some extent, but which could e.g. be extended with prior knowledge if the 3D positions of the markers have been calibrated) or WiFi signal strength localization (which would give a range-type measurement to the wireless access point's location, similar to UWB but less accurate).
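To make the "calibrated marker" case concrete, below is a rough sketch of what I imagine at the factor-graph level, written against GTSAM 4.x (which, as far as I understand, RTAB-Map can use as its graph optimizer): the detection gives a relative-pose constraint to a marker node, and the calibrated marker pose is added as a prior on that node. Keys, poses and noise values are made up for illustration; this is not meant to be RTAB-Map's actual implementation.

```cpp
// Rough sketch (GTSAM 4.x): a marker detection as a relative-pose constraint,
// plus the calibrated world pose of that marker as a (soft) prior on the
// landmark node. All keys, measurements and noise values are illustrative.
#include <gtsam/geometry/Pose3.h>
#include <gtsam/inference/Symbol.h>
#include <gtsam/nonlinear/NonlinearFactorGraph.h>
#include <gtsam/nonlinear/Values.h>
#include <gtsam/nonlinear/LevenbergMarquardtOptimizer.h>
#include <gtsam/slam/BetweenFactor.h>

using gtsam::symbol_shorthand::X;  // robot/camera poses
using gtsam::symbol_shorthand::L;  // marker (landmark) poses

int main() {
  gtsam::NonlinearFactorGraph graph;

  // Prior on the first robot pose (anchors the toy graph).
  auto poseNoise = gtsam::noiseModel::Diagonal::Sigmas(
      (gtsam::Vector(6) << 0.01, 0.01, 0.01, 0.01, 0.01, 0.01).finished());
  graph.addPrior(X(0), gtsam::Pose3(), poseNoise);

  // Relative pose of marker 7 as observed from pose X(0) (from the detector).
  auto detNoise = gtsam::noiseModel::Diagonal::Sigmas(
      (gtsam::Vector(6) << 0.05, 0.05, 0.05, 0.02, 0.02, 0.02).finished());
  gtsam::Pose3 camToMarker(gtsam::Rot3(), gtsam::Point3(1.0, 0.2, 0.5));
  graph.emplace_shared<gtsam::BetweenFactor<gtsam::Pose3>>(
      X(0), L(7), camToMarker, detNoise);

  // Calibrated world pose of marker 7, added as a prior on the landmark node.
  gtsam::Pose3 calibratedMarkerPose(gtsam::Rot3(), gtsam::Point3(1.05, 0.15, 0.5));
  graph.addPrior(L(7), calibratedMarkerPose, detNoise);

  gtsam::Values initial;
  initial.insert(X(0), gtsam::Pose3());
  initial.insert(L(7), calibratedMarkerPose);

  gtsam::LevenbergMarquardtOptimizer(graph, initial).optimize().print("result: ");
  return 0;
}
```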

Our hope is to improve the robustness of the mapping & localization. Please let me know if you think something like this could be done (from reading your papers, I assume it can) and, if possible, give me a hint on where to start.

Cheers, Kurt

kurtgeebelen avatar Aug 23 '21 13:08 kurtgeebelen

Hi Kurt,

See my answer in this thread for the case where we know the positions of the tags or anchors: http://official-rtab-map-forum.206.s1.nabble.com/How-to-improve-mapping-accuracy-based-on-ArUco-identification-code-td8393.html#a8497

Anchors could be treated like markers. You may use the tag_detections input topic of the rtabmap node to feed their detections. We assume that each anchor is uniquely identifiable. I am not familiar with Ultra-Wideband measurements, but is it only a 3D position that can be computed (i.e., without the orientation of the anchor)? See this post about that: http://official-rtab-map-forum.206.s1.nabble.com/Using-RTAB-Map-for-graph-based-mapping-tp8431p8514.html
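For example, if your UWB setup can compute a 3D position of the anchor relative to the robot (not only a range), a small node could republish that estimate as a "detection" on tag_detections, so the anchor is handled like a uniquely-identified marker. A rough sketch, assuming rtabmap_ros is built with apriltag_ros support; the topic remapping, frame and covariance values are only illustrative:

```cpp
// Rough sketch: publish a UWB anchor position estimate as if it were a tag
// detection, so rtabmap can treat the anchor like a uniquely-identified marker.
// Frame, id, position and covariance values are illustrative only.
#include <ros/ros.h>
#include <apriltag_ros/AprilTagDetectionArray.h>

int main(int argc, char** argv) {
  ros::init(argc, argv, "uwb_anchor_as_tag");
  ros::NodeHandle nh;
  ros::Publisher pub =
      nh.advertise<apriltag_ros::AprilTagDetectionArray>("tag_detections", 1);

  ros::Rate rate(10);
  while (ros::ok()) {
    apriltag_ros::AprilTagDetectionArray msg;
    msg.header.stamp = ros::Time::now();
    msg.header.frame_id = "camera_link";  // frame the anchor position is expressed in

    apriltag_ros::AprilTagDetection det;
    det.id.push_back(42);     // unique anchor id
    det.size.push_back(0.0);  // not meaningful for an anchor
    det.pose.header = msg.header;
    det.pose.pose.pose.position.x = 1.0;     // anchor position estimate
    det.pose.pose.pose.position.y = 0.0;
    det.pose.pose.pose.position.z = 2.0;
    det.pose.pose.pose.orientation.w = 1.0;  // identity: orientation not observed
    // Small variance on position, very large on rotation (orientation unknown).
    for (int i = 0; i < 6; ++i)
      det.pose.pose.covariance[i * 6 + i] = (i < 3) ? 0.05 : 9999.0;

    msg.detections.push_back(det);
    pub.publish(msg);
    rate.sleep();
  }
  return 0;
}
```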

cheers, Mathieu

matlabbe avatar Sep 03 '21 19:09 matlabbe

Hi Mathieu,

Thanks for the answer. The first link is indeed what we would like to do for the ArUco markers. One remark: it seems like the 'measurement' given to RTAB-Map is always the relative pose between the camera and the ArUco marker. My preferred solution would be to use the pixel positions of the marker's corners as the measurement and let the graph solver deal with the camera model. Would implementing a custom measurement function like that be possible?
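For reference, the kind of factor I have in mind exists in GTSAM as a projection factor: the solver re-projects the estimated 3D corner point into each camera using the calibration and compares it with the measured pixel. A rough sketch follows (the calibration, pixel values and keys are made up; whether such a factor could be plugged into RTAB-Map's pipeline is exactly my question):

```cpp
// Rough sketch (GTSAM 4.x): one ArUco corner observed as raw pixels from two
// camera poses; the camera model stays inside the solver via projection
// factors. All numeric values are illustrative only.
#include <gtsam/geometry/Cal3_S2.h>
#include <gtsam/geometry/Point2.h>
#include <gtsam/geometry/Point3.h>
#include <gtsam/geometry/Pose3.h>
#include <gtsam/inference/Symbol.h>
#include <gtsam/nonlinear/NonlinearFactorGraph.h>
#include <gtsam/nonlinear/Values.h>
#include <gtsam/nonlinear/LevenbergMarquardtOptimizer.h>
#include <gtsam/slam/ProjectionFactor.h>

using gtsam::symbol_shorthand::X;  // camera poses
using gtsam::symbol_shorthand::L;  // 3D corner points
using ProjFactor =
    gtsam::GenericProjectionFactor<gtsam::Pose3, gtsam::Point3, gtsam::Cal3_S2>;

int main() {
  gtsam::NonlinearFactorGraph graph;

  // Pinhole calibration fx, fy, skew, cx, cy (illustrative values).
  gtsam::Cal3_S2::shared_ptr K(new gtsam::Cal3_S2(525.0, 525.0, 0.0, 320.0, 240.0));
  auto pixNoise  = gtsam::noiseModel::Isotropic::Sigma(2, 1.0);   // ~1 px
  auto poseNoise = gtsam::noiseModel::Isotropic::Sigma(6, 0.01);  // pose priors

  // Two camera poses (e.g. from odometry), each observing the same corner.
  gtsam::Pose3 pose0;                                               // identity
  gtsam::Pose3 pose1(gtsam::Rot3(), gtsam::Point3(0.5, 0.0, 0.0));  // 0.5 m sideways
  graph.addPrior(X(0), pose0, poseNoise);
  graph.addPrior(X(1), pose1, poseNoise);

  // Pixel measurements of the same corner in the two images.
  graph.emplace_shared<ProjFactor>(gtsam::Point2(346.0, 214.0), pixNoise, X(0), L(0), K);
  graph.emplace_shared<ProjFactor>(gtsam::Point2(215.0, 214.0), pixNoise, X(1), L(0), K);

  gtsam::Values initial;
  initial.insert(X(0), pose0);
  initial.insert(X(1), pose1);
  initial.insert(L(0), gtsam::Point3(0.1, -0.1, 2.0));  // rough triangulation guess

  gtsam::LevenbergMarquardtOptimizer(graph, initial).optimize().print("result: ");
  return 0;
}
```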

For UWB, we basically get a distance measurement between a beacon in the environment and the vehicle we're tracking. The position of the beacon may or may not be known; if not, we would have to estimate it as well. With multiple beacons in the environment you could trilaterate your own position, but the point of what we want to do is to place only a few beacons in the environment to 'robustify' a (visual) SLAM approach.
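At the graph level I picture this as a range factor between each robot pose and a beacon node whose 3D position is itself a variable in the optimization (or gets a prior if it has been surveyed). A rough sketch with GTSAM 4.x, with made-up odometry and range values; how to feed such a constraint into RTAB-Map itself is exactly what I'm asking about:

```cpp
// Rough sketch (GTSAM 4.x): UWB ranges added as range factors between robot
// poses and a beacon whose 3D position is an optimized variable. With several
// ranges from different poses, beacon and trajectory are refined together.
// All numeric values are illustrative only.
#include <gtsam/geometry/Point3.h>
#include <gtsam/geometry/Pose3.h>
#include <gtsam/inference/Symbol.h>
#include <gtsam/nonlinear/NonlinearFactorGraph.h>
#include <gtsam/nonlinear/Values.h>
#include <gtsam/nonlinear/LevenbergMarquardtOptimizer.h>
#include <gtsam/sam/RangeFactor.h>
#include <gtsam/slam/BetweenFactor.h>

using gtsam::symbol_shorthand::X;  // robot poses
using gtsam::symbol_shorthand::B;  // beacon positions
using UwbRange = gtsam::RangeFactor<gtsam::Pose3, gtsam::Point3>;

int main() {
  gtsam::NonlinearFactorGraph graph;

  auto poseNoise  = gtsam::noiseModel::Isotropic::Sigma(6, 0.05);
  auto rangeNoise = gtsam::noiseModel::Isotropic::Sigma(1, 0.10);  // ~10 cm UWB

  // A short trajectory from (visual) odometry, chained by between factors.
  graph.addPrior(X(0), gtsam::Pose3(), poseNoise);
  graph.emplace_shared<gtsam::BetweenFactor<gtsam::Pose3>>(
      X(0), X(1), gtsam::Pose3(gtsam::Rot3(), gtsam::Point3(1.0, 0.0, 0.0)), poseNoise);
  graph.emplace_shared<gtsam::BetweenFactor<gtsam::Pose3>>(
      X(1), X(2), gtsam::Pose3(gtsam::Rot3(), gtsam::Point3(0.0, 1.0, 0.0)), poseNoise);

  // UWB ranges to beacon B(0) measured from each pose. If the beacon position
  // were calibrated beforehand, one could instead add a prior on B(0).
  graph.emplace_shared<UwbRange>(X(0), B(0), 2.7, rangeNoise);
  graph.emplace_shared<UwbRange>(X(1), B(0), 2.3, rangeNoise);
  graph.emplace_shared<UwbRange>(X(2), B(0), 1.5, rangeNoise);

  gtsam::Values initial;
  initial.insert(X(0), gtsam::Pose3());
  initial.insert(X(1), gtsam::Pose3(gtsam::Rot3(), gtsam::Point3(1.0, 0.0, 0.0)));
  initial.insert(X(2), gtsam::Pose3(gtsam::Rot3(), gtsam::Point3(1.0, 1.0, 0.0)));
  initial.insert(B(0), gtsam::Point3(1.5, 2.0, 1.0));  // rough beacon guess

  gtsam::LevenbergMarquardtOptimizer(graph, initial).optimize().print("result: ");
  return 0;
}
```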

The forum was offline when I asked the original question, and I've gotten some more information from it now, e.g. from this thread: http://official-rtab-map-forum.206.s1.nabble.com/Detected-Object-Location-Mark-on-RTABMAP-td4571.html#a4594 There you mention that userData can be added to RTAB-Map, but I guess this is only for mapping and not for helping the localization? (In that example, the WiFi signal strength could also be seen as an (indirect) measurement of the position.)

Thanks, Cheers, Kurt

kurtgeebelen avatar Sep 27 '21 12:09 kurtgeebelen