realsense-ros
T265 with wheel odometry guide and questions about grey areas
Hi,
Probably other people have asked this question before, so apologies for repeating it. I think it would be good to have a guide, some documentation, or even sample code on how to use the T265 with wheel odometry. I have seen some useful information on GitHub and have also looked inside the code, but it is a bit odd that there is so little info even though using the T265 with wheel odometry is recommended. (On a side note: I tried without it, and I can see why it is recommended and why only linear velocities are needed.)
What I have figured out is that you have to create calibration_odometry.json with the required transform; however, I have not worked out how to calculate noise_variance, so any help on that is appreciated.
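For anyone else looking for the file format: the sample calibration_odometry.json in the librealsense unit-tests resources has roughly the shape below. The numbers here are placeholders, not a working calibration; my understanding (not confirmed by Intel in this thread) is that noise_variance is the variance of the wheel-speed measurement noise in (m/s)^2, T/W are the translation and rotation of the odometry frame relative to the camera, and the *_variance fields express how uncertain that extrinsic calibration is:

```json
{
    "velocimeters": [
        {
            "scale_and_alignment": [1.0, 0.0, 0.0,
                                    0.0, 1.0, 0.0,
                                    0.0, 0.0, 1.0],
            "noise_variance": 0.004,
            "extrinsics": {
                "T": [0.0, 0.0, 0.0],
                "T_variance": [1e-6, 1e-6, 1e-6],
                "W": [0.0, 0.0, 0.0],
                "W_variance": [1e-6, 1e-6, 1e-6]
            }
        }
    ]
}
```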
You also need to publish a topic (/odom_in) of type nav_msgs/Odometry, of which only the linear velocities are used. Do you have to enable this (via launch files or a parameter), or does the T265 automatically use the topic if it exists? If my robot only moves linearly in x (and angularly in z), do I keep the other values (linear y and z) at 0, or some sort of null? Also, what about the angular part of the odometry message? As far as I could find, it is not used. Is the covariance used by the T265? I also assume the pose part of the message is not required.
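For a differential-drive base that only moves along x, filling those fields from encoder-derived wheel speeds could be sketched like this (a minimal, hypothetical example; the kinematics function and parameter names are my own, not part of realsense-ros):

```python
def body_velocities(v_left, v_right, wheel_base):
    """Differential-drive kinematics: wheel rim speeds (m/s) -> body twist.

    Returns (vx, wz): forward linear velocity (m/s) and yaw rate (rad/s).
    """
    vx = (v_right + v_left) / 2.0         # forward speed along body x
    wz = (v_right - v_left) / wheel_base  # yaw rate about body z
    return vx, wz

# These values would map onto the nav_msgs/Odometry message on /odom_in:
#   twist.twist.linear.x  = vx
#   twist.twist.linear.y  = 0.0   # non-holonomic base: no lateral motion
#   twist.twist.linear.z  = 0.0
#   twist.twist.angular.z = wz    # can be published, but per this thread
#                                 # only the linear part is fused
```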
Also, are there any requirements on encoder resolution/accuracy or publishing frequency?
Are there any quirks with using wheel odometry with the T265 in ROS rather than through the SDK?
If there are additional steps needed to make it work that I missed, please let me know.
As always thanks for all the help in advance
M
I thought I'd make an update.
I created calibration_odometry.json with all translation and rotation components set to zero, and a node to publish the /odom_in topic. I added these two parameters to the default rs_t265.launch file:
<arg name="topic_odom_in" value="$(arg topic_odom_in)"/>
<arg name="calib_odom_file" value="$(arg calib_odom_file)"/>
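For reference, both arguments can also be passed on the command line instead of editing the launch file (the topic name and file path below are placeholders for your own setup):

```
roslaunch realsense2_camera rs_t265.launch topic_odom_in:=/odom_in calib_odom_file:=/path/to/calibration_odometry.json
```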
I checked that both are successfully passed to the t265 node, using this function:
void T265RealsenseNode::odom_in_callback(const nav_msgs::Odometry::ConstPtr& msg)
However, when testing it with a fake constant velocity (first test: all vector components zero; second test: one component set to a constant, the rest zero) and with the T265 camera covered so it relies only on the IMU, I did not achieve the expected results, either with or without the additional odometry topic. I was expecting the camera to be static in the first test (it was not) and to drift slightly along the set axis in the second test.
Would anyone like to comment on that?
We understand the need for documentation, and we have some documentation located here:
- https://github.com/IntelRealSense/librealsense/blob/master/doc/t265.md#t265-tracking-camera
- https://dev.intelrealsense.com/docs/tracking-camera-t265
- https://github.com/IntelRealSense/librealsense/pull/3462
- https://github.com/IntelRealSense/librealsense/blob/master/unit-tests/resources/calibration_odometry.json
- https://github.com/IntelRealSense/realsense-ros
- https://github.com/IntelRealSense/realsense-ros#using-t265
The T265 system works better when the cameras are not covered, allowing the SLAM algorithm to "see" features in the scene and better orient itself.
@RealSenseSupport Hi, can we pass angular velocity from wheel odometry to the T265 as well, or only linear velocity
v = rs2.vector()
v.x = linear_v
v.y = 0
v.z = 0
?
Hi @bot-lin, from my experiments you can pass both linear and angular but only linear will be used for fusion algorithm
Got it. Thanks