webots_ros2
Get from Webots a PointCloud2 measure instead of computing it for the RangeFinder
Is your feature request related to a problem? Please describe.
I see that the range_finder_device.py file creates a publisher for sensor_msgs/Image, but I need a sensor_msgs/PointCloud2 instead.
Describe the solution you'd like
I'd like to be able to get a sensor_msgs/msg/PointCloud2 message from a RangeFinder.
Describe alternatives you've considered
- Modify the `RangeFinderDevice` to convert the depth data obtained from `self._wb_device.getRangeImage()` to a point cloud using `pyrealsense2`. Something like this:
import pyrealsense2 as rs
from sensor_msgs.msg import PointCloud2
msg = PointCloud2()
msg.header.stamp = stamp
msg.header.frame_id = self._frame_id
msg.height = self._wb_device.getHeight()
msg.width = self._wb_device.getWidth()
# https://medium.com/@yasuhirachiba/converting-2d-image-coordinates-to-3d-coordinates-using-ros-intel-realsense-d435-kinect-88621e8e733a
# https://github.com/IntelRealSense/librealsense/wiki/Projection-in-RealSense-SDK-2.0
# https://github.com/IntelRealSense/librealsense/blob/5e73f7bb906a3cbec8ae43e888f182cc56c18692/include/librealsense2/rsutil.h#L46
_intrinsics = rs.intrinsics()
_intrinsics.width = self._wb_device.getWidth()
_intrinsics.height = self._wb_device.getHeight()
_intrinsics.ppx = _intrinsics.width / 2
_intrinsics.ppy = _intrinsics.height / 2
_intrinsics.fx = ???
_intrinsics.fy = ???
_intrinsics.model = None  # or something, since I don't want to use the Webots Lens device yet
# For loop (or map, or something) to deproject all pixels like this:
deprojected_px = rs.rs2_deproject_pixel_to_point(_intrinsics, [x, y], depth)
However, I fear this approach may be slow. I noticed the comment that you have in that same file which says:
# We pass `data` directly to we avoid using `data` setter.
# Otherwise ROS2 converts data to `array.array` which slows down the simulation as it copies memory internally.
# Both, `bytearray` and `array.array`, implement Python buffer protocol, so we should not see unpredictable
# behavior.
# deepcode ignore W0212: Avoid conversion from `bytearray` to `array.array`.
msg._data = image_array.tobytes()
So if you were worried about a data format conversion, I guess a full-blown calculation to deproject a whole image may slow down the simulation even more.
- Similar to the above, but instead of relying on `pyrealsense2`, implement the `rs2_deproject_pixel_to_point` function in Python.
- Instead of doing the deprojection calculations in a Python `range_finder_device.py` implementation, do it in C++; however, I don't know how that could be connected to Webots.
- @lukicdarkoo mentioned here that such PointCloud generation could be better done in the Webots core.
Additional context
Started in the Webots Discord technical-questions channel.
I have a question regarding way 1) of doing this, which I'm going to try since I need to get this working soon, even if naively.
- The deprojection function `rs2_deproject_pixel_to_point` requires the focal lengths `fx` and `fy`, "as a multiple of pixel width and height", according to this.
- Same as above, I need the `ppx` and `ppy` fields, which "describe the pixel coordinates of the principal point (center of projection)". I guess this could easily be considered the center of the image.
Having explained that, what could I use to fill in what I'm missing?
_intrinsics.ppx = self._wb_device.getWidth() / 2
_intrinsics.ppy = self._wb_device.getHeight() / 2
_intrinsics.fx = ???
_intrinsics.fy = ???
We will probably deprecate the Python interface, so the implementation should be done in: https://github.com/cyberbotics/webots_ros2/blob/master/webots_ros2_driver/src/plugins/static/Ros2RangeFinder.cpp
3. @lukicdarkoo mentioned here that such PointCloud generation could be better done in the Webots core.
If we implement it in Webots core then it will be widely available. In addition, we should accelerate it with OpenMP (as well as https://github.com/cyberbotics/webots/issues/3046)
@ManuelZ, I have just extracted a snippet from our private project that may help you:
import math
import numpy as np

# RangeFinder device
range_finder = None


def get_point_cloud():
    k_matrix = get_calibration_matrix()
    depth_image = get_depth_image()

    inv_fx = 1.0 / k_matrix[0, 0]
    inv_fy = 1.0 / k_matrix[1, 1]
    ox = k_matrix[0, 2]
    oy = k_matrix[1, 2]

    image_height, image_width = depth_image.shape
    points = np.zeros((image_width * image_height, 3), dtype=np.float32)
    counter = 0
    for y in range(image_height):
        for x in range(image_width):
            dist = depth_image[y, x]
            # This follows the coordinate system of the gripper
            points[counter, 1] = -np.float32((x - ox) * dist * inv_fx)
            points[counter, 2] = -np.float32((y - oy) * dist * inv_fy)
            points[counter, 0] = np.float32(dist)
            counter += 1
    return points[:counter].astype(np.float32)


def get_calibration_matrix():
    image_width = range_finder.getWidth()
    image_height = range_finder.getHeight()
    focal_length = 0.5 * image_width * (1 / math.tan(0.5 * range_finder.getFov()))
    k_matrix = np.array([
        [focal_length, 0, image_width / 2],
        [0, focal_length, image_height / 2],
        [0, 0, 1]  # bottom row of a pinhole calibration matrix
    ])
    return k_matrix


def get_depth_image():
    depth_image = np.asarray(range_finder.getRangeImage(), dtype=np.float32)
    return depth_image.reshape((-1, range_finder.getWidth()))
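In case it helps, here is a rough sketch (untested, not from the project) of how the resulting (N, 3) array could be packed into a `sensor_msgs/PointCloud2` with rclpy, reusing the `msg._data` trick quoted above; `points_to_pointcloud2` is just an illustrative helper name:

import numpy as np
from sensor_msgs.msg import PointCloud2, PointField


def points_to_pointcloud2(points, stamp, frame_id):
    # points: (N, 3) float32 array of XYZ coordinates
    msg = PointCloud2()
    msg.header.stamp = stamp
    msg.header.frame_id = frame_id
    msg.height = 1
    msg.width = points.shape[0]
    msg.fields = [
        PointField(name='x', offset=0, datatype=PointField.FLOAT32, count=1),
        PointField(name='y', offset=4, datatype=PointField.FLOAT32, count=1),
        PointField(name='z', offset=8, datatype=PointField.FLOAT32, count=1),
    ]
    msg.is_bigendian = False
    msg.point_step = 12  # 3 floats * 4 bytes
    msg.row_step = msg.point_step * msg.width
    msg.is_dense = False
    # Bypass the `data` setter to avoid the copy into `array.array`
    msg._data = points.astype(np.float32).tobytes()
    return msg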
Thank you very much @lukicdarkoo! Greatly appreciated.
It worked and my simulation slowed down from > 0.95x to around 0.20x when I subscribe to the topic. I see an obvious increase in the processor usage of the Python and webots-bin processes. I guess that the Python conversion to get a PointCloud2 is slowing everything down.
I see that in the file you mention (https://github.com/cyberbotics/webots_ros2/blob/master/webots_ros2_driver/src/plugins/static/Ros2RangeFinder.cpp) I would be able to achieve the same.
However, I see that that commit is very recent. Does it depend on modifications to Webots itself? Or could I try compiling it? If so, how could I start?
It worked and my simulation slowed down from > 0.95x to around 0.20x when I subscribe to the topic. I see an obvious increase in the processor usage of the Python and webots-bin processes. I guess that the Python conversion to get a PointCloud2 is slowing everything down.
There are two loops in the `get_point_cloud` function. You can try to get rid of them and let numpy deal with the calculations.
However, I see that that commit is very recent. Does it depend on modifications to Webots itself? Or could I try compiling it? If so, how could I start?
For the compilation you don't need Webots at all, but when running use R2021b to be on the safe side (I used R2021b for development).
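In case it helps getting started, from a ROS 2 workspace something like `colcon build --packages-select webots_ros2_driver` (after sourcing your ROS 2 distribution) should be enough to rebuild the driver with your changes; this is just the standard colcon workflow, so adjust the package name to your setup.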
Thank you for the suggestion of vectorizing the operation with numpy. It got me to somewhere around 0.42x. Surprisingly enough, when I leave it running for one minute and come back to it, I then see it around 0.60x.
I'll try modifying the `Ros2RangeFinder.cpp` file now.
For anyone interested (beware of the coordinate system), this replaces the for loop:
# np.indices returns (row_indices, col_indices): pair columns with ox/fx and rows with oy/fy
v, u = np.indices((image_height, image_width), dtype=np.int32)  # v: rows, u: columns
x = np.float32((u - ox) * depth_image * inv_fx)
y = np.float32((v - oy) * depth_image * inv_fy)
points = np.zeros((image_width * image_height, 3), dtype=np.float32)
points[:, 0] = x.reshape(-1)
points[:, 1] = y.reshape(-1)
points[:, 2] = depth_image.reshape(-1)
return points
Sorry to comment here. I'm having trouble writing my C++ version of a range-finder plugin to broadcast a point cloud. Could you have a look at it?
depth_camera.hpp
#ifndef DIFFDRIVE_WEBOTS_DEPTH_CAMERA_HPP
#define DIFFDRIVE_WEBOTS_DEPTH_CAMERA_HPP

#include <webots/Supervisor.hpp>
#include <webots/RangeFinder.hpp>
#include <sensor_msgs/msg/point_cloud2.hpp>
#include <webots_ros2_driver/PluginInterface.hpp>
#include <webots_ros2_driver/WebotsNode.hpp>

namespace diffdrive_webots_plugin
{
  class DepthCamera : public webots_ros2_driver::PluginInterface
  {
  public:
    void step() override;
    void init(webots_ros2_driver::WebotsNode *node, std::unordered_map<std::string, std::string> &parameters) override;

  private:
    webots_ros2_driver::WebotsNode *node_;
    rclcpp::Publisher<sensor_msgs::msg::PointCloud2>::SharedPtr point_cloud_pub_;
    sensor_msgs::msg::PointCloud2 point_cloud_;
    webots::Supervisor *robot_;
    webots::RangeFinder *depth_camera_;
    int depth_camera_period_;
    float cx_;
    float cy_;
    float fx_;
    float fy_;
  };
}

#endif
depth_camera.cpp
#include "diffdrive_webots/depth_camera.hpp"
using std::placeholders::_1;
namespace diffdrive_webots_plugin
{
void DepthCamera::init(webots_ros2_driver::WebotsNode *node, std::unordered_map<std::string, std::string> ¶meters)
{
// initialize member variables
node_ = node;
robot_ = node->robot();
depth_camera_ = NULL;
cx_ = 0;
cy_ = 0;
fx_ = 0;
fy_ = 0;
// retrieve tags
if (parameters.count("depthCameraPeriodMs"))
depth_camera_period_ = std::stoi(parameters["depthCameraPeriodMs"]);
else
throw std::runtime_error("Must set depthCameraPeriodMs tag");
std::string depth_camera_name;
if (parameters.count("depthCameraName"))
depth_camera_name = parameters["depthCameraName"];
else
throw std::runtime_error("Must set depthCameraName tag");
std::string frame_id;
if (parameters.count("frameID"))
frame_id = parameters["frameID"];
else
throw std::runtime_error("Must set frameID tag");
// set webots device
depth_camera_ = robot_->getRangeFinder(depth_camera_name);
if (depth_camera_ == NULL)
throw std::runtime_error("Cannot find DepthCamera with name " + depth_camera_name);
int timestep = (int)robot_->getBasicTimeStep();
if (depth_camera_period_ % timestep != 0)
throw std::runtime_error("depthCameraPeriodMs must be integer multiple of basicTimeStep");
depth_camera_->enable(depth_camera_period_);
int width = depth_camera_->getWidth();
int height = depth_camera_->getHeight();
float hfov = depth_camera_->getFov();
cx_ = width / 2.0;
cy_ = height / 2.0;
fx_ = 0.5 * width * (1 / tan(0.5 * hfov));
fy_ = fx_;
point_cloud_.header.frame_id = frame_id;
point_cloud_.width = width;
point_cloud_.height = height;
point_cloud_.fields.resize(3);
point_cloud_.fields[0].name = "x";
point_cloud_.fields[0].datatype = sensor_msgs::msg::PointField::FLOAT32;
point_cloud_.fields[0].count = 1;
point_cloud_.fields[0].offset = 0;
point_cloud_.fields[1].name = "y";
point_cloud_.fields[1].datatype = sensor_msgs::msg::PointField::FLOAT32;
point_cloud_.fields[1].count = 1;
point_cloud_.fields[1].offset = 4;
point_cloud_.fields[2].name = "z";
point_cloud_.fields[2].datatype = sensor_msgs::msg::PointField::FLOAT32;
point_cloud_.fields[2].count = 1;
point_cloud_.fields[2].offset = 8;
point_cloud_.is_bigendian = false;
point_cloud_.point_step = 3 * sizeof(float);
point_cloud_.row_step = width * 3 * sizeof(float);
point_cloud_.data.resize(width * height * 3 * sizeof(float));
point_cloud_.is_dense = false;
point_cloud_pub_ = node->create_publisher<sensor_msgs::msg::PointCloud2>("point_cloud", rclcpp::SensorDataQoS().reliable());
}
void DepthCamera::step()
{
int64_t sim_time = (int64_t)(robot_->getTime() * 1e3);
if (sim_time % depth_camera_period_ == 0)
{
point_cloud_.header.stamp = node_->get_clock()->now();
const float* range_image = depth_camera_->getRangeImage();
int width = depth_camera_->getWidth();
int height = depth_camera_->getHeight();
for (int j = 0; j < height; j++)
{
for (int i = 0; i < width; i++)
{
int idx = j * height + i;
float x = *(range_image + idx);
float y = -(i - cx_) * x / fx_;
float z = -(j - cy_) * x / fy_;
memcpy(point_cloud_.data.data() + idx * sizeof(float) * 3 , &x, sizeof(float));
memcpy(point_cloud_.data.data() + idx * sizeof(float) * 3 + sizeof(float) , &y, sizeof(float));
memcpy(point_cloud_.data.data() + idx * sizeof(float) * 3 + sizeof(float) * 2, &z, sizeof(float));
}
}
point_cloud_pub_->publish(point_cloud_);
}
}
}
// The class has to be exported with `PLUGINLIB_EXPORT_CLASS` macro.
// The first argument is the name of your class, while the second is always `webots_ros2_driver::PluginInterface`
#include "pluginlib/class_list_macros.hpp"
PLUGINLIB_EXPORT_CLASS(diffdrive_webots_plugin::DepthCamera, webots_ros2_driver::PluginInterface)
This code compiles, but the point cloud is not rendered correctly in rviz2. Also, I find that if I run rviz2 and enable the PointCloud2 visualization, the simulation speed drops a lot. Why does subscribing to the point cloud topic cause Webots to slow down? I am using Webots R2022a, webots_ros2 1.2.2 and ROS Foxy.
I did a stress test a while ago with the webots_ros2 package. In the simulation, there were 10 cameras with a resolution of 640x480 publishing at 30Hz, one VLP16 at 30Hz, and two HDL32 at 10Hz. Everything was turned on all the time (using the alwaysOn property) and the RTF was around 1.0 on my laptop.
Given that the range finder should have performance similar to a camera's, the bottleneck may be in your plugin.
Thank you for your test. I agree that the bottleneck is my plugin, but what confuses me is that when rviz didn't subscribe to the point cloud topic, the simulation was around 1.0x; once subscribed, the simulation sometimes dropped to 0.3x.
ManuelZ reported the same thing:
It worked and my simulation slowed down from > 0.95x to around 0.20x when I subscribe to the topic. I see an obvious increase in the processor usage of the Python and webots-bin processes. I guess that the Python conversion to get a PointCloud2 is slowing everything down.
Why does the simulation speed go down when subscribed? Shouldn't it be slowed down all the time?
I wrote a simple point cloud subscriber and used it to test the simulation speed. It seems that the simulation speed didn't go down, so maybe the drop is caused by rviz? My laptop only has an iGPU; maybe rviz uses too many resources.
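For reference, the test subscriber was just something along these lines (a minimal rclpy sketch; node name, topic name and QoS are assumptions that should match the plugin above):

import rclpy
from rclpy.node import Node
from rclpy.qos import qos_profile_sensor_data
from sensor_msgs.msg import PointCloud2


class PointCloudTester(Node):
    # Bare-bones subscriber, only used to measure the impact of subscribing
    def __init__(self):
        super().__init__('point_cloud_tester')
        self.create_subscription(PointCloud2, 'point_cloud', self.callback, qos_profile_sensor_data)

    def callback(self, msg):
        self.get_logger().info('received cloud: %dx%d' % (msg.width, msg.height))


def main():
    rclpy.init()
    rclpy.spin(PointCloudTester())


if __name__ == '__main__':
    main()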
I found out that I made a mistake iterating over the range image.
In the function `void DepthCamera::step()`, `int idx = j * height + i;` should be `int idx = j * width + i;`.
That is one silly mistake. I will put my code below in case someone needs to implement this with C++.
#include "diffdrive_webots/depth_camera.hpp"
#include <sensor_msgs/image_encodings.hpp>
using std::placeholders::_1;
namespace diffdrive_webots_plugin
{
void DepthCamera::init(webots_ros2_driver::WebotsNode *node, std::unordered_map<std::string, std::string> ¶meters)
{
// initialize member variables
node_ = node;
robot_ = node->robot();
depth_camera_ = NULL;
cx_ = 0;
cy_ = 0;
fx_ = 0;
fy_ = 0;
// retrieve tags
if (parameters.count("depthCameraPeriodMs"))
depth_camera_period_ = std::stoi(parameters["depthCameraPeriodMs"]);
else
throw std::runtime_error("Must set depthCameraPeriodMs tag");
std::string depth_camera_name;
if (parameters.count("depthCameraName"))
depth_camera_name = parameters["depthCameraName"];
else
throw std::runtime_error("Must set depthCameraName tag");
std::string frame_id;
if (parameters.count("frameID"))
frame_id = parameters["frameID"];
else
throw std::runtime_error("Must set frameID tag");
// set webots device
depth_camera_ = robot_->getRangeFinder(depth_camera_name);
if (depth_camera_ == NULL)
throw std::runtime_error("Cannot find DepthCamera with name " + depth_camera_name);
int timestep = (int)robot_->getBasicTimeStep();
if (depth_camera_period_ % timestep != 0)
throw std::runtime_error("depthCameraPeriodMs must be integer multiple of basicTimeStep");
depth_camera_->enable(depth_camera_period_);
int width = depth_camera_->getWidth();
int height = depth_camera_->getHeight();
float hfov = depth_camera_->getFov();
cx_ = width / 2.0;
cy_ = height / 2.0;
fx_ = 0.5 * width * (1 / tan(0.5 * hfov));
fy_ = fx_;
point_cloud_.header.frame_id = frame_id;
point_cloud_.width = width;
point_cloud_.height = height;
point_cloud_.fields.resize(3);
point_cloud_.fields[0].name = "x";
point_cloud_.fields[0].datatype = sensor_msgs::msg::PointField::FLOAT32;
point_cloud_.fields[0].count = 1;
point_cloud_.fields[0].offset = 0;
point_cloud_.fields[1].name = "y";
point_cloud_.fields[1].datatype = sensor_msgs::msg::PointField::FLOAT32;
point_cloud_.fields[1].count = 1;
point_cloud_.fields[1].offset = 4;
point_cloud_.fields[2].name = "z";
point_cloud_.fields[2].datatype = sensor_msgs::msg::PointField::FLOAT32;
point_cloud_.fields[2].count = 1;
point_cloud_.fields[2].offset = 8;
point_cloud_.is_bigendian = false;
point_cloud_.point_step = 3 * sizeof(float);
point_cloud_.row_step = width * 3 * sizeof(float);
point_cloud_.data.resize(width * height * 3 * sizeof(float));
point_cloud_.is_dense = false;
point_cloud_pub_ = node->create_publisher<sensor_msgs::msg::PointCloud2>("point_cloud", rclcpp::SensorDataQoS().reliable());
}
void DepthCamera::step()
{
int64_t sim_time = (int64_t)(robot_->getTime() * 1e3);
if (sim_time % depth_camera_period_ == 0)
{
point_cloud_.header.stamp = node_->get_clock()->now();
int width = depth_camera_->getWidth();
int height = depth_camera_->getHeight();
const float* range_image = depth_camera_->getRangeImage();
float* data = (float*)point_cloud_.data.data();
for (int j = 0; j < height; j++)
{
for (int i = 0; i < width; i++)
{
int idx = j * width + i;
float x = range_image[idx];
float y = -(i - cx_) * x / fx_;
float z = -(j - cy_) * x / fy_;
memcpy(data + idx * 3 , &x, sizeof(float));
memcpy(data + idx * 3 + 1, &y, sizeof(float));
memcpy(data + idx * 3 + 2, &z, sizeof(float));
}
}
point_cloud_pub_->publish(point_cloud_);
}
}
}
// The class has to be exported with `PLUGINLIB_EXPORT_CLASS` macro.
// The first argument is the name of your class, while the second is always `webots_ros2_driver::PluginInterface`
#include "pluginlib/class_list_macros.hpp"
PLUGINLIB_EXPORT_CLASS(diffdrive_webots_plugin::DepthCamera, webots_ros2_driver::PluginInterface)
I wrote a simple point cloud subscriber and used it to test the simulation speed. It seems that the simulation speed didn't go down, so maybe the drop is caused by rviz? My laptop only has an iGPU; maybe rviz uses too many resources.
What you can try is to subscribe to your topic from a terminal instead of from RViz.
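For example, running `ros2 topic hz /point_cloud` or `ros2 topic echo /point_cloud --no-arr` gives you a plain subscriber without any rendering cost (assuming the `point_cloud` topic name from your plugin).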
@TaoYibo1866 Maybe it would be useful to integrate the code into the RangeFinder static plugin: https://github.com/cyberbotics/webots_ros2/blob/master/webots_ros2_driver/src/plugins/static/Ros2RangeFinder.cpp
@TaoYibo1866 Maybe it would be useful to integrate the code into the RangeFinder static plugin: https://github.com/cyberbotics/webots_ros2/blob/master/webots_ros2_driver/src/plugins/static/Ros2RangeFinder.cpp
Thank you, I will try to modify it. However, since you mentioned that there will be a `getPointCloud()` for the RangeFinder, I guess it is better to integrate this at that point.
After #389 I renamed this issue, as the RangeFinder now publishes PointCloud2 messages, but we still need a function in Webots to return this data instead of computing it ourselves.