ROS2 Kilted Release, Depthai V3 and OAK4 Support
Hi, we wanted to share some information on the ROS 2 Kilted release, which will be the first release to use the new version (V3) of the DepthAI library, itself to be made public in the near future. Since this new release introduces breaking changes to the DepthAI API and adds many new features (integration of open-source VIO/VSLAM libraries, host DepthAI nodes, a new API for handling NNs, and more), it will only be released for Kilted onwards, as we will also be making some long-overdue changes to the DepthAI ROS API. These changes will mostly touch on things such as IMU orientation, a new parameter API for sensor nodes (the ColorCamera and MonoCamera nodes will be deprecated in favor of one common Camera node), changes to pipelines, TF publishing, and more. Exact details will be posted in the upcoming weeks; in the meantime we welcome suggestions 🙂 The release date for this new version is planned for July.
https://github.com/luxonis/depthai-ros/pull/726
The first Kilted version is underway; here are some of the changes made so far:
Changes
- Updating to DepthAI V3
- Switching to Camera nodes instead of Mono/Color cams
- Updated socket/frame naming to reflect different devices and configurations. In `rs_mode` they adhere to RealSense naming, and specific devices, such as the OAK-T or OAK-D-SR-PoE, have names related to the actual sensors, i.e. `tof` instead of `rgb`
- IMU publishing is now in the RDF frame across the board
- Using TFPublisher instead of URDF description by default for more accurate results
- Undistorted streams can now be requested
- `depthai_examples` has been largely modified to remove deprecated examples and simplify code. List of current examples:
  - rgbd_spatial_detections
  - feature_tracker
  - converter_to_cv
  - imu_publisher
  - rgb_publisher
  - rgbd_publisher
  - rgb_subscriber
  - rgb_compressed_publisher
  - disparity_publisher
  - detection_publisher
  - tof_publisher
  - tof_rgbd
  - thermal_publisher
- Tests added for converters in `depthai_bridge`
- NN creation simplified both in `depthai_examples` and `depthai_ros_driver`
  - `i_nn_family` parameter corresponds to nn_kind (WARNING, this might be renamed in the future)
  - `i_nn_model` is the actual model that is passed to the NN
- RGBD Node and Pointcloud converter have been added
- Thermal node added
- Deprecated `camera.launch.py` in favor of `driver.launch.py`
- Main node of `depthai_ros_driver` has been renamed from `camera` to `driver`, which might impact parameter passing (in your current YAML file, replace `camera` with `driver`)
- `i_mx_id` has been replaced by `i_device_id`
- Parameters related to pipeline creation have been moved to the PipelineGen param handler:
  - `i_pipeline_type`
  - `i_nn_type`
  - `i_enable_rgbd`
- Pipeline plugin creation method now has the signature:

```cpp
virtual std::vector<std::unique_ptr<dai_nodes::BaseNode>> createPipeline(std::shared_ptr<rclcpp::Node> node,
                                                                         std::shared_ptr<dai::Device> device,
                                                                         std::shared_ptr<dai::Pipeline> pipeline,
                                                                         std::shared_ptr<param_handlers::PipelineGenParamHandler> ph,
                                                                         const std::string& deviceName,
                                                                         bool rsCompat,
                                                                         const std::string& nnType) = 0;
```
- Mono/ColorCamera nodes have been removed in favor of a common Camera node. This means that there is no need to specify ISP scaling or to choose which output from the camera node is needed. Currently Camera sensor nodes can have a single output; in the future, additional output options will be added.
- Undistorted streams can be requested by setting `i_undistorted` and choosing `i_resize_mode`
- Default size for camera outputs has been updated to 640x400 for improved performance
- RGBD node has been added for publishing RGBD pointclouds with improved performance (no external nodes are needed)
- XLinkOut nodes are no longer necessary
- TF frames for base_frame and the parent frame have been changed to NodeName and `oak_parent_frame` respectively
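Since the `camera` → `driver` rename and the move of pipeline-creation parameters affect existing parameter files, here is a rough before/after sketch. The exact V2 keys shown are from memory and illustrative only; the `pipeline_gen` keys follow the parameter names listed in the changes, and `i_device_id` is the documented replacement for `i_mx_id`:

```yaml
# Old (DepthAI V2 driver) style -- parameters grouped under `camera`.
# Exact v2 keys here are illustrative, not authoritative.
/oak:
  ros__parameters:
    camera:
      i_pipeline_type: RGBD
      i_nn_type: none

---
# New (Kilted / DepthAI V3) style -- `camera` becomes `driver`, and
# pipeline-creation parameters move to the `pipeline_gen` group.
/oak:
  ros__parameters:
    driver:
      i_device_id: ""   # replaces i_mx_id; value shown is a placeholder
    pipeline_gen:
      i_pipeline_type: rgbd
      i_nn_type: none
      i_enable_rgbd: true
```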
Usage
To use this distribution, you can wait for an official release (coming with the release of depthai-core), use the testing repository as mentioned here, or build from source; for the latter you can refer to the Kilted dockerfile.
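Until binaries are released, a containerized source build is one reproducible route. Below is a minimal Dockerfile sketch; the base image tag, dependency packages, and build steps are assumptions on my part, not taken from the official Kilted dockerfile, so verify against the one in the repository:

```dockerfile
# Sketch only -- check the official Kilted dockerfile in luxonis/depthai-ros.
FROM ros:kilted-ros-base
RUN apt-get update && apt-get install -y --no-install-recommends \
    git python3-rosdep python3-colcon-common-extensions \
    && rm -rf /var/lib/apt/lists/*
WORKDIR /ws/src
RUN git clone https://github.com/luxonis/depthai-ros.git
WORKDIR /ws
# Resolve dependencies declared in the package.xml files.
RUN apt-get update && . /opt/ros/kilted/setup.sh \
    && rosdep update && rosdep install --from-paths src --ignore-src -y \
    && rm -rf /var/lib/apt/lists/*
RUN . /opt/ros/kilted/setup.sh && colcon build --symlink-install
```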
Known issues
- Subscribing to topics in `driver` will most likely result in errors for now, as correct conversion of different frame types still needs to be added
- Segmentation networks only work on RVC2 devices for now
- For RVC4 devices, only COPY can be used as the `ImuSyncMethod`
- Driver does not yet output uncolored pointclouds
- Setting image orientation is not yet supported
- RGBD node doesn't have GPU support yet
- IR driver detection does not work properly on RVC4
Is there any chance that the depthai v3 support will be backported to ros humble?
Hi, unfortunately not since it would break existing projects. You can still try to build it manually from source though
> Hi, unfortunately not since it would break existing projects. You can still try to build it manually from source though
I've had some issues building ROS stuff from source in the past due to using Arch Linux; I ended up using a pixi environment because I could never get it working (on that note, is there a possibility that the depthai-ros driver could get published to RoboStack?). Building in a pixi environment will probably fix that for me, though.
But it would be useful to both me, and probably others, to be able to install it without having to build it from source.
Perhaps since it's a breaking change, it could be made into a separate branch? Something like a branch named humble-v3, which could then be published as a package named something like ros-humble-depthai-v3-ros?
If it's not too much difficulty, then I'm also willing to help with backporting it myself*.
*: assuming that I actually end up going with an OAK camera for my project. I'm not 100% sure on this just yet, however I'm pretty sure I'll be going with an OAK camera.
Hi, sorry, I missed the message.
> I've had some issues building ROS stuff from source in the past due to me using Arch Linux (I ended up using a pixi environment because I could never get it working. (on that note, is there a possibility that the depthai-ros driver could get published to robostack?)). Although building in a pixi environment will probably fix that for me.
I would personally recommend switching to Ubuntu/Debian for ROS development, as it has Tier 1 support for ROS packages and everything usually works out of the box; using custom package managers can also lead to clashes with libraries (though that depends on the use case, of course). As an alternative, you could use Docker to get a reproducible build.
> But it would be useful to both me, and probably others, to be able to install it without having to build it from source. Perhaps since it's a breaking change, it could be made into a separate branch? something like a branch named humble-v3 which could then be published to a package named like ros-humble-depthai-v3-ros?
While I agree that building from source can be painful, based on internal discussions and general package release policy for ROS (API breaking only for new distros), we will currently support DepthAI V3 only for Kilted release, as having multiple packages for different versions could introduce issues/confusion for users.
> I would personally recommend switching to Ubuntu/Debian for ROS development as it is the Tier 1 support regarding ROS packages and everything usually works out of the box, using custom package managers can also lead to some clashes with libraries (but that depends of course on use-case). As an alternative you could use Docker to get reproducible build.
my personal device has ~4gb of space left on it, so installing ubuntu isn't an option. also, I have everything set up in arch how I like and I really don't want to deal with gnome or having to reboot just for doing this stuff. I'm able to debug issues that arise from using arch, so I'm just dealing with it
> While I agree that building from source can be painful, based on internal discussions and general package release policy for ROS (API breaking only for new distros), we will currently support DepthAI V3 only for Kilted release, as having multiple packages for different versions could introduce issues/confusion for users.
Would it at least be possible to support Jazzy? The NVIDIA Jetson currently only supports Humble but will soon™ support Jazzy, and I have no clue when Kilted will be supported.
> my personal device has ~4gb of space left on it, so installing ubuntu isn't an option. also, I have everything set up in arch how I like and I really don't want to deal with gnome or having to reboot just for doing this stuff. I'm able to debug issues that arise from using arch, so I'm just dealing with it
Basic Kilted Docker image with depthai-ros installed is ~1.8GB so it should fit. I've also tested it and was able to view data coming from the image in ROS2 Humble environment.
> Basic Kilted Docker image with depthai-ros installed is ~1.8GB so it should fit. I've also tested it and was able to view data coming from the image in ROS2 Humble environment.
Does this mean that the depthai-kilted package would work with ROS2 Jazzy? Jazzy is way more interesting than Kilted as it is LTS and Kilted is not.
@kallegrens I also tested with Jazzy and it also works
> Basic Kilted Docker image with depthai-ros installed is ~1.8GB so it should fit. I've also tested it and was able to view data coming from the image in ROS2 Humble environment.
I tried messing around with a Docker image for ROS, and it took like 2 hours to build without even finishing, so I just gave up at that point. Docker is also quite annoying to work with for development.
@Serafadam Would you be willing to release it as a jazzy package then? Would be much appreciated in the community that only works with LTS releases.
... yeah. There really aren't many packages that work with Kilted... Shouldn't you have to test it against Jazzy and Humble anyway? Kilted goes EOL faster than Humble, so this package won't be usable after a year...
@Serafadam I 100% agree with everyone in this thread saying it should be a Jazzy package as well; there is no point in it being a Kilted package, as no one is actually going to use Kilted in the field since it is EOL next year. I see that you said it works with Jazzy, so making it installable via `apt install ros-jazzy-depthaiv3-ros` would actually make this accessible to everyone who would like to use it easily and ensure support.
`/oak/left/camera_info` is all zeros when using low bandwidth mode:
```yaml
/oak:
  ros__parameters:
    pipeline_gen:
      i_enable_sync: false
      i_nn_type: none
      i_enable_rgbd: true
      i_pipeline_type: rgbd
    stereo:
      i_depth_preset: DEFAULT
      # uncomment for better performance in RVC2 PoE devices
      i_publish_topic: false
      i_synced: false
      i_left_rect_publish_topic: false
      i_right_rect_publish_topic: false
    rgb:
      i_synced: false
      i_fps: 5.0
      i_publish_topic: true
    left: &left
      i_publish_topic: true
      i_synced: false
      i_resolution: 1200P # the only option for the AR0234 sensor (OAK-D LR)
      i_width: &stereo_width 768 # maximum is 768
      i_height: &stereo_height 480 # maximum is 480
      i_set_isp_scale: true
      i_isp_num: 2
      i_isp_den: 5
      i_low_bandwidth: true
      i_low_bandwidth_quality: 80
      i_fps: 5.0
    right: *left
```
If I change `left.i_low_bandwidth` to `false`, the camera info is correct.
Launching via `ros2 launch -a depthai_ros_driver rgbd_pcl.launch.py params_file:=/tmp/a.yaml`.
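As an aside on the config above: the `i_isp_num`/`i_isp_den` pair downscales the sensor output, and the 768x480 values follow from applying 2/5 to the AR0234's 1920x1200 (1200P) resolution. A quick sanity check in plain Python (the helper name is mine, not part of the driver):

```python
def isp_scaled(width: int, height: int, num: int, den: int) -> tuple[int, int]:
    """Apply ISP downscaling of num/den to a sensor resolution."""
    return (width * num // den, height * num // den)

# 1200P on the AR0234 is 1920x1200; scaling by 2/5 gives the
# i_width/i_height values used in the config above.
print(isp_scaled(1920, 1200, 2, 5))  # (768, 480)
```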
Example received message:
```yaml
header:
  stamp:
    sec: 0
    nanosec: 0
  frame_id: ''
height: 0
width: 0
distortion_model: ''
d: []
k: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
r: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
p: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
binning_x: 0
binning_y: 0
roi:
  x_offset: 0
  y_offset: 0
  height: 0
  width: 0
do_rectify: false
```
Camera: OAK-D-LR connected via Ethernet.
Same with the OAK-D-SR-PoE with the stereotof pipeline.
Even header.stamp is all zeros.
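For anyone hitting the same thing, a small plain-Python helper that flags such never-filled messages; the field names mirror `sensor_msgs/msg/CameraInfo`, but the function itself is mine, not part of the driver:

```python
def camera_info_is_unset(k, stamp_sec, stamp_nanosec, frame_id):
    """Heuristic: an unfilled CameraInfo has an all-zero intrinsic
    matrix K, a zero timestamp, and an empty frame_id."""
    return (all(v == 0.0 for v in k)
            and stamp_sec == 0 and stamp_nanosec == 0
            and frame_id == "")

# The message dump above matches this pattern:
print(camera_info_is_unset([0.0] * 9, 0, 0, ""))  # True
```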
The camera info problem is probably solved by https://github.com/luxonis/depthai-ros/pull/767.