[BUG] camera gets stuck with ros
Check if issue already exists
- Google it (e.g. error xy github luxonis depthai)
- Check troubleshooting in documentation.
Describe the bug A clear and concise description of what the bug is.
Minimal Reproducible Example Append the MRE to the bug report, instructions here
If available launch files don't work in your case, please check if you also get errors while running:
- `stereo_inertial_node` in `depthai_examples`
- `camera` in `depthai_ros_driver`

In case both fail to run, please check if you can run the default Python demo app.
Expected behavior A clear and concise description of what you expected to happen.
Screenshots If applicable, add screenshots to help explain your problem.
Pipeline Graph
Please also provide a screenshot of your pipeline using the DepthAI Pipeline Graph.
You can save it in `depthai_ros_driver` either by calling the `/save_pipeline` ROS service or by setting the parameter `camera.i_pipeline_dump` in ROS 2 (`camera_i_pipeline_dump` in ROS). The pipeline dump is saved to `/tmp/pipeline.json`.
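As a sketch of the service route (this assumes the driver node runs under the default `oak` name, and the service type should be verified first, since it is not stated above):

```shell
# Inspect the actual type of the save_pipeline service before calling it
ros2 service type /oak/save_pipeline

# Call it (replace std_srvs/srv/Trigger if the inspected type differs)
ros2 service call /oak/save_pipeline std_srvs/srv/Trigger "{}"

# The dump is written to /tmp/pipeline.json
cat /tmp/pipeline.json
```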
Attach system log
- Provide output of log_system_information.py
- Which OS/OS version are you using?
- Which ROS version are you using?
- Which ROS distribution are you using?
- Is `depthai-ros` built from source or installed from apt?
- Is the `depthai`/`depthai-core` library installed from rosdep or manually? For a rosdep install, check if the `ros-<rosdistro>-depthai` package is installed; a manual install can be checked with `ldconfig -p | grep depthai`
- Please include versions of the following packages: `apt show ros-$ROS_DISTRO-depthai ros-$ROS_DISTRO-depthai-ros ros-$ROS_DISTRO-depthai-bridge ros-$ROS_DISTRO-depthai-ros-msgs ros-$ROS_DISTRO-depthai-ros-driver`
- To get additional logs, set `DEPTHAI_DEBUG=1` and paste the logs, either from the command line or from the latest log in `~/.ros/log`
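For example, a minimal way to capture those debug logs with the ROS 2 driver (the log file path is illustrative):

```shell
# Enable verbose depthai logging for this shell session
export DEPTHAI_DEBUG=1

# Run the driver and keep a copy of all output
ros2 launch depthai_ros_driver camera.launch.py 2>&1 | tee /tmp/depthai_debug.log

# Alternatively, find the most recent ROS log directory afterwards
ls -t ~/.ros/log | head -n 1
```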
Additional context Add any other context about the problem here.
Hi, could you fill out the debug information?
When I try to run a ROS program with the camera (OAK-D Pro), the video from the relevant topics appears stuck, i.e. it captures frames much more slowly than any other camera. Any ideas on how to solve it?
On what distribution does this happen? What is your PC configuration and network setup? If using DDS, which one? Does this happen when running depthai_ros_driver's executable, or stereo_inertial_node? Is the repository built from source or installed from APT?
Hello, I'm having a similar problem. When I use `python3 -m depthai_viewer`, the RGB, depth and pointcloud streams seem pretty stable and fast, but when I use `ros2 launch depthai_ros_driver camera.launch.py`, the RGB, depth and pointcloud start to lag a lot.
- RMW_IMPLEMENTATION: `rmw_cyclonedds_cpp`
- depthai-core compiled manually on the `main` branch
- depthai-ros commit: 0022aa3c70084054cbe2b9d36c8ec28ca9f5b4c0
- Jetson Orin AGX 64 GB
- OAK-D-PRO-W-POE
Using the following parameters:

```yaml
/oak:
  ros__parameters:
    camera:
      i_nn_type: none
      i_pipeline_dump: false
    rgb:
      i_low_bandwidth: true
      i_resolution: 720P
    left:
      i_low_bandwidth: true
    right:
      i_low_bandwidth: true
    stereo:
      i_align_depth: true
      i_board_socket_id: 2
      i_subpixel: true
      i_right_rect_publish_topic: true
      i_right_rect_synced: false
      i_low_bandwidth: true
      i_decimation_filter_decimation_factor: 8
```
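For reference, a parameter file like the one above can be passed to the driver with the `params_file` launch argument (the path is illustrative):

```shell
ros2 launch depthai_ros_driver camera.launch.py \
  params_file:=/path/to/oak_params.yaml
```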
Pipeline:
Hi, this is most likely the result of the transport layer; unfortunately, transferring large images/pointclouds in ROS 2 has additional overhead. This can be reduced either by tuning the DDS or by using the composition mechanism (unfortunately not available for RViz at the moment). Additionally, pointclouds are currently calculated in a ROS 2 node; in the near future (ROS 2 Kilted) there will be a separate RGBD node in the DAI ecosystem that will improve performance greatly (it has the option to use multiple cores or the GPU for the calculation). Turning on sync between RGB and stereo might also help in your case, since those messages then carry the same timestamps, which helps nodes further down the chain with synchronization.
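As a rough sketch of the composition suggestion (the `depthai_ros_driver::Camera` plugin name is an assumption here; verify the available component types with `ros2 component types`), loading the driver and its consumers into one container enables intra-process communication and avoids serializing large images between processes:

```python
# composition_sketch.launch.py -- hypothetical launch file, not shipped with the driver
from launch import LaunchDescription
from launch_ros.actions import ComposableNodeContainer
from launch_ros.descriptions import ComposableNode


def generate_launch_description():
    container = ComposableNodeContainer(
        name='oak_container',
        namespace='',
        package='rclcpp_components',
        executable='component_container',
        composable_node_descriptions=[
            ComposableNode(
                package='depthai_ros_driver',
                plugin='depthai_ros_driver::Camera',  # assumed plugin name
                name='oak',
                extra_arguments=[{'use_intra_process_comms': True}],
            ),
            # Add consumer components here so they share the container's process
        ],
    )
    return LaunchDescription([container])
```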
Hi, thanks for your insights! Currently, we are using three RealSense cameras per robot, running image, depth, and pointcloud streams simultaneously. We are not using composable nodes, and our setup is based on rmw_cyclonedds_cpp. Despite the additional overhead in ROS 2, all cameras are running in real-time without issues.
We appreciate the suggestions regarding DDS tuning and synchronization between RGB and stereo. We'll keep an eye on the upcoming RGBD node in ROS 2 Kilted, as it sounds like a promising improvement for performance.
@BryanBetancur did you use USB or PoE RealSense cameras? One additional overhead can come from bandwidth: PoE cameras usually have a 1 Gbps link (unless you use an external network card). You could also try experimenting with other encoding types (the default is MJPEG, but you can also use H264); the encoding type is changed with the `i_low_bandwidth_profile` parameter (4 by default). Also, by default encoded frames are converted back into regular frames on the host; you can skip that with `i_publish_compressed: true`, although then you need to subscribe to the compressed topic separately.
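A hedged parameter sketch combining those suggestions (the only profile value confirmed above is that 4, the default, selects MJPEG; other values are placeholders to experiment with):

```yaml
/oak:
  ros__parameters:
    rgb:
      i_low_bandwidth: true
      i_low_bandwidth_profile: 4   # default (MJPEG); try other values, e.g. an H264 profile
      i_publish_compressed: true   # skip host-side decoding; subscribe to the compressed topic instead
```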
Some more information on bandwidth here: https://docs.luxonis.com/software/depthai/optimizing/