apriltags
Explanation of how to use `apriltags` for beginners.
Copied from comment here: https://github.com/personalrobotics/apriltags/issues/8#issuecomment-234775839
@Asukamax commented:
Excuse me, I'm a newbie and I don't know how to use this package. I was able to build it and run the launch file successfully, but when I run apriltags.launch it seems to do nothing. Should I run some other node to activate my camera and publish the image information to a topic? If so, which package should I use?
Thanks a lot
@asukamax: You are correct: the apriltags node only processes image data; it does not connect to a camera itself. You need to run one of several camera driver ROS nodes to get data from your camera and publish it on ROS topics.
The exact driver you need to use depends on your camera. For USB cameras, two common options are:
- http://wiki.ros.org/usb_cam
- http://wiki.ros.org/uvc_camera
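For example, a minimal launch file for usb_cam could look like the sketch below (the device path, resolution, and pixel format are assumptions; adjust them for your hardware):

```xml
<!-- Sketch of a minimal usb_cam launch file. The device path, resolution
     and pixel format below are assumptions; adjust them for your camera.
     The driver publishes its topics under the node's namespace, e.g.
     /usb_cam/image_raw and /usb_cam/camera_info. -->
<launch>
  <node name="usb_cam" pkg="usb_cam" type="usb_cam_node" output="screen">
    <param name="video_device" value="/dev/video0"/>
    <param name="image_width" value="640"/>
    <param name="image_height" value="480"/>
    <param name="pixel_format" value="yuyv"/>
    <param name="camera_frame_id" value="usb_cam"/>
  </node>
</launch>
```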
The convention for camera drivers on ROS is to publish a few standard topics, such as:
- /image_raw - an unmodified camera image
- /image_mono - a black and white version of the image
- /camera_info - information about the camera calibration
Once your camera node is publishing these, you need to rectify the image to remove the lens distortion introduced by the camera not being physically ideal. ROS provides another node for this in the image_proc package:
http://wiki.ros.org/image_proc
This node will take in one of the above images, and use the information in /camera_info to produce a new image topic called /image_rect. This will be the same image, only warped to remove the camera distortions.
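As a sketch (assuming your driver publishes image_raw and camera_info under a /camera namespace), you can run image_proc inside that namespace so it finds those topics automatically and publishes the rectified image alongside them:

```xml
<!-- Sketch: run image_proc inside the camera's namespace (assumed here to
     be "camera"). It subscribes to image_raw and camera_info in that
     namespace and publishes image_rect and other rectified topics there. -->
<launch>
  <node ns="camera" name="image_proc" pkg="image_proc" type="image_proc"/>
</launch>
```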
At this point, you can set the /camera_info and /image_rect topics in the apriltags.launch file so that your camera images are fed into the Apriltags node. The output will be a stream of detections, one message per frame.
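As an illustration only (the argument and topic names that apriltags.launch actually uses may differ, so check the launch file in this repository), the wiring usually comes down to remapping the detector's inputs to your camera's rectified topics:

```xml
<!-- Illustration only: mirror the existing <node> entry in apriltags.launch.
     The "from" topic names here are assumptions and must match what that
     node actually subscribes to; the "to" side assumes the camera/image_proc
     setup sketched above. -->
<node name="apriltags" pkg="apriltags" type="apriltags" output="screen">
  <remap from="image" to="/camera/image_rect"/>
  <remap from="camera_info" to="/camera/camera_info"/>
</node>
```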
You can read through the many online ROS tutorials for setting up a camera; they should help explain more about what these nodes do and how to use them.
@psigen I was wondering about this: I seem to get better pose estimation when I don't give rectified images to the detector. My guess is that it's because the call to solvePnP takes the distortion coefficients into account (https://github.com/personalrobotics/apriltags/blob/master/src/apriltags.cpp#L118); if you feed in the rectified image, those coefficients should be zero instead.
That's a good point @andre-nguyen, we should probably document that.
Could a .bag also be provided? It would be nice to test against a known working set.
Hi, I also felt that the usage of apriltags in ROS is not well documented yet, so I put together my own example here: https://github.com/xenobot-dev/apriltags_ros Comments are welcome. :)