Using Kinect instead of lidar?
In your example code you use `lidar = myLidarModel()` and then, in the loop, `scan = readLidar()`. Is there any chance of replacing that with calls to Kinect depth data?
That sounds like an excellent new feature, but it would unfortunately involve quite a bit of work. Currently, BreezySLAM is set up to work with a rotating lidar that returns a sequence of distances, each associated with a horizontal scan angle. This is pretty much the opposite of Kinect, which returns a 2D image (vertical, horizontal) of distances based on a fixed-orientation scan.
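For context, a rough sketch of the kind of scan a rotating lidar produces and BreezySLAM consumes: one distance reading (in millimeters) per horizontal angle, in scan order. The scan size and the use of 0 as a "no detection" value here are illustrative assumptions, not the exact values for any particular sensor model.

```python
# A minimal sketch (values invented) of a single lidar revolution as a
# flat list of distances, one per horizontal angle. This is the shape
# of data a SLAM update step would consume; 0 marks "no detection".
SCAN_SIZE = 360  # readings per revolution; real units vary by sensor

# Fake a full revolution: everything 2 m away, with a gap at 90-100 deg.
scan_mm = [2000] * SCAN_SIZE
for angle in range(90, 100):
    scan_mm[angle] = 0  # no return for these angles

# In BreezySLAM this list would then be fed to something like
# slam.update(scan_mm) once per revolution.
```

The key contrast with the Kinect is that this list sweeps through heading angles over time, whereas a depth image captures many rays at once from a single fixed heading.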
Hi Simon, thanks for your answer. Assuming I have the depth map, I could generate the distance/angle data for a single vertical row and fill the missing angles with `None` distances? Since I have a movement mode in which the whole robot rotates 360 degrees, I could (with a big time lag) emulate the lidar? Would it help that I could also provide the distance data for each of the 240 vertical angles too? Can you point me to a description of the data structure and values I would need to provide to emulate the lidar data, or do you consider this a non-working approach?

Regards, Juerg
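A hedged sketch of the idea described above: take one horizontal row of a Kinect depth image, convert each pixel column to a (ray angle, ray distance) pair, and drop it into a 360-entry scan where all unseen angles stay at a "no detection" value. The field of view, resolution, and the helper name `kinect_row_to_scan` are all assumptions for illustration; this is not BreezySLAM API.

```python
import math

# Hypothetical conversion of one Kinect depth-image row into an emulated
# lidar scan. FOV/resolution values are approximate assumptions.
KINECT_WIDTH = 640       # depth image width in pixels (assumed)
KINECT_HFOV_DEG = 57.0   # approximate Kinect horizontal field of view
SCAN_SIZE = 360          # one reading per degree in the emulated scan

def kinect_row_to_scan(depth_row_mm, robot_heading_deg=0.0):
    """Map one horizontal row of Kinect z-depths (mm) onto a 360-entry
    scan; angles the camera cannot see stay 0 ("no detection")."""
    scan = [0] * SCAN_SIZE
    # Focal length in pixels, derived from the horizontal FOV.
    fx = (KINECT_WIDTH / 2) / math.tan(math.radians(KINECT_HFOV_DEG / 2))
    for u, z in enumerate(depth_row_mm):
        if z <= 0:  # Kinect reports 0 for invalid pixels
            continue
        # Angle of this pixel column relative to the optical axis.
        ray_angle = math.degrees(math.atan((u - KINECT_WIDTH / 2) / fx))
        # Convert z-depth (along the optical axis) to distance along the ray.
        distance = z / math.cos(math.radians(ray_angle))
        bucket = int(round(robot_heading_deg + ray_angle)) % SCAN_SIZE
        scan[bucket] = int(distance)
    return scan

# One fake row: a flat wall 1.5 m in front of the sensor.
scan = kinect_row_to_scan([1500] * KINECT_WIDTH)
```

To emulate a full revolution as suggested, one would call this with the current `robot_heading_deg` at each rotation step and merge the nonzero entries into one accumulated scan; the big time lag means odometry drift during the rotation would smear the scan, which is probably the main practical obstacle.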