koide3
Hi @huyusheng123 , We tested this package with the HDL-32E and VLP-16. I suppose you can use the HDL-64E or any other 3D LIDAR as well.
Hi, You can find the paper at the following link: https://www.researchgate.net/publication/331283709_A_Portable_3D_LIDAR-based_System_for_Long-term_and_Wide-area_People_Behavior_Measurement
Hi @libinJiang , If you use a static LIDAR, you can try ```hdl_people_tracking_static.launch``` that uses the very first input frame as the environment map.
The standard workflow of this package requires localization. But, in case the sensor is fixed, you can run it by feeding dummy static transforms to the nodelet instead of running...
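As a sketch of that idea: a dummy identity transform can be published with `static_transform_publisher` so the nodelet gets the TF it expects without localization running. The frame names below (`odom`, `velodyne`) are assumptions; match them to your own TF tree.

```xml
<!-- Hypothetical launch snippet for a fixed sensor: publish a static
     identity odom->velodyne transform instead of running hdl_localization.
     Args: x y z yaw pitch roll frame_id child_frame_id -->
<launch>
  <node pkg="tf2_ros" type="static_transform_publisher" name="dummy_odom_tf"
        args="0 0 0 0 0 0 odom velodyne" />
</launch>
```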
Here is a test bag for people tracking with a static velodyne: https://drive.google.com/open?id=1oPj6xJ0VpvGc_u31EQuGAqiVMnbA5Kar Try launching "hdl_people_tracking_static.launch" with this bag. The first frame is treated as the background cloud. ```bash rosparam...
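In case it helps, a typical ROS playback sequence for a bag like this looks as follows. The commands are standard ROS tools; the bag filename is an assumption (use whatever name the downloaded bag has).

```bash
# Use simulated time so the tracker follows the bag's timestamps.
rosparam set use_sim_time true
roslaunch hdl_people_tracking hdl_people_tracking_static.launch

# In another terminal: replay the bag and publish its clock.
rosbag play --clock hdl_people_tracking_test.bag
```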
This package requires a map cloud ("hdl_localization/data/map.pcd" in the example) to perform people tracking with a moving sensor, and in our workflow, we use hdl_graph_slam to generate the map cloud....
Yes, you need to create an environmental map with any SLAM algorithm before running this people tracking. In case you want to run it in an unmapped environment, you need...
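For reference, a mapping session with hdl_graph_slam typically ends with dumping the accumulated cloud as a PCD for this package to use. A hedged sketch (service name and fields follow the hdl_graph_slam README; the bag name and output path are assumptions):

```bash
# Build the map from your own recorded data.
roslaunch hdl_graph_slam hdl_graph_slam.launch
rosbag play --clock your_mapping_data.bag

# Once mapping is done, save the map cloud (downsampled to 5 cm):
rosservice call /hdl_graph_slam/save_map "resolution: 0.05
destination: '/tmp/map.pcd'"
```

The saved file can then replace "hdl_localization/data/map.pcd" in the example.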
Hi @Zhang-Qinghui , We mainly use 32- and 16-line Velodyne LIDARs (HDL-32E and VLP-16).
If you are referring to the video on youtube, it's a standard 32-line LIDAR, and we didn't do anything special.
Some parameters are tuned for 32-line sensors, and you may need to tweak them for yours. Please see the launch file to find the params to be tuned.