helios
Confusion regarding the angle definition of the scanner
Hi,
I am trying to simulate some UAV-based LiDAR data on BIM models. I am a bit confused about the axes and the definitions of the different angles. I read through the wiki page but could not understand everything clearly. Sorry for that. I have a few questions as follows:
- For a scanner mounted on a quadcopter, what is considered the positive X-axis? Is it the direction of motion of the quadcopter or the X-axis of the scene coordinate system?
- What does the rotation of beamOrigin mean? Does it mean rotating the point from which the beam originates? If so, how is it different from headRotateStart_deg, headRotateStop_deg, verticalAngleMin_deg and verticalAngleMax_deg? To my understanding, those parameters also define the start and end of the scanner's rotation, which defines its field of view.
- On the wiki page about scanners (https://github.com/3dgeo-heidelberg/helios/wiki/Scanners), I can see two airplanes with different coordinate systems.
For the first image, it says that "If no rotation is provided, the scanner looks to the front of the platform and scans up-down:". For the second one, it says that "The x, y, z coordinates of the position of the origin of the laser beam are set in the following coordinate system:". I am wondering why both images have different coordinate systems. By rotating the beamOrigin, are we rotating the coordinate system altogether?
Sorry if the questions are a bit silly. I am kind of new to this area, so it is taking a bit of time to get familiar with all the technicalities. Any answers are appreciated. Thanks in advance.
Best Regards, Ajay
Hi @AJAY31797 ,
thanks for your questions.
- The x-axis points towards the right wing (i.e., the positive y-axis points forward). This is a so-called RIGHT-FORWARD-UP (RFU) definition of the coordinate system. Note that any rotations in the platform XML are applied afterwards and may result in a different sensor orientation.
- The rotation of the beam origin rotates the initial beam origin. This is useful, e.g., if you need the scanline to scan at +5 deg from the horizon but still rotate about the z-axis. For an example, see the Velodyne scanners, e.g. here: https://nbviewer.org/github/3dgeo-heidelberg/helios/blob/dev/example_notebooks/12-multi_scanner_puck.ipynb In contrast, headRotateStart/Stop and verticalAngleMin/Max are set per leg of the scan, in the resulting polar coordinate system.
- The first image defines the coordinate system used for the rotation of the scanner (rotations set in the scanner XML tags). The second image defines the coordinate system used for positioning the beamOrigin tag within the scanner.
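To make the distinction concrete, here is a rough sketch of where the two kinds of angles live (all values are illustrative, not taken from a real scanner definition): the beamOrigin rotation sits once in the scanner definition, while headRotateStart/Stop and verticalAngleMin/Max are per-leg scannerSettings in the survey file:

```xml
<!-- Scanner definition: the <rot> inside <beamOrigin> tilts the initial
     beam origin once, for the whole scanner. Illustrative values only. -->
<beamOrigin x="0" y="0.0" z="0.2">
    <rot axis="x" angle_deg="5"/>  <!-- e.g. scan at +5 deg from the horizon -->
</beamOrigin>

<!-- Survey definition: headRotate* and verticalAngle* are set per leg and
     act in the resulting polar coordinate system of the (already rotated)
     scanner. -->
<leg>
    <platformSettings x="0" y="0" z="50" movePerSec_m="5"/>
    <scannerSettings active="true" pulseFreq_hz="100000"
                     headRotateStart_deg="0" headRotateStop_deg="360"
                     verticalAngleMin_deg="-15" verticalAngleMax_deg="15"/>
</leg>
```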
In general, I suggest using the box survey for experiments with these settings, as you will most easily see what is going on there.
Cheers, Lukas
Hi Lukas,
Many thanks for your response. Just one more query. Based on my understanding of the description of platforms and sensors, the quadcopter always follows a linear path and performs the scan continuously along it. Is there a way to keep the quadcopter/UAV stationary and generate a point cloud of what it sees below it...something like the figure below:
(Image Credit: https://www.skylarkdrones.com/blog/make-the-most-of-your-drone-data-with-cloud-based-drone-data-processing-).
The problem I am facing is that I want to perform 3D object detection on a construction site. Although I can fly the UAV along a path and get the site point cloud, the generated point cloud will contain points captured from multiple UAV locations. So each point can have different coordinates depending on the UAV location from which it was scanned, and annotating the data for training in that case will be very difficult.
Best Regards, Ajay
Hi @AJAY31797 , the image you linked shows something similar to a photographic capture, not to a laser scanner. I'm not sure what you would expect to see from a scan where the UAV does not move over the surface, unless you (a) use a scanner like the Livox Avia that has a 2D scan pattern and (b) you expect changes in the scene over time. Typically, scanners for UAVs are line-based and just scan to the left and the right of the drone, so the forward motion is what provides you with the second dimension. See e.g. the first image here: https://www.flyability.com/lidar-drone
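To illustrate the "forward motion provides the second dimension" point, here is a small toy sketch (plain NumPy, not HELIOS; all names and values are made up for illustration) of a line scanner over flat ground:

```python
import numpy as np

# Sketch: a 2D (line) scanner sweeps the beam only across-track, so a
# stationary platform keeps hitting the same line on flat ground. The
# along-track dimension comes entirely from the platform moving.
# Illustrative values, not HELIOS parameters.

altitude = 50.0                                       # flight height [m]
swath_angles = np.deg2rad(np.linspace(-30, 30, 61))   # across-track angles

def ground_points(platform_y):
    """Ground footprint of one scan line at along-track position platform_y."""
    x = altitude * np.tan(swath_angles)   # across-track offsets on the ground
    y = np.full_like(x, platform_y)       # along-track position of the platform
    return np.column_stack([x, y])

# Stationary UAV: five scan lines, but all on the same ground line.
stationary = np.vstack([ground_points(0.0) for _ in range(5)])
print(np.unique(stationary[:, 1]))   # a single along-track value

# Moving UAV: successive scan lines at different y build a 2D footprint.
moving = np.vstack([ground_points(y) for y in np.linspace(0, 20, 5)])
print(np.unique(moving[:, 1]))       # five distinct along-track values
```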
Perhaps you can give more information about the scanner and the scenario you want to simulate?
Cheers, Lukas