o3de-rgl-gem
Simulating distortion caused by rotation
The current behavior seems to be that the entire scene is captured at once, regardless of the sensor frequency. Real rotating lidar sensors collect points over time, which causes distortion when objects move relative to the sensor.
Would it be possible to simulate the distortion caused by movement? How big would the performance hit be? Would the frame rate be the limiting factor for the number of "subsections"? Point-specific time stamps in the ROS message would be a great addition to this feature.
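To illustrate what I mean by "subsections", here is a minimal sketch (not RGL or gem API, all names are hypothetical): it assumes one full 360° sweep per scan at a constant rotation rate and computes, for each azimuth slice, the time offset from the start of the scan. Each slice could then be ray-cast against the scene and sensor pose at its own capture time, and the same offset could be written into a per-point time field of the ROS 2 PointCloud2 message.

```cpp
// Hypothetical illustration of per-point scan timing for a rotating lidar.
#include <cstdio>
#include <vector>

struct TimedRay
{
    float azimuthRad;   // horizontal firing angle of the ray
    float timeOffsetS;  // seconds since the start of the scan
};

// Assumptions: one full 360-degree sweep per scan, constant rotation rate,
// rays fired in azimuth order starting at 0 rad.
std::vector<TimedRay> BuildScanTiming(int raysPerScan, float rotationHz)
{
    constexpr float kTwoPi = 6.2831853f;
    const float scanPeriodS = 1.0f / rotationHz;

    std::vector<TimedRay> rays;
    rays.reserve(raysPerScan);
    for (int i = 0; i < raysPerScan; ++i)
    {
        const float fraction = static_cast<float>(i) / raysPerScan;
        rays.push_back({ fraction * kTwoPi, fraction * scanPeriodS });
    }
    return rays;
}

int main()
{
    // 10 Hz rotation; only 8 rays shown for brevity, a real sensor fires thousands.
    for (const TimedRay& ray : BuildScanTiming(8, 10.0f))
    {
        // Each slice would be traced at (scanStartTime + ray.timeOffsetS), so a
        // moving sensor or moving objects produce the expected skew in the cloud.
        std::printf("azimuth %.3f rad -> +%.4f s\n", ray.azimuthRad, ray.timeOffsetS);
    }
    return 0;
}
```

The number of distinct time slices per scan would presumably be bounded by how many times the scene state is updated per sensor rotation, which is why I am asking whether the simulation frame rate becomes the limiting factor.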