Draco crashes on big datasets located far away from the origin
Hello!
I am not sure whether this is a known problem, but I ran into an issue while trying to use Draco (current master) on large point clouds that are also located far from the origin (i.e., their coordinates have large values).
Here is an example file (the link will expire in a few days, but I can resend it if necessary):
https://filesender.renater.fr/?s=download&token=b69c47b6-7266-4724-81f5-4f0b4dcb595e
The error message I got:
```
[kpluta@deepsat Downloads]$ draco_encoder -point_cloud -i Laserscan_2018_cut.ply -o out.drc
terminate called after throwing an instance of 'std::length_error'
  what():  vector::reserve
Aborted (core dumped)
```
PS: The file comes from a public dataset related to the publication "Adaptive feature-conserving compression for large scale point clouds" by Eickeler et al.
@copyme Your input file is too large for Draco to load into memory. The exception is being thrown when attempting to allocate storage to read the input file.
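For reference, this matches the standard behavior of `std::vector`: `reserve()` throws `std::length_error` when the requested element count exceeds `max_size()`. A minimal standalone sketch (not Draco code) that terminates with the same message as the log above:

```cpp
#include <vector>

int main() {
  std::vector<char> buffer;
  // reserve() throws std::length_error when the requested element
  // count exceeds max_size(); left uncaught, the program terminates
  // with the same "what():  vector::reserve" message as above.
  buffer.reserve(buffer.max_size() + 1);
}
```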
@tomfinegan That is what I was worried about. Are there any plans to mitigate this limitation?
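In the meantime, one possible workaround is to split the cloud into smaller tiles and run `draco_encoder` on each tile separately. A minimal sketch of that idea (`SplitIntoTiles` is a hypothetical helper, not part of the Draco API, and it assumes the points have already been parsed into memory):

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

struct Point { double x, y, z; };  // minimal point type for the sketch

// Hypothetical helper: split a cloud into tiles of at most max_points
// points each, so every tile stays within the allocation limit and
// can be written out and compressed independently.
std::vector<std::vector<Point>> SplitIntoTiles(
    const std::vector<Point>& cloud, std::size_t max_points) {
  std::vector<std::vector<Point>> tiles;
  for (std::size_t i = 0; i < cloud.size(); i += max_points) {
    const std::size_t end = std::min(cloud.size(), i + max_points);
    tiles.emplace_back(cloud.begin() + i, cloud.begin() + end);
  }
  return tiles;
}
```

Each tile could then be written back to a PLY file and passed to `draco_encoder` one at a time.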