Issue with Large Point Cloud (40GB+) Conversion Using PotreeConverter
We are experiencing an issue when converting large point clouds (PCLs) generated by Metashape to Potree format using PotreeConverter. Specifically, point clouds exceeding 40 GB are not fully generated, whereas smaller classified point clouds (e.g., 19 GB) convert successfully.
Details:
- Input 1: point cloud (60 GB) - conversion incomplete.
- Input 2: classified point cloud (19 GB) - conversion complete.
- Logs: no errors are logged during the conversion process.

The problem occurs only with larger point clouds. We are looking for guidance on potential causes and solutions to ensure successful conversion of large point clouds.
The underlying LasZip library does not support more points than fit in a uint32 (about 4.29 billion). See https://github.com/potree/PotreeConverter/pull/615.
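For scale, a quick sanity check of that limit against the point count reported later in this thread (6.2 billion) shows the overflow; the truncation without an error message is consistent with the incomplete output and empty logs reported above:

```python
# The legacy LAS/LAZ point-count field is an unsigned 32-bit integer.
UINT32_MAX = 2**32 - 1  # 4'294'967'295

# A 6.2-billion-point cloud exceeds that cap, so the count cannot
# be represented and the conversion comes out incomplete.
points_in_cloud = 6_200_000_000
print(f"uint32 max:    {UINT32_MAX:,}")
print(f"points needed: {points_in_cloud:,}")
print(f"overflow:      {points_in_cloud > UINT32_MAX}")
```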
Hi, thanks for replying.
Yes, the PCL has 6.2 billion points. How can we solve this problem? Can you share some approaches?
Easiest would probably be to first split it into smaller parts using PDAL for example.
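A minimal sketch of the split step, assuming PDAL is installed. The capacity value (2 billion points per part) is illustrative; anything below the uint32 limit of ~4.29 billion should work, and the exact output filenames PDAL produces may differ by version:

```shell
# Split the large cloud into parts of at most ~2 billion points each.
pdal split --capacity 2000000000 input.laz part.laz

# PDAL writes numbered outputs (e.g. part_1.laz, part_2.laz, ...);
# list them to confirm before handing them to PotreeConverter.
ls part*.laz
```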
Can you give detailed steps? Once we split, does PotreeConverter take the 2 PCLs as input arguments and output a single metadata.json from them? How will we merge the outputs from the 2 PCLs?
I successfully converted big files, but they were split beforehand. For example, 2 LAZ files of 32.8 GB and 24.3 GB:
#points: 5'733'177'381
total file size: 57.2 GB

Command: `potreeConverter ./ -o .\potree\ --encoding BROTLI`
```
=======================================
=== STATS
=======================================
#points: 5'733'177'381
#input files: 2
sampling method: poisson
chunk method: LASZIP
input file size: 57.2GB
duration: 2'541.646s
throughput (MB/s) 23MB
throughput (points/s) 2.3M
output location: D:\temp\potree2
duration(chunking-count): 188.133
duration(chunking-distribute): 354.967
duration(chunking-total): 543.138
duration(indexing): 1'997.624
```
The host is my laptop with an i9-14900HX and 48 GB of RAM.
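Putting the two replies together, an end-to-end sketch (filenames and the capacity value are illustrative; `pdal` and `PotreeConverter` are assumed to be on PATH). Note that the stats above show `#input files: 2` producing a single output location, so PotreeConverter indexes all parts as one dataset with one metadata.json and no manual merge step is needed:

```shell
# 1) Split the 60 GB cloud into parts below the uint32 point limit.
mkdir -p parts
pdal split --capacity 2000000000 huge_cloud.laz parts/part.laz

# 2) Convert the whole folder in one run; all parts are indexed into
#    a single octree with a single metadata.json.
PotreeConverter ./parts -o ./potree --encoding BROTLI
```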