
CityJSON creation silently fails on big datasets

Open vvmruder opened this issue 5 years ago • 2 comments

Trying out 3dfier on a dataset of approximately 6000 polygon features and LAS data of around 400 million points ends in a silent exit of 3dfier with nothing but a std::bad_alloc message. Tracing this down leads to the statically packaged JSON library https://github.com/tudelft3d/3dfier/blob/master/thirdparty/nlohmann-json/json.hpp. As far as I can interpret the output, the whole process up to file creation ran correctly.

I tried this on a machine with 16 GB of RAM, and memory is clearly the bottleneck. Still, I think there could be some improvement: maybe write the file in steps rather than in one dump? Or at least catch the failure a bit better and give some hint as to why it fails.
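A minimal sketch of both ideas, assuming the standard nlohmann/json include path (the function name and error message are hypothetical, not 3dfier's actual output code):

```cpp
#include <fstream>
#include <iostream>
#include <new>      // std::bad_alloc
#include <string>
#include <nlohmann/json.hpp>

// Hypothetical wrapper: report allocation failures instead of exiting
// silently, and stream the JSON instead of building a giant string.
bool write_cityjson(const nlohmann::json& j, const std::string& path) {
  try {
    std::ofstream ofs(path);
    // operator<< serializes directly into the stream buffer, avoiding the
    // second full-size string that j.dump() would have to allocate.
    ofs << j;
    return ofs.good();
  } catch (const std::bad_alloc&) {
    std::cerr << "Out of memory while serializing CityJSON to " << path
              << "; try a smaller extent or a machine with more RAM.\n";
    return false;
  }
}
```

Note that `operator<<` still needs the complete `json` object in memory; it only avoids the extra full-size buffer that `dump()` allocates, so truly fixing the 16 GB case would mean writing the file in pieces.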

Running the same job on a machine with 48 GB of RAM gets through the whole process and produces a proper output file.

vvmruder · Nov 04 '19 17:11

I haven't run into this issue yet. I have created datasets with more than 6000 polygons as CityJSON output before, but not many. What was the output size of the CityJSON file on the 48 GB RAM machine? Large file output with CityGML is fine, since we implemented it ourselves by writing everything to a file stream. I'm not sure how the nlohmann json package does the trick; I imagine it keeps the complete object in memory and writes to disk once at the end.
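For illustration, a rough sketch of that file-stream approach applied to CityJSON, writing one CityObject at a time (the `Feature` struct and the per-object JSON shape are hypothetical, and it glosses over the shared `vertices` array that CityJSON geometries reference by index, which would need the same chunked treatment):

```cpp
#include <fstream>
#include <string>
#include <vector>
#include <nlohmann/json.hpp>

// Hypothetical per-feature record; 3dfier's real types differ.
struct Feature {
  std::string id;
  nlohmann::json geometry;
};

// Write the CityJSON envelope by hand and serialize one CityObject at a
// time, so only a single feature's JSON is ever held in memory.
void write_cityjson_streamed(const std::vector<Feature>& features,
                             const std::string& path) {
  std::ofstream ofs(path);
  ofs << R"({"type":"CityJSON","version":"1.0","CityObjects":{)";
  bool first = true;
  for (const auto& f : features) {
    nlohmann::json obj = {{"type", "Building"},
                          {"geometry", f.geometry}};  // assumed shape
    if (!first) ofs << ',';
    ofs << '"' << f.id << "\":" << obj;  // emits "<id>": {...}
    first = false;
  }  // obj is destroyed each iteration, freeing its memory
  ofs << R"(},"vertices":[]})";  // vertices glossed over in this sketch
}
```

Peak memory then stays around one feature's JSON plus the file buffer, instead of the whole model.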

tcommandeur · Nov 06 '19 10:11

The resulting file was around 1.9 GB.

vvmruder · Nov 06 '19 10:11