Failure when creating a larger HDF5 dataset
Whenever I create a larger HDF5 dataset (roughly more than 4 Lidar HD tiles in the training set), the resulting HDF5 file collapses to a few kilobytes without any error message. RAM size might play a role here: I have 32 GB, and the process has to use the swap partition to build a larger dataset. Even when the process completes without any apparent error, the HDF5 file is tiny, and when I then run the RandLa experiment it attempts to create the file again, after which an error follows. What is happening? If this is caused by the RAM limit, is there any way to work around it?
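For what it's worth, one generic way to keep peak RAM low is to write the HDF5 file incrementally with a resizable, chunked dataset, appending one tile at a time instead of assembling everything in memory first. This is a minimal h5py sketch, not myria3d's actual pipeline; the dataset name `points`, `NUM_FEATURES`, the chunk size, and the synthetic arrays standing in for tile data are all assumptions for illustration:

```python
import h5py
import numpy as np

NUM_FEATURES = 4  # assumed layout, e.g. x, y, z, intensity

# Stand-ins for per-tile point clouds; in practice each array would be
# read from one LAS/LAZ tile rather than generated randomly.
tiles = (np.random.rand(100_000, NUM_FEATURES).astype(np.float32) for _ in range(4))

with h5py.File("dataset.hdf5", "w") as f:
    # Resizable, chunked dataset: rows are appended tile by tile,
    # so only one tile needs to fit in RAM at a time.
    dset = f.create_dataset(
        "points",
        shape=(0, NUM_FEATURES),
        maxshape=(None, NUM_FEATURES),
        chunks=(65_536, NUM_FEATURES),
        dtype="float32",
    )
    for tile in tiles:
        start = dset.shape[0]
        dset.resize(start + tile.shape[0], axis=0)
        dset[start:] = tile
        f.flush()  # push buffered data to disk after each tile
```

Flushing after each tile also means that if the process is killed mid-run (e.g. by the OOM killer once swap is exhausted), the file keeps the tiles written so far instead of ending up truncated.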