Whenever I create a larger HDF5 dataset (roughly more than 4 Lidar HD tiles for the training set), the resulting HDF5 file collapses to a few kilobytes without any explanation. RAM size might play a role: I have 32 GB, and the process has to use the swap partition to build a larger dataset. But even when it completes without any apparent error, the HDF5 file is tiny, and when I run the RandLa experiment it attempts to create the dataset again, after which an error follows.
What is happening? If it is caused by the RAM size, is there any way to work around it?
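For reference, a minimal sketch (not this repository's actual dataset builder) of one way to write a large HDF5 training set tile by tile with h5py, so only one Lidar HD tile needs to be in RAM at a time. The function `load_tile_points` and the dataset name `"points"` are hypothetical placeholders.

```python
import h5py
import numpy as np

def load_tile_points(tile_path: str) -> np.ndarray:
    """Hypothetical loader returning an (N, 4) array of x, y, z, classification."""
    raise NotImplementedError

def build_dataset(tile_paths: list[str], out_path: str) -> None:
    with h5py.File(out_path, "w") as f:
        dset = None
        for tile_path in tile_paths:
            points = load_tile_points(tile_path)
            if dset is None:
                # Resizable dataset: unlimited first axis, chunked storage.
                dset = f.create_dataset(
                    "points",
                    data=points,
                    maxshape=(None, points.shape[1]),
                    chunks=True,
                    compression="gzip",
                )
            else:
                # Grow the dataset along axis 0 and append this tile's points.
                start = dset.shape[0]
                dset.resize(start + points.shape[0], axis=0)
                dset[start:] = points
            # Flush after each tile so a crash or OOM kill does not leave
            # a nearly empty file on disk.
            f.flush()
```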