mattorch
Saving big .mat file
Hi,
I just found an issue while saving > 2 GB of data to a .mat file:

A = torch.randn(2000, 200000)
mattorch.save('test.mat', {A = A})
Use mattorch.save('file.mat',variable,'-v7.3')
I hope this will work.
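For context, the default MAT-file format (-v5) caps each variable at roughly 2 GB, while '-v7.3' selects an HDF5-based format without that limit. A minimal sketch of the suggestion above, assuming mattorch passes the extra flag through to the MAT API:

```lua
-- Sketch only: assumes mattorch accepts a MAT-file version flag
-- as a third argument, as suggested in this thread.
require 'mattorch'

-- 2000 x 200000 doubles is ~3.2 GB, past the -v5 per-variable limit.
local A = torch.randn(2000, 200000)

-- '-v7.3' requests the HDF5-based MAT format, which supports
-- variables larger than 2 GB.
mattorch.save('test.mat', {A = A}, '-v7.3')
```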
I still get the same error: C++ exception
I am also unable to save large (> 2 GB) files to .mat
I don't get any error, and the command
mattorch.save('data.mat', variable)
executes fine, but on inspecting the resulting .mat file I find that it contains no data. It's empty.
@kmul00 have you tried: mattorch.save('file.mat',variable,'-v7.3')
@soumith Yeah, I did. But that also didn't serve the purpose.
Also, to make sure, I checked the memory usage. RAM isn't the problem.
@kmul00 ugh! that sucks. Maybe try this function/lib https://github.com/soumith/matio-ffi.torch#save-a-tensor-or-a-set-of-tensors-to-a-mat-file
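Per the matio-ffi.torch README linked above, the usage is roughly the following (a sketch; matio writes through libmatio rather than MATLAB's own MAT API):

```lua
-- Sketch based on the matio-ffi.torch README: save a named set of
-- tensors to a .mat file via libmatio.
local matio = require 'matio'

local A = torch.randn(2000, 200000)

-- Saves each table entry as a MATLAB variable of the same name.
matio.save('test.mat', {A = A})
```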
I don't have MATLAB anymore to help.
you could also load hdf5 files in matlab: https://github.com/deepmind/torch-hdf5
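The torch-hdf5 route would look roughly like this: write the tensor to an HDF5 file from Torch, then read it in MATLAB with the built-in h5read. A sketch (dataset path '/data' is an arbitrary choice; note that MATLAB may report the dimensions in reverse order, since HDF5 is row-major and MATLAB is column-major):

```lua
-- Sketch using torch-hdf5 (github.com/deepmind/torch-hdf5).
local hdf5 = require 'hdf5'

local A = torch.randn(2000, 200000)

-- Write the tensor to an HDF5 file under the dataset path '/data'.
local f = hdf5.open('data.h5', 'w')
f:write('/data', A)
f:close()
```

Then, in MATLAB:

```matlab
% Read the dataset back; dimensions may appear transposed
% relative to the Torch tensor.
data = h5read('data.h5', '/data');
```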
@soumith Thanks for the hdf5 link. I'll have a look at it.
Also, previously I used matio. I found it advantageous over mattorch since it allowed me to save any type of data (float, double, byte) to MATLAB format, unlike mattorch, which needs the data converted to double first, taking up huge chunks of memory. But a few days ago I discovered that while saving torch data to MATLAB format using matio, I was losing a considerable amount of data: the values were simply becoming zero. It seemed to be a bug in matio (this happens only when the data is large). I didn't have the time back then to investigate, reproduce the error, and raise an issue, so I quickly switched to mattorch.
Thanks for bringing it up. I will try to reproduce the error and open an issue in the matio repository.