issue exporting NWB file after it has already been created
I would like to do the following:
- read in an nwb file
- do something with it, like create a new segmentation plane attached to an imaging plane (see the rough sketch after this list)
- and export the file again
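To make step two concrete, here is roughly what I have in mind, adapted from the ophys tutorial; the paths, sizes, and names below are placeholders rather than my actual code:
% read the file, add a plane segmentation under an ophys processing module, and re-export
nwb = nwbRead(inputFilePath);
plane_seg = types.core.PlaneSegmentation( ...
    'description', 'segmentation added after reading the file back in', ...
    'imaging_plane', types.untyped.SoftLink('/general/optophysiology/ImagingPlane'), ...
    'colnames', {'image_mask'}, ...
    'image_mask', types.hdmf_common.VectorData( ...
        'data', rand(512, 512, 1), 'description', 'image masks'), ...
    'id', types.hdmf_common.ElementIdentifiers('data', int64(0)));
img_seg = types.core.ImageSegmentation();
img_seg.planesegmentation.set('PlaneSegmentation', plane_seg);
ophys_module = types.core.ProcessingModule('description', 'optical physiology results');
ophys_module.nwbdatainterface.set('ImageSegmentation', img_seg);
nwb.processing.set('ophys', ophys_module);
nwbExport(nwb, 'modified_file.nwb');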
However, it seems I am unable to do the simpler task of just reading in the file and exporting it again. The following MATLAB code
inputFilePath = '/Users/cesar/Documents/CatalystNeuro/matNWB/EXTRACT-interface/copy_sub-F1_ses-20190407T210000_behavior+ophys.nwb';
% Load File
nwb = nwbRead(inputFilePath);
% Export File
nwbExport(nwb, 'dummy_export.nwb');
produced this error:
Error using hdf5lib2
The HDF5 library encountered an error and produced the following stack trace information:
H5G__loc_find_cb object 'data' doesn't exist
Error in H5O.open (line 27)
output = H5ML.hdf5lib2('H5Oopen',loc_id,relname,lapl_id);
Error in io.writeAttribute (line 8)
oid = H5O.open(fid, path, 'H5P_DEFAULT');
Error in types.core.TimeSeries/export (line 297)
io.writeAttribute(fid, [fullpath '/data/conversion'], obj.data_conversion);
Error in types.core.ImageSeries/export (line 199)
refs = export@types.core.TimeSeries(obj, fid, fullpath, refs);
Error in types.core.TwoPhotonSeries/export (line 132)
refs = export@types.core.ImageSeries(obj, fid, fullpath, refs);
Error in types.untyped.Set/export (line 181)
refs = v.export(fid, propfp, refs);
Error in types.core.NWBFile/export (line 740)
refs = obj.acquisition.export(fid, [fullpath '/acquisition'], refs);
Error in NwbFile/export (line 61)
refs = export@types.core.NWBFile(obj, output_file_id, '/', {});
Error in nwbExport (line 35)
export(nwb(i), filename);
I tried using the same file name for exporting, but that just throws a more ambiguous error:
Error using hdf5lib2
The HDF5 library encountered an unknown error.
Error in H5D.write (line 100)
H5ML.hdf5lib2('H5Dwrite', varargin{:});
Error in io.writeDataset (line 35)
H5D.write(did, tid, sid, sid, 'H5P_DEFAULT', data);
Error in types.core.NWBFile/export (line 753)
io.writeDataset(fid, [fullpath '/file_create_date'], obj.file_create_date, 'forceChunking', 'forceArray');
Error in NwbFile/export (line 61)
refs = export@types.core.NWBFile(obj, output_file_id, '/', {});
Error in nwbExport (line 35)
export(nwb(i), filename);
I found this thread for PyNWB, and it seems the issue is resolved there with the addition of an .export method to NWBHDF5IO. Any thoughts on how to approach this in matnwb? Any tips appreciated.
Digging a bit further into this: it doesn't seem to be an issue with the file created by the ophys tutorial. Cross-version compatibility is not an issue either; the ophys tutorial file can be generated, read in, and re-exported with a different matnwb version. Perhaps it has to do with the fact that the original file has DataPipe objects while the ophys tutorial file does not. I'm going to try out another file from the DANDI archive, in case it's something funny about this particular file.
@cechava yes, this should be possible in general. The DataPipe causing problems is a good guess. Let us know if that ends up being the problem.
@bendichter indeed, it appears DataPipe is the issue. Specifically, an error appears when trying to export a file with an acquisition containing a DataPipe object as data. There is no issue exporting if the DataPipe is in another part of the file, such as the interval_trials field.
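For reference, the pattern that appears to fail looks roughly like this; the sizes, rate, and link path are placeholders rather than the contents of the actual file:
% acquisition whose data is a DataPipe; write, read back, and re-export
pipe = types.untyped.DataPipe('data', rand(32, 32, 100), 'axis', 3);
two_photon = types.core.TwoPhotonSeries( ...
    'imaging_plane', types.untyped.SoftLink('/general/optophysiology/ImagingPlane'), ...
    'starting_time', 0.0, ...
    'starting_time_rate', 30.0, ...
    'data', pipe, ...
    'data_unit', 'lumens');
nwb.acquisition.set('TwoPhotonSeries', two_photon);
nwbExport(nwb, 'with_datapipe.nwb');
nwb2 = nwbRead('with_datapipe.nwb');
nwbExport(nwb2, 'reexported.nwb');   % the re-export step is where the error shows up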
@cechava got it. Could you change the title of this issue or create a new issue and put together a minimal snippet of code that completely reproduces the error?
Closing out and creating a new issue.
There was an independent bug in my code that led me to believe the DataPipe object was the issue. It's not entirely clear what the issue is with re-exporting the DANDI file used in the original post. The error does seem to point at an issue with the data in the acquisition field. Moreover, going through the process with another DANDI-downloaded file that did not have data in the acquisition field worked without issue.
Interesting. I suggest paring down the code section by section to get a minimal snippet that reproduces the error. If you can post that and the full traceback I might be able to help you figure out what is going on.
The following snippet of code replicates the error. The file path should be replaced with the local location of the file downloaded from here:
inputFilePath = '/Users/cesar/Documents/CatalystNeuro/matNWB/EXTRACT-interface/sub-F3_ses-20190414T210000_obj-7vehp4_behavior+ophys.nwb';
nwb = nwbRead(inputFilePath);
nwbExport(nwb, inputFilePath);
The output error:
Warning: Attempted to change size of continuous dataset `/file_create_date`.
Skipping.
> In io.writeDataset (line 26)
In types.core/NWBFile/export (line 753)
In NwbFile/export (line 61)
In nwbExport (line 35)
Error using hdf5lib2
The HDF5 library encountered an unknown error.
Error in H5D.write (line 100)
H5ML.hdf5lib2('H5Dwrite', varargin{:});
Error in io.writeDataset (line 35)
H5D.write(did, tid, sid, sid, 'H5P_DEFAULT', data);
Error in types.core.NWBFile/export (line 753)
io.writeDataset(fid, [fullpath '/file_create_date'],
obj.file_create_date, 'forceChunking', 'forceArray');
Error in NwbFile/export (line 61)
refs = export@types.core.NWBFile(obj, output_file_id, '/', {});
Error in nwbExport (line 35)
export(nwb(i), filename);
Some further info:
- creating a dummy file in the manner indicated by the ophys tutorial and performing this process does not generate an error, even when the dummy file has DataPipe objects (a rough sketch of this check follows the list)
- the issue persists for different files from this DANDIset. However, there's no issue going through this process with files from this DANDIset, which doesn't have any acquisition data
- the error persists across different schema versions and MATLAB versions
- pynwb is able to read in and export the file without problem, so the issue is MATLAB-specific
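The dummy-file check mentioned in the first bullet was along these lines; all identifiers and values here are placeholders:
% minimal file with a DataPipe-backed acquisition, written and then round-tripped
dummy = NwbFile( ...
    'identifier', 'dummy', ...
    'session_description', 'round-trip test', ...
    'session_start_time', datetime(2021, 10, 1));
pipe = types.untyped.DataPipe('data', rand(100, 1), 'axis', 1);
dummy.acquisition.set('TestSeries', types.core.TimeSeries( ...
    'data', pipe, ...
    'data_unit', 'n/a', ...
    'starting_time', 0.0, ...
    'starting_time_rate', 1.0));
nwbExport(dummy, 'dummy.nwb');
reread = nwbRead('dummy.nwb');
nwbExport(reread, 'dummy_reexport.nwb');   % completes without error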
Data from this DANDIset (https://gui.dandiarchive.org/#/dandiset/000048) does have acquisition data, and the file can be read in and re-exported. I think the issue is specific to the files from the Plitt/Giocomo DANDIset.
You could try deleting components of the file with h5py; with an open h5py.File handle f, both of these work:
del f['path/to/dataset']
del f['path/to/group']
You could use that approach to pare the file down.
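If you would rather stay in MATLAB, the same paring can be done with the built-in low-level HDF5 interface; a minimal sketch, with the file name and object path as placeholders:
% remove a group or dataset link from a copy of the file using MATLAB's low-level HDF5 API
fid = H5F.open('copy_of_file.nwb', 'H5F_ACC_RDWR', 'H5P_DEFAULT');
H5L.delete(fid, '/acquisition/TwoPhotonSeries', 'H5P_DEFAULT');
H5F.close(fid);
Note that unlinking an object this way does not shrink the file, but that doesn't matter for narrowing down which component triggers the error.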
Deleting the 'TwoPhotonSeries' acquisition group allows the file to be re-exported without issue. Do you know how this file was created? I'm wondering if something about how it was created is causing this issue. I tried creating test files with expandable datasets in the acquisition group, but re-exporting those test files does not produce an error.