matnwb
NWBDataInterface object in scratch
I am using an extension to add groups of groups under /scratch to replicate the hierarchy of the icephys extension.
The base container inherits from NWBDataInterface, yet MatNWB (but not pynwb) complains that the type is not ScratchData.
I am not sure whether this is an issue on my side or an overly strict constraint in MatNWB.
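For reference, a minimal sketch of the kind of extension spec involved (the type name and doc strings are hypothetical and not the actual ndx-icephys-meta source; only the name `data_organization` comes from the error below):

```yaml
# Sketch of a scratch-level group spec; names are illustrative only.
groups:
- neurodata_type_def: DataOrganization
  neurodata_type_inc: NWBDataInterface
  default_name: data_organization
  doc: Hypothetical container placed under /scratch.
  groups:
  - neurodata_type_inc: NWBDataInterface
    doc: Nested groups replicating the icephys hierarchy.
    quantity: '*'
```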
Example file: test_icephys_file.h5.zip
Error using types.util.checkConstraint (line 21)
Property `scratch.data_organization` should be one of type(s) {types.core.ScratchData}.
Error in types.util.checkSet>@(nm,val)types.util.checkConstraint(pname,nm,namedprops,constraints,val) (line 10)
@(nm, val)types.util.checkConstraint(pname, nm, namedprops, constraints, val));
Error in types.untyped.Set/validateAll (line 111)
obj.fcn(mk, obj.map(mk));
Error in types.util.checkSet (line 11)
val.validateAll();
Error in types.core.NWBFile/validate_scratch (line 464)
types.util.checkSet('scratch', struct(), constrained, val);
Error in types.core.NWBFile/set.scratch (line 315)
obj.scratch = obj.validate_scratch(val);
Error in types.core.NWBFile (line 197)
obj.scratch = p.Results.scratch;
Error in types.ndx_icephys_meta.ICEphysFile (line 24)
obj = [email protected](varargin{:});
Error in io.parseGroup (line 80)
parsed = eval([Type.typename '(kwargs{:})']);
Error in nwbRead (line 33)
nwb = io.parseGroup(filename, h5info(filename), Blacklist);
This appears to be a parsing issue, as a neurodata_type_def seems to take precedence over other neurodata_type_inc definitions. I'm unfortunately strapped for time, but I can at least investigate.
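The failing check can be mimicked with a minimal Python sketch (class and function names are hypothetical, not MatNWB code): the generated constraint set contains only ScratchData, so a type whose ancestry goes through NWBDataInterface is rejected even though it is a valid NWB container.

```python
# Hypothetical mock of the generated scratch constraint; not MatNWB code.
class NWBContainer: pass
class NWBDataInterface(NWBContainer): pass     # core type the extension extends
class ScratchData(NWBContainer): pass
class DataOrganization(NWBDataInterface): pass # stand-in for the extension type

def check_constraint(value, allowed=(ScratchData,)):
    # Only types in the constraint set pass; the relationship of the value's
    # own ancestry to NWBContainer is never consulted.
    if not isinstance(value, allowed):
        raise TypeError("should be one of type(s) "
                        + str([t.__name__ for t in allowed]))

check_constraint(ScratchData())           # accepted
try:
    check_constraint(DataOrganization())  # rejected, mirroring the error above
except TypeError as err:
    print(err)
```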
For now, a workaround will require inheriting from ScratchData for every type.
That said, the default spec appears to require inheriting from NWBContainer, DynamicTable, or ScratchData, so simply inheriting from NWBDataInterface won't work either, even if the previous issue is resolved:
https://github.com/NeurodataWithoutBorders/matnwb/blob/dba6d09f3fd6f142884f0d76ce5b1cb273e822a1/nwb-schema/core/nwb.file.yaml#L83-L98
As a note for the future, this will require some work, as the current generation method assumes a single HDF5 object type (i.e. Group, Link, or Dataset) for each property. Scratch data shows this is not necessarily true.
On the backend, this will require a partial rewrite of how the low-level schema analysis works.
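One way the rewritten backend could represent this (a sketch of the idea only, not MatNWB's actual design): each property constraint carries a set of allowed HDF5 primitive kinds rather than a single one, so /scratch can admit both datasets (ScratchData) and groups (NWBContainer, DynamicTable).

```python
from dataclasses import dataclass, field

# Hypothetical constraint record; names do not correspond to MatNWB internals.
@dataclass
class PropertyConstraint:
    allowed_types: set                 # neurodata type names accepted here
    hdf5_kinds: set = field(default_factory=lambda: {"Group"})

    def accepts(self, kind, type_name):
        # A value is valid only if both its HDF5 primitive kind and its
        # neurodata type are permitted by this property.
        return kind in self.hdf5_kinds and type_name in self.allowed_types

# The three types the core spec lists for scratch, spanning two HDF5 kinds.
scratch = PropertyConstraint(
    allowed_types={"ScratchData", "NWBContainer", "DynamicTable"},
    hdf5_kinds={"Group", "Dataset"},
)

print(scratch.accepts("Dataset", "ScratchData"))  # True
print(scratch.accepts("Group", "DynamicTable"))   # True
```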