Ryan Ly
@mavaylon1 do you think you could take this on?
@mavaylon1 Yes, this is low-priority and not that important. If you could, sometime in the next few weeks, please spend 10 minutes looking into how hard it would be to do this....
Fixed by #1198, #1212, and #1221. `HDF5IO.copy_file` is kept, but we can target its removal in HDMF 5.0.
I like the idea of passing it in to the constructor. That is effectively how LinkML uses their `tree_root` property and it makes sense. Also, the root object doesn't have...
@oruebel At least in HDF5, the name of the root container is not written. Does Zarr write the root container name to disk?
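For the HDF5 side of this, a minimal sketch (using h5py, with a hypothetical temp-file path) illustrates the point: the root group always has the fixed path name `/`, so a custom root container name has nowhere to be persisted in the file itself:

```python
import os
import tempfile

import h5py

# Hypothetical scratch file for the demo.
path = os.path.join(tempfile.mkdtemp(), "root_demo.hdf5")

# Write nothing explicitly; the root group exists implicitly in every file.
with h5py.File(path, "w"):
    pass

# Reopen and inspect the root group's name: it is always "/",
# regardless of what the in-memory root container was called.
with h5py.File(path, "r") as f:
    root_name = f["/"].name

print(root_name)
```

Any name given to the root container in memory would therefore have to be stored separately (e.g. as an attribute) to survive a round trip.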
This is true not just for extensions, but for the core schema too. For example in an NWB 2.0.2 file, in the cached spec for `ImageSeries`, the specs for the...
Another example to note: if an extension YAML file has no data types, as in this namespace file:

```yaml
- name: hdmf-experimental
  ...
  full_name: HDMF Experimental
  schema:
  - namespace: hdmf-common
...
```
This came up again in https://github.com/NeurodataWithoutBorders/nwb-schema/issues/622 The `resolve_spec` behavior in HDMF needs to be adjusted so that `dtype`, `shape`, requiredness, and other properties are inherited by the child spec if...
It is possible to create an HDF5 attribute containing multiple references.

```python
>>> import h5py
>>> myfile = h5py.File('myfile.hdf5', 'w')
>>> myfile.create_group("test_group")
>>> myfile.attrs.create("attr_of_refs", data=[myfile.ref, myfile["test_group"].ref], dtype=h5py.ref_dtype)
>>> myfile[myfile.attrs["attr_of_refs"][0]]
>>> ...
```
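The REPL session above can be written as a self-contained sketch: store two object references in one attribute and dereference them again. The file path is hypothetical; the object and attribute names follow the snippet above.

```python
import os
import tempfile

import h5py

# Hypothetical scratch file for the demo.
path = os.path.join(tempfile.mkdtemp(), "refs_demo.hdf5")

with h5py.File(path, "w") as f:
    f.create_group("test_group")

    # An attribute created with the special object-reference dtype can
    # hold a list of references to any objects in the file.
    f.attrs.create(
        "attr_of_refs",
        data=[f.ref, f["test_group"].ref],
        dtype=h5py.ref_dtype,
    )

    # Indexing the file with a stored reference returns the referenced
    # object, so the attribute acts like a list of object pointers.
    root_name = f[f.attrs["attr_of_refs"][0]].name
    group_name = f[f.attrs["attr_of_refs"][1]].name

print(root_name, group_name)
```

Dereferencing the first entry yields the root group (`/`) and the second yields `/test_group`.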
Check `container_source`