file type discovery for the parquet format
In https://github.com/zarr-developers/VirtualiZarr/pull/251#discussion_r1802972887, we've been discussing the detection of the parquet reference files.
This turns out not to be easy: while the directory usually has a .parquet suffix, the directory structure does not lend itself to quick checks.
Since the format currently does not have version information, I wonder if it would be possible to change fsspec's LazyReferenceMapper to write a small file (e.g. version.json or kerchunk.json) into the root directory of the "zarrquet" file? That would have the advantage of versioning the file format, and would also make the detection easier.
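For illustration, a minimal sketch of what I have in mind, writing the marker with plain fsspec rather than touching LazyReferenceMapper itself (the file name "kerchunk.json" and its keys are just a suggestion, not any existing API):

```python
import json

import fsspec

# Hypothetical kerchunk parquet reference directory (path is an example only)
refs_dir = "combined.parquet"

# Proposed marker content: identifies the format and versions it
marker = {"format": "kerchunk-parquet", "version": 1}

# Drop the marker file into the root of the reference directory
fs, root = fsspec.core.url_to_fs(refs_dir)
with fs.open(f"{root}/kerchunk.json", "w") as f:
    json.dump(marker, f)
```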
cc @norlandrhagen, @TomNicholas
Given the new manifests coming, I'm not sure if this is a good time to do this. A couple of thoughts:
- we already have a .zmetadata file, so versioning information can go in there
- any extra json (or other) files are against the parquet conventions
- we actually use the presence of *.json file(s) to assume "not parquet", although that distinction should really come from the input path name
> Given the new manifests coming, I'm not sure if this is a good time to do this.
I actually agree. Changing the file format would only help people who write new references in the kerchunk format, but now that icechunk exists I would recommend that people write references in that format from now on instead.
> I would recommend that people write references in that format
You still need to kerchunk-scan the original files at some point - how would you write these references?
All I'd really need is some way to decide whether a directory ending in ".parquet" is a "kerchunk reference file".
If we can rely on .zmetadata to exist, that in combination with the directory suffix may already be enough?
Otherwise I agree: if we have that, we don't really need another file and can just put the version info in there (something like {"format": "kerchunk-parquet", "version": 1, "metadata": ...} would already be sufficient).
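Roughly, the check I have in mind would look something like this (the function name is only for illustration, and it assumes the suffix-plus-.zmetadata convention holds):

```python
import fsspec


def looks_like_kerchunk_parquet(path: str) -> bool:
    # Hypothetical detection heuristic: a directory whose name ends in
    # ".parquet" and which contains a ".zmetadata" entry at its root.
    fs, root = fsspec.core.url_to_fs(path)
    root = root.rstrip("/")
    return (
        root.endswith(".parquet")
        and fs.isdir(root)
        and fs.exists(f"{root}/.zmetadata")
    )
```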
> I would recommend that people write references in that format
I don't think we'll be able to convince everyone to switch immediately, so having good tool support for the existing formats is still important.
> If we can rely on .zmetadata to exist, that in combination with the directory suffix may already be enough?
Yes, I think so
> You still need to kerchunk-scan the original files at some point - how would you write these references?
If you use VirtualiZarr, Kerchunk gets called to do the scanning, the references are then held in memory (as wrapped ManifestArrays), and you write them out straight to Icechunk. With that workflow you never write out references using the kerchunk json or parquet formats. For example:
```python
import virtualizarr as vz

# uses kerchunk's SingleHDF5ToZarr under the hood; references are now in-memory
vds = vz.open_virtual_dataset("file.nc")

# writes directly to Icechunk's on-disk format
# (icechunkstore: an icechunk store object created elsewhere)
vds.virtualize.to_icechunk(icechunkstore)
```