Jeff Whitaker
It's a netcdf-c lib issue - at some point it started to return a non-zero error code. We could just ignore the error code, but I'm not sure the variable...
Here is the relevant netcdf-c PR: https://github.com/Unidata/netcdf-c/pull/2231. This went into version 4.9.0.
There are other types besides VLTypes that compression will not work for - if this warning is added, it seems like it should be more general.
Yes, I would be open to a PR. Have to look at the HDF5 docs to see what datatypes support compression. Certainly all the primitive datatypes, not sure about Compound...
OK - I wasn't sure if Enum and Compound types supported compression but I guess they do. So you just need to check for variable-length types.
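For anyone following along, here is a minimal sketch (not the library's actual implementation) of the check being discussed: a hypothetical `create_compressed_variable` helper that warns and skips compression when the requested datatype is variable-length, since netcdf-c >= 4.9.0 returns an error for that combination (see the PR linked above). It assumes netCDF4 >= 1.6 for the `compression` keyword.

```python
# A minimal sketch, not library code: warn and fall back to no compression
# when the datatype is a variable-length type (including vlen strings).
import warnings
import numpy as np
import netCDF4

def create_compressed_variable(ds, name, datatype, dimensions):
    """Create a zlib-compressed variable, skipping compression for vlen types."""
    if isinstance(datatype, netCDF4.VLType) or datatype == str:
        # variable-length data cannot be compressed by HDF5/netcdf-c
        warnings.warn(f"compression not supported for variable-length type of {name!r}; "
                      "creating the variable uncompressed")
        return ds.createVariable(name, datatype, dimensions)
    return ds.createVariable(name, datatype, dimensions, compression="zlib")

ds = netCDF4.Dataset("example.nc", "w")
ds.createDimension("x", 10)
vlen_t = ds.createVLType(np.int32, "vlen_int")
v = create_compressed_variable(ds, "ragged", vlen_t, ("x",))  # warns, no compression
ds.close()
```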
Thanks! Looks good - but why did you add a new workflow to run stubtest, instead of adding the extra test to the existing workflows?
Looks like you need to add `git submodule update --init --recursive` after the `git clone` in `stubtest.yml`.
> A sidenote:
>
> * I have removed the `_netCDF4` module path from the reprs. I doubt that should affect anyone, but one could consider this a breaking change....
> > @headtr1ck please see my questions in the review
>
> I don't see a review? Are you sure you published it?

oops, just submitted it
> @jswhit any reason `__has_parallel_support__` and `__has_ncfilter__` were not imported at the `netCDF4` level (only available in `netCDF4._netCDF4`)? I imported them now, but can undo that.

No reason, just an...
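In case it helps to see what that import change enables, a quick sketch (assuming a netCDF4 build recent enough to define both flags):

```python
# Illustration of the package-level import discussed above; the flag names come
# from this thread, and whether they exist depends on the installed netCDF4 version.
import netCDF4

# previously only reachable through the compiled extension module
print(netCDF4._netCDF4.__has_parallel_support__)

# after the change they should also be available at package level
print(netCDF4.__has_parallel_support__)
print(netCDF4.__has_ncfilter__)
```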