Charlie Zender

187 comments by Charlie Zender

@amschne my understanding is that this dataset provides initial conditions, including the spun-up 16-layer snow/firn model, and that users of it thereby avoid needing to spin up the 16-layer firn from...

Thanks for that report. NCO breaks while trying to copy the string variables. Not sure why; possibly because they are record variables that NCZarr does not yet support? In any...

NCO defines some one-liner functions that access global variables, like `nco_prg_nm_get()`, in a part of the `nco.h` header only visible to source files that include `main()`, e.g., `ncks.c`. Thus the...
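The pattern described above can be sketched in a few lines. This is a simplified illustration, not NCO's actual code: the `MAIN_PROGRAM` guard macro and the variable name `prg_nm` are hypothetical stand-ins for whatever `nco.h` really uses, and here the guard is defined in-place so the sketch compiles on its own.

```c
/* Sketch (hypothetical names): one-liner accessors defined in a header,
 * but compiled only into translation units that contain main() -- i.e.,
 * the same units where the globals they read are defined. Library sources
 * that lack the guard see only the declaration. */
#define MAIN_PROGRAM  /* in NCO-style code this would be set only in files with main() */

#ifdef MAIN_PROGRAM
static const char *prg_nm = "ncks";                 /* global defined in the main file */
const char *nco_prg_nm_get(void){ return prg_nm; }  /* one-liner accessor */
#else
const char *nco_prg_nm_get(void);                   /* other sources see only this */
#endif
```

The upshot is that a source file compiled without the guard can call the accessor but cannot accidentally define a second copy of the global.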

https://stackoverflow.com/help/how-to-ask

It sounds like you need to use `ncrcat` instead of `ncecat`. The manual explains the differences [here](http://nco.sf.net/nco.html#cnc).

If you do not have the `ncrcat` executable you can run `ncra -Y ncrcat ...` instead. Or you can copy `ncra.exe` to `ncrcat.exe`. It's one executable that behaves differently depending...
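The multi-call behavior mentioned above (one binary acting as `ncra`, `ncrcat`, or `ncea` depending on its name) can be sketched as follows. The function name and mode strings are hypothetical simplifications, not NCO internals:

```c
#include <string.h>

/* Sketch (simplified): select behavior from the name the executable was
 * invoked as, which is why copying ncra.exe to ncrcat.exe, or running
 * `ncra -Y ncrcat`, yields ncrcat behavior. */
const char *nco_mode_from_name(const char *argv0){
    const char *base = strrchr(argv0, '/');
    base = base ? base + 1 : argv0;                  /* strip directory part */
    if(strcmp(base, "ncrcat") == 0) return "concatenate records";
    if(strcmp(base, "ncea")   == 0) return "average ensembles";
    return "average records";                        /* default: ncra */
}
```

A real main() would pass `argv[0]` (or the `-Y` override) to such a dispatcher before parsing the remaining options.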

Hi All. @mauzey1 the `ncdump` symptom you report above could indeed be due to using an `ncdump` that is not linked to the zstandard library, as suggested by @durack1. First,...

That all looks nominal. The last things to check are the contents of `plugindir` and whether the `ncdump` client actually searches there. Please post the results of 1. `echo ${HDF5_PLUGIN_PATH}` and 2. `ls -l /Users/mauzey1/opt/anaconda3/envs/cmor_dev/lib/hdf5/plugin`

@durack1 Good catch. I was focused on the plugin libraries for compression/decompression because that is usually the issue. I suppose the reported lack of quantization support could also mess things...

@mauzey1 FWIW my conda-forge installation of netCDF 4.9.2 also reports `--has-quantize -> no`, and it correctly dumps Zstandard-compressed (and quantized) datasets. The library names in my installation appear identical...