netcdf4-python
OSError: [Errno -101] NetCDF: HDF error when opening file
My setup: netCDF4 version 1.5.3, Python version 3.8.3, OS: Scientific Linux release 6.6 (work server, so I can't change it)
Everything works fine when I use the Python interpreter (which makes me think the file is fine) but when I run a script with
from netCDF4 import Dataset
nc = Dataset(filename, 'r')
I get:
Traceback (most recent call last):
File "path/to/script.py", line 20, in <module>
nc = Dataset(filename, 'r')
File "netCDF4/_netCDF4.pyx", line 2321, in netCDF4._netCDF4.Dataset.__init__
File "netCDF4/_netCDF4.pyx", line 1885, in netCDF4._netCDF4._ensure_nc_success
OSError: [Errno -101] NetCDF: HDF error: b'path/to/file.nc'
Are you able to attach the file here? What does ncdump -h path/to/file.nc report?
It's quite a huge file (~1 GB; not sure why it's been made this big), so I'd probably have to cut it down before attaching it. I have just tested that I can open and read the file using h5py (both interactively and not).
ncdump -h path/to/file.nc
gives:
netcdf filename {
dimensions:
time = 30 ;
lat = 122 ;
lon = 720 ;
variables:
int64 time(time) ;
time:units = "days since 2008-01-01 01:30:00" ;
time:calendar = "proleptic_gregorian" ;
double lat(lat) ;
lat:_FillValue = NaN ;
double lon(lon) ;
lon:_FillValue = NaN ;
double var(time, lat, lon) ;
var:_FillValue = NaN ;
.....
Could this be a memory issue? Maybe running interactively allows a larger maximum memory to be used? I can probably check this.
@gcaria Check whether you are perhaps trying to access the file on a network-mounted filesystem that doesn't have file locking enabled. You may try setting the environment variable HDF5_USE_FILE_LOCKING=FALSE. (Note that the capitalization of FALSE is essential.) This solved a similar error message for me. The indicator was that I was able to open the file fine on a local machine, but not when it was accessed over a mounted network volume.
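A minimal sketch of that workaround (the variable name and value come from the comment above; where to set it is my assumption): HDF5 reads the variable when the library is first initialized, so it has to be set before netCDF4 (or h5py) is imported for the first time.

```python
import os

# Must be set *before* netCDF4/h5py first load the HDF5 library;
# the capitalization of FALSE matters.
os.environ["HDF5_USE_FILE_LOCKING"] = "FALSE"

# from netCDF4 import Dataset   # import only after setting the variable
# nc = Dataset(filename, 'r')
```

Alternatively, export it in the shell (export HDF5_USE_FILE_LOCKING=FALSE) before launching Python, which avoids any import-ordering concerns.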
In my code, I also used:
from netCDF4 import Dataset
from pysteps import motion
and got "OSError: [Errno -101] NetCDF: HDF error". When I deleted from pysteps import motion, the error disappeared. The cause of the error has not been found.
Hi Giacomo,
Have you solved this issue? I am facing the same issue right now.
Just in case anybody still lands here... it appears that the above exception can also occur when the path to the file is wrong or not correctly formatted, so you might want to double-check the path. With the above example, I would guess that path/to/file.nc does not exist and that the code with the path was copy-pasted from some tutorial.
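A quick way to rule that out (check_nc_path is a hypothetical helper for illustration, not part of netCDF4): verify the path before handing it to Dataset, since a missing file can surface as this HDF error rather than a plain FileNotFoundError.

```python
import os

def check_nc_path(fname):
    # A mistyped or missing path can surface as "NetCDF: HDF error"
    # instead of a clear FileNotFoundError, so verify it up front.
    if not os.path.isfile(fname):
        raise FileNotFoundError(f"No such netCDF file: {fname}")
    return fname

# check_nc_path("path/to/file.nc")  # raises immediately if the path is wrong
```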
I was having this same issue, and it is likely not a path or memory issue. Double-check your scripts and make sure that you don't have the same file opened anywhere else. You can only have one instance of that file open (via xarray or a netCDF4 Dataset). Any time I tried to re-open the file, it gave me an error:
OSError: [Errno -101] NetCDF: HDF error: b'EDDI_1999-01-10.nc4'
If you are re-opening the netCDF file, you can try forcing a garbage-collection pass first, so that any unreferenced Dataset handles get closed:
import gc
import netCDF4 as nc
nc.Dataset("abc.nc", "r")  # handle opened without keeping a reference
gc.collect()               # collects unreferenced Datasets, closing them
a = nc.Dataset("abc.nc", "r+")
In case anybody is still looking for a solution to this error: for me it worked to remove the 'r+' mode argument from Dataset(), changing
from netCDF4 import Dataset as nc
fname1 = "path/to/netcdf"
f1 = nc(fname1, 'r+', format='NETCDF4')
to
f1 = nc(fname1, format='NETCDF4')