H5Centry.c:1232: H5C__load_entry: Assertion `entry->size < H5C_MAX_ENTRY_SIZE' failed.
Creating a large number of small datasets fails with the assertion:
H5Centry.c:1232: H5C__load_entry: Assertion `entry->size < H5C_MAX_ENTRY_SIZE' failed.
The error occurs with the current development branch under Debian 12 x86_64 with GCC 13.2.0.
A reproducer is shown below:
#include "hdf5.h"
#include <stdio.h>

int main(void)
{
    hid_t file = H5Fcreate("why.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
    hsize_t dims[3] = {18, 18, 15};
    hid_t fspace = H5Screate_simple(3, dims, NULL);
    float data[18][18][15];
    for (size_t i = 0; i < 18; i++)
    {
        for (size_t j = 0; j < 18; j++)
        {
            for (size_t k = 0; k < 15; k++)
            {
                data[i][j][k] = (float) (i * j * k);
            }
        }
    }
#ifdef USE_SUB_GROUP
    hid_t group = H5Gcreate(file, "group", H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
#endif
    char name[20];
    /*
     * 1441790 -> file created successfully, h5stat is happy
     *
     * 1441791 -> file created successfully, h5stat barfs:
     *            h5stat: H5Centry.c:1232: H5C__load_entry: Assertion `entry->size < H5C_MAX_ENTRY_SIZE' failed.
     *
     * 1441792 -> dataset creation fails with:
     *            a.out: H5Centry.c:1232: H5C__load_entry: Assertion `entry->size < H5C_MAX_ENTRY_SIZE' failed.
     */
    for (size_t i = 0; i < 1441791; ++i)
    {
        snprintf(name, sizeof(name), "data%06zu", i); /* %zu matches size_t */
        // Using a subgroup doesn't save our bacon. Same error behavior.
#ifdef USE_SUB_GROUP
        hid_t dset = H5Dcreate(group, name, H5T_NATIVE_FLOAT, fspace, H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
#else
        hid_t dset = H5Dcreate(file, name, H5T_NATIVE_FLOAT, fspace, H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
#endif
        H5Dwrite(dset, H5T_NATIVE_FLOAT, H5S_ALL, H5S_ALL, H5P_DEFAULT, data);
        H5Dclose(dset);
    }
    H5Sclose(fspace);
#ifdef USE_SUB_GROUP
    H5Gclose(group);
#endif
    H5Fclose(file);
    return 0;
}
Hi Gerd,
Can you try increasing H5C_MAX_ENTRY_SIZE in H5Cprivate.h, and also try the latest file format with the initial value?
Elena
Good points!
Can you try to increase the size of H5C_MAX_ENTRY_SIZE in H5Cprivate.h
That works, but it is cheating.
try the latest format with the initial value?
H5Pset_libver_bounds(fapl, H5F_LIBVER_V110, H5F_LIBVER_V110);
This works (with the initial value), but it's still covering up a deeper problem.
And for this "use case," there is also a substantial performance regression.
Well... I am not sure it is cheating... The metadata cache has a maximum entry size (32 MB) that can only be increased by changing the value and recompiling the library.
The earliest file format allows local heaps to grow without bound, and apparently at some point a metadata item (a local heap) becomes bigger than 32 MB. The newer file format uses a fractal heap, which prevents that situation. I think the developers will have better insight.
Elena
We've added a fix for the cache assertion failure in 1.14.4 and will fix the underlying local heap issue in 1.14.5.