ENUM error when running basset_train.lua
Hi Dave,
When I run basset_train.lua -job pretrained_params.txt -stagnant_t 10 er.h5, I end up getting an ENUM(50331977) error. I have tried diagnosing it, but haven't had any luck. Below are the error logs:
{
  conv_filter_sizes : { 1 : 19  2 : 11  3 : 7 }
  weight_norm : 7
  momentum : 0.98
  learning_rate : 0.002
  hidden_units : { 1 : 1000  2 : 1000 }
  conv_filters : { 1 : 300  2 : 200  3 : 200 }
  hidden_dropouts : { 1 : 0.3  2 : 0.3 }
  pool_width : { 1 : 3  2 : 4  3 : 4 }
}
seq_len: 600, filter_size: 19, pad_width: 18
seq_len: 200, filter_size: 11, pad_width: 10
seq_len: 50, filter_size: 7, pad_width: 6
nn.Sequential {
  [input -> (1) -> (2) -> (3) -> (4) -> (5) -> (6) -> (7) -> (8) -> (9) -> (10) -> (11) -> (12) -> (13) -> (14) -> (15) -> (16) -> (17) -> (18) -> (19) -> (20) -> (21) -> (22) -> (23) -> output]
  (1): nn.SpatialConvolution(4 -> 300, 19x1, 1,1, 9,0)
  (2): nn.SpatialBatchNormalization (4D) (300)
  (3): nn.ReLU
  (4): nn.SpatialMaxPooling(3x1, 3,1)
  (5): nn.SpatialConvolution(300 -> 200, 11x1, 1,1, 5,0)
  (6): nn.SpatialBatchNormalization (4D) (200)
  (7): nn.ReLU
  (8): nn.SpatialMaxPooling(4x1, 4,1)
  (9): nn.SpatialConvolution(200 -> 200, 7x1, 1,1, 3,0)
  (10): nn.SpatialBatchNormalization (4D) (200)
  (11): nn.ReLU
  (12): nn.SpatialMaxPooling(4x1, 4,1)
  (13): nn.Reshape(2600)
  (14): nn.Linear(2600 -> 1000)
  (15): nn.BatchNormalization (2D) (1000)
  (16): nn.ReLU
  (17): nn.Dropout(0.300000)
  (18): nn.Linear(1000 -> 1000)
  (19): nn.BatchNormalization (2D) (1000)
  (20): nn.ReLU
  (21): nn.Dropout(0.300000)
  (22): nn.Linear(1000 -> 164)
  (23): nn.Sigmoid
}
/home/hugheslab2/zainmunirpatel/torch/install/bin/luajit: .../zainmunirpatel/torch/install/share/lua/5.1/hdf5/ffi.lua:335: Reading data of class ENUM(50331977) is unsupported
stack traceback:
  [C]: in function 'error'
  .../zainmunirpatel/torch/install/share/lua/5.1/hdf5/ffi.lua:335: in function '_getTorchType'
  ...nmunirpatel/torch/install/share/lua/5.1/hdf5/dataset.lua:88: in function 'getTensorFactory'
  ...nmunirpatel/torch/install/share/lua/5.1/hdf5/dataset.lua:138: in function 'partial'
  /home/hugheslab2/zainmunirpatel/Basset/src/batcher.lua:39: in function 'next'
  /home/hugheslab2/zainmunirpatel/Basset/src/convnet.lua:1009: in function 'train_epoch'
  ...e/hugheslab2/zainmunirpatel/Basset//src/basset_train.lua:156: in main chunk
  [C]: in function 'dofile'
  ...atel/torch/install/lib/luarocks/rocks/trepl/scm-1/bin/th:150: in main chunk
  [C]: at 0x00406460
Epoch # 1
[zainmunirpatel@bc2 data]$
Thanks for the help!
Best, Zain
Sounds like an HDF5 compatibility issue. Check out this previous issue https://github.com/davek44/Basset/issues/40. Unfortunately, you’ll need to use HDF5 version 1.8 as discussed there.
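If it helps narrow things down, here's a rough diagnostic sketch (not part of Basset; it assumes h5py is installed, and "er.h5" is just the file from your command) that lists every dataset in the file and flags any whose stored type class is an HDF5 ENUM, which torch-hdf5 can't read:

```python
# Rough diagnostic sketch (not part of Basset): list every dataset in an HDF5
# file and flag any whose on-disk type class is ENUM, which torch-hdf5 refuses
# to read. Assumes h5py is installed; "er.h5" is the file from the command above.
import h5py


def find_enum_datasets(path):
    flagged = []
    with h5py.File(path, "r") as f:
        def visit(name, obj):
            if isinstance(obj, h5py.Dataset):
                # Inspect the stored (file-side) datatype class, not the numpy view.
                is_enum = obj.id.get_type().get_class() == h5py.h5t.ENUM
                print(name, obj.dtype, "ENUM" if is_enum else "")
                if is_enum:
                    flagged.append(name)
        f.visititems(visit)
    return flagged


if __name__ == "__main__":
    print("h5py", h5py.version.version, "built against HDF5", h5py.version.hdf5_version)
    print("enum-typed datasets:", find_enum_datasets("er.h5"))
```

I believe h5py writes numpy bool arrays as an HDF5 enum type, so if your target matrices show up flagged here, that would explain the error.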
Hi Dave,
Thanks for the response.
I already had HDF5 1.8 installed. To double-check, I ran h5dump --version and got h5dump: Version 1.8.12. However, when I start up Torch and run require 'hdf5', it gives me something similar to issue https://github.com/davek44/Basset/issues/25#issuecomment-256932374. I have attached what I get:
th> require 'hdf5' { H5Z_FILTER_CONFIG_ENCODE_ENABLED : 1 H5F_ACC_RDWR : 1 _getTorchType : function: 0x41b2fd18 H5F_OBJ_FILE : 1 H5S_ALL : 0 H5F_OBJ_GROUP : 4 C : userdata: 0x403b21f0 H5P_DEFAULT : 0 _describeObject : function: 0x41dc5718 H5Z_FILTER_NBIT : 5 _debugMode : false _getObjectType : function: 0x41dc56d8 H5F_OBJ_ALL : 31 _getObjectName : function: 0x41b2fd38 version : { 1 : 1 2 : 8 3 : 12 } H5Z_FILTER_SHUFFLE : 2 HDF5Group : {...} open : function: 0x402b8c50 H5Z_FILTER_SZIP : 4 H5F_OBJ_ATTR : 16 H5Z_FILTER_FLETCHER32 : 3 H5F_OBJ_DATATYPE : 8 debugMode : function: 0x402b8bd0 H5F_ACC_EXCL : 4 H5Z_FILTER_NONE : 0 _testUtils : { deepAlmostEq : function: 0x402b8b90 withTmpDir : function: 0x402b8af0 } _nativeTypeForTensorType : function: 0x41b2fcb0 H5F_ACC_TRUNC : 2 _config : { HDF5_INCLUDE_PATH : "/usr/include" HDF5_LIBRARIES : "/usr/lib64/libhdf5.so;/usr/lib64/libsz.so;/usr/lib64/libz.so;/usr/lib64/libdl.so;/usr/lib64/libm.so" } _loadObject : function: 0x402b8c30 DataSetOptions : {...} HDF5DataSet : {...} H5Z_FILTER_RESERVED : 256 _logger : { error : function: 0x41b86850 warn : function: 0x41b86850 debug : function: 0x41b86898 } H5F_ACC_RDONLY : 0 _fletcher32Available : function: 0x41dc5738 H5Z_FILTER_CONFIG_DECODE_ENABLED : 2 ffi : { abi : function: builtin#202 copy : function: builtin#200 errno : function: builtin#198 typeinfo : function: builtin#193 alignof : function: builtin#196 cdef : function: builtin#189 C : userdata: 0x41b8b398 cast : function: builtin#191 load : function: builtin#205 offsetof : function: builtin#197 sizeof : function: builtin#195 string : function: builtin#199 metatype : function: builtin#203 new : function: builtin#190 arch : "x64" os : "Linux" gc : function: builtin#204 fill : function: builtin#201 istype : function: builtin#194 typeof : function: builtin#192 } H5Z_FILTER_ERROR : -1 _outputTypeForTensorType : function: 0x41b8df30 H5Z_FILTER_MAX : 65535 h5t : { STD_B32LE : 50331728 IEEE_F32BE : 50331703 NATIVE_B16 : 50331694 NATIVE_HSIZE : 50331698 NO_CLASS : -1 NATIVE_UINT_FAST32 : 50331681 STD_B8LE : 50331724 STD_I64BE : 50331715 NATIVE_LONG : 50331662 NATIVE_DOUBLE : 50331691 TIME : 2 NATIVE_INT16 : 50331670 NATIVE_LLONG : 50331688 STD_I8BE : 50331709 STD_B64LE : 50331730 NATIVE_HSSIZE : 50331699 STD_B32BE : 50331729 INTEGER : 0 NATIVE_INT64 : 50331682 STD_I8LE : 50331708 STD_I32LE : 50331712 NATIVE_SCHAR : 50331656 NATIVE_INT_FAST16 : 50331674 NATIVE_INT : 50331660 BITFIELD : 4 NATIVE_UINT_LEAST16 : 50331673 NATIVE_INT_LEAST16 : 50331672 IEEE_F64LE : 50331704 NATIVE_INT_FAST64 : 50331686 NATIVE_UINT_FAST64 : 50331687 NATIVE_UINT_LEAST64 : 50331685 NATIVE_INT_LEAST64 : 50331684 ENUM : 8 NATIVE_UINT64 : 50331683 NATIVE_HBOOL : 50331701 NATIVE_ULONG : 50331663 NATIVE_INT_FAST32 : 50331680 NATIVE_HADDR : 50331697 NATIVE_UINT : 50331661 NCLASSES : 11 NATIVE_UINT_LEAST32 : 50331679 NATIVE_INT_LEAST32 : 50331678 STD_U64LE : 50331722 NATIVE_UINT32 : 50331677 NATIVE_SHORT : 50331658 NATIVE_INT32 : 50331676 VLEN : 9 ARRAY : 10 STD_U16LE : 50331718 STD_B16LE : 50331726 STD_I64LE : 50331714 NATIVE_UINT16 : 50331671 NATIVE_UINT_FAST8 : 50331669 STD_B16BE : 50331727 NATIVE_INT_FAST8 : 50331668 FLOAT : 1 REFERENCE : 7 STD_U32LE : 50331720 NATIVE_USHORT : 50331659 NATIVE_ULLONG : 50331689 NATIVE_INT8 : 50331664 IEEE_F32LE : 50331702 STD_U8BE : 50331717 NATIVE_INT_LEAST8 : 50331666 NATIVE_UINT8 : 50331665 NATIVE_B32 : 50331695 NATIVE_HERR : 50331700 NATIVE_OPAQUE : 50331736 NATIVE_LDOUBLE : 50331692 NATIVE_UINT_LEAST8 : 50331667 COMPOUND : 6 STD_REF_OBJ : 50331739 
NATIVE_UINT_FAST16 : 50331675 NATIVE_B64 : 50331696 STD_U16BE : 50331719 STD_REF_DSETREG : 50331740 NATIVE_B8 : 50331693 STD_I32BE : 50331713 IEEE_F64BE : 50331705 NATIVE_FLOAT : 50331690 NATIVE_UCHAR : 50331657 STD_U32BE : 50331721 OPAQUE : 5 STD_B64BE : 50331731 STD_U8LE : 50331716 STD_I16BE : 50331711 STD_B8BE : 50331725 STRING : 3 STD_I16LE : 50331710 STD_U64BE : 50331723 } HDF5File : {...} H5Z_FILTER_SCALEOFFSET : 6 H5Z_FILTER_DEFLATE : 1 H5F_ACC_CREAT : 16 H5F_UNLIMITED : 18446744073709551615ULL _deflateAvailable : function: 0x41dc5780 _inDebugMode : function: 0x402b8c10 H5F_OBJ_LOCAL : 32 H5S_SELECT_SET : 0 _datatypeName : function: 0x41541360 H5F_ACC_DEBUG : 8 H5F_OBJ_DATASET : 2 }
[1.3558s]
th> require 'HDF5'
...ainmunirpatel/torch/install/share/lua/5.1/trepl/init.lua:389: module 'HDF5' not found:No LuaRocks module found for HDF5
  no field package.preload['HDF5']
  no file '/home/hugheslab2/zainmunirpatel/.luarocks/share/lua/5.1/HDF5.lua'
  no file '/home/hugheslab2/zainmunirpatel/.luarocks/share/lua/5.1/HDF5/init.lua'
  no file '/home/hugheslab2/zainmunirpatel/torch/install/share/lua/5.1/HDF5.lua'
  no file '/home/hugheslab2/zainmunirpatel/torch/install/share/lua/5.1/HDF5/init.lua'
  no file '/home/hugheslab2/zainmunirpatel/Basset/src/HDF5.lua'
  no file './HDF5.lua'
  no file '/home/hugheslab2/zainmunirpatel/torch/install/share/luajit-2.1.0-beta1/HDF5.lua'
  no file '/usr/local/share/lua/5.1/HDF5.lua'
  no file '/usr/local/share/lua/5.1/HDF5/init.lua'
  no file '/home/hugheslab2/zainmunirpatel/.luarocks/lib/lua/5.1/HDF5.so'
  no file '/home/hugheslab2/zainmunirpatel/torch/install/lib/lua/5.1/HDF5.so'
  no file '/home/hugheslab2/zainmunirpatel/torch/install/lib/HDF5.so'
  no file './HDF5.so'
  no file '/usr/local/lib/lua/5.1/HDF5.so'
  no file '/usr/local/lib/lua/5.1/loadall.so'
stack traceback:
  [C]: in function 'error'
  ...ainmunirpatel/torch/install/share/lua/5.1/trepl/init.lua:389: in function 'require'
  [string "_RESULT={require 'HDF5'}"]:1: in main chunk
  [C]: in function 'xpcall'
  ...ainmunirpatel/torch/install/share/lua/5.1/trepl/init.lua:661: in function 'repl'
  ...atel/torch/install/lib/luarocks/rocks/trepl/scm-1/bin/th:204: in main chunk
  [C]: at 0x00406460
[0.0105s]
th> require 'hdf5'
{ H5Z_FILTER_CONFIG_ENCODE_ENABLED : 1 H5F_ACC_RDWR : 1 _getTorchType : function: 0x41b2fd18 H5F_OBJ_FILE : 1 H5S_ALL : 0 H5F_OBJ_GROUP : 4 C : userdata: 0x403b21f0 H5P_DEFAULT : 0 _describeObject : function: 0x41dc5718 H5Z_FILTER_NBIT : 5 _debugMode : false _getObjectType : function: 0x41dc56d8 H5F_OBJ_ALL : 31 _getObjectName : function: 0x41b2fd38 version : { 1 : 1 2 : 8 3 : 12 } H5Z_FILTER_SHUFFLE : 2 HDF5Group : {...} open : function: 0x402b8c50 H5Z_FILTER_SZIP : 4 H5F_OBJ_ATTR : 16 H5Z_FILTER_FLETCHER32 : 3 H5F_OBJ_DATATYPE : 8 debugMode : function: 0x402b8bd0 H5F_ACC_EXCL : 4 H5Z_FILTER_NONE : 0 _testUtils : { deepAlmostEq : function: 0x402b8b90 withTmpDir : function: 0x402b8af0 } _nativeTypeForTensorType : function: 0x41b2fcb0 H5F_ACC_TRUNC : 2 _config : { HDF5_INCLUDE_PATH : "/usr/include" HDF5_LIBRARIES : "/usr/lib64/libhdf5.so;/usr/lib64/libsz.so;/usr/lib64/libz.so;/usr/lib64/libdl.so;/usr/lib64/libm.so" } _loadObject : function: 0x402b8c30 DataSetOptions : {...} HDF5DataSet : {...} H5Z_FILTER_RESERVED : 256 _logger : { error : function: 0x41b86850 warn : function: 0x41b86850 debug : function: 0x41b86898 } H5F_ACC_RDONLY : 0 _fletcher32Available : function: 0x41dc5738 H5Z_FILTER_CONFIG_DECODE_ENABLED : 2 ffi : { abi : function: builtin#202 copy : function: builtin#200 errno : function: builtin#198 typeinfo :
function: builtin#193 alignof : function: builtin#196 cdef : function: builtin#189 C : userdata: 0x41b8b398 cast : function: builtin#191 load : function: builtin#205 offsetof : function: builtin#197 sizeof : function: builtin#195 string : function: builtin#199 metatype : function: builtin#203 new : function: builtin#190 arch : "x64" os : "Linux" gc : function: builtin#204 fill : function: builtin#201 istype : function: builtin#194 typeof : function: builtin#192 } H5Z_FILTER_ERROR : -1 _outputTypeForTensorType : function: 0x41b8df30 H5Z_FILTER_MAX : 65535 h5t : { STD_B32LE : 50331728 IEEE_F32BE : 50331703 NATIVE_B16 : 50331694 NATIVE_HSIZE : 50331698 NO_CLASS : -1 NATIVE_UINT_FAST32 : 50331681 STD_B8LE : 50331724 STD_I64BE : 50331715 NATIVE_LONG : 50331662 NATIVE_DOUBLE : 50331691 TIME : 2 NATIVE_INT16 : 50331670 NATIVE_LLONG : 50331688 STD_I8BE : 50331709 STD_B64LE : 50331730 NATIVE_HSSIZE : 50331699 STD_B32BE : 50331729 INTEGER : 0 NATIVE_INT64 : 50331682 STD_I8LE : 50331708 STD_I32LE : 50331712 NATIVE_SCHAR : 50331656 NATIVE_INT_FAST16 : 50331674 NATIVE_INT : 50331660 BITFIELD : 4 NATIVE_UINT_LEAST16 : 50331673 NATIVE_INT_LEAST16 : 50331672 IEEE_F64LE : 50331704 NATIVE_INT_FAST64 : 50331686 NATIVE_UINT_FAST64 : 50331687 NATIVE_UINT_LEAST64 : 50331685 NATIVE_INT_LEAST64 : 50331684 ENUM : 8 NATIVE_UINT64 : 50331683 NATIVE_HBOOL : 50331701 NATIVE_ULONG : 50331663 NATIVE_INT_FAST32 : 50331680 NATIVE_HADDR : 50331697 NATIVE_UINT : 50331661 NCLASSES : 11 NATIVE_UINT_LEAST32 : 50331679 NATIVE_INT_LEAST32 : 50331678 STD_U64LE : 50331722 NATIVE_UINT32 : 50331677 NATIVE_SHORT : 50331658 NATIVE_INT32 : 50331676 VLEN : 9 ARRAY : 10 STD_U16LE : 50331718 STD_B16LE : 50331726 STD_I64LE : 50331714 NATIVE_UINT16 : 50331671 NATIVE_UINT_FAST8 : 50331669 STD_B16BE : 50331727 NATIVE_INT_FAST8 : 50331668 FLOAT : 1 REFERENCE : 7 STD_U32LE : 50331720 NATIVE_USHORT : 50331659 NATIVE_ULLONG : 50331689 NATIVE_INT8 : 50331664 IEEE_F32LE : 50331702 STD_U8BE : 50331717 NATIVE_INT_LEAST8 : 50331666 NATIVE_UINT8 : 50331665 NATIVE_B32 : 50331695 NATIVE_HERR : 50331700 NATIVE_OPAQUE : 50331736 NATIVE_LDOUBLE : 50331692 NATIVE_UINT_LEAST8 : 50331667 COMPOUND : 6 STD_REF_OBJ : 50331739 NATIVE_UINT_FAST16 : 50331675 NATIVE_B64 : 50331696 STD_U16BE : 50331719 STD_REF_DSETREG : 50331740 NATIVE_B8 : 50331693 STD_I32BE : 50331713 IEEE_F64BE : 50331705 NATIVE_FLOAT : 50331690 NATIVE_UCHAR : 50331657 STD_U32BE : 50331721 OPAQUE : 5 STD_B64BE : 50331731 STD_U8LE : 50331716 STD_I16BE : 50331711 STD_B8BE : 50331725 STRING : 3 STD_I16LE : 50331710 STD_U64BE : 50331723 } HDF5File : {...} H5Z_FILTER_SCALEOFFSET : 6 H5Z_FILTER_DEFLATE : 1 H5F_ACC_CREAT : 16 H5F_UNLIMITED : 18446744073709551615ULL _deflateAvailable : function: 0x41dc5780 _inDebugMode : function: 0x402b8c10 H5F_OBJ_LOCAL : 32 H5S_SELECT_SET : 0 _datatypeName : function: 0x41541360 H5F_ACC_DEBUG : 8 H5F_OBJ_DATASET : 2 }
This made me think that I might not have installed https://github.com/davek44/torch-hdf5.git properly. I reinstalled it but still get the same ENUM error when I run basset_train.lua. Here are my installation logs for torch-hdf5. Does this look like what a successful install should produce?
[zainmunirpatel@bc2 torch-hdf5]$ luarocks make
cmake -E make_directory build; cd build; cmake .. -DCMAKE_BUILD_TYPE=Release -DCMAKE_PREFIX_PATH="/home/hugheslab2/zainmunirpatel/torch/install/bin/.." -DCMAKE_INSTALL_PREFIX="/home/hugheslab2/zainmunirpatel/torch/install/lib/luarocks/rocks/hdf5/0-0"; make
-- The C compiler identification is GNU 4.8.5
-- The CXX compiler identification is GNU 4.8.5
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Found Torch7 in /home/hugheslab2/zainmunirpatel/torch/install
-- Found HDF5: /usr/lib64/libhdf5.so;/usr/lib64/libsz.so;/usr/lib64/libz.so;/usr/lib64/libdl.so;/usr/lib64/libm.so (Required is at least version "1.8")
-- Configuring done
-- Generating done
-- Build files have been written to: /home/hugheslab2/zainmunirpatel/Basset/src/torch-hdf5/build
cd build && make install
Install the project...
-- Install configuration: "Release"
-- Generating /home/hugheslab2/zainmunirpatel/torch/install/lib/luarocks/rocks/hdf5/0-0/lua/hdf5/config.lua
-- Installing: /home/hugheslab2/zainmunirpatel/torch/install/lib/luarocks/rocks/hdf5/0-0/lua/hdf5/ffi.lua
-- Installing: /home/hugheslab2/zainmunirpatel/torch/install/lib/luarocks/rocks/hdf5/0-0/lua/hdf5/dataset.lua
-- Installing: /home/hugheslab2/zainmunirpatel/torch/install/lib/luarocks/rocks/hdf5/0-0/lua/hdf5/group.lua
-- Installing: /home/hugheslab2/zainmunirpatel/torch/install/lib/luarocks/rocks/hdf5/0-0/lua/hdf5/testUtils.lua
-- Installing: /home/hugheslab2/zainmunirpatel/torch/install/lib/luarocks/rocks/hdf5/0-0/lua/hdf5/file.lua
-- Installing: /home/hugheslab2/zainmunirpatel/torch/install/lib/luarocks/rocks/hdf5/0-0/lua/hdf5/datasetOptions.lua
-- Installing: /home/hugheslab2/zainmunirpatel/torch/install/lib/luarocks/rocks/hdf5/0-0/lua/hdf5/init.lua
Updating manifest for /home/hugheslab2/zainmunirpatel/torch/install/lib/luarocks/rocks
hdf5 0-0 is now built and installed in /home/hugheslab2/zainmunirpatel/torch/install/ (license: BSD)
Thanks for the help and let me know if you think there is some other issue.
Best, Zain
The last thing I can think of is that you also have DeepMind's version of torch-hdf5 installed, just like Sam in https://github.com/davek44/Basset/issues/25. Maybe uninstall torch-hdf5 entirely to make sure you have a clean slate before installing my version.
Otherwise, I'm not sure I'm going to be able to figure this one out. So, I'll prioritize implementing Basset-like behavior in the Basenji package to give you a path forward.
Hi, I have Basset-like behavior implemented in Basenji now. Check out https://github.com/calico/basenji/tree/master/manuscripts/basset
Hi Dave, Thanks a lot for implementing this. I really appreciate it! I will check it out. Best, Zain
Hi, I have nearly the same problem as you, but I can't figure out what's going wrong. I reinstalled HDF5 1.8.18, but I still get the ENUM error. Any ideas would be appreciated!
The error comes from your .h5 file. Before running "seq_hdf5.py", make sure your h5py version is correct. Run: conda install -c conda-forge h5py==2.6.0 and everything will be OK!
Update: the src files are different from the ones deposited in the Docker image.
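If regenerating the file isn't convenient, a one-off conversion of an existing .h5 file may also work. Below is a rough sketch, not part of Basset: the file names are placeholders, and it assumes the bool-typed datasets written by newer h5py (stored on disk as HDF5 enums) are what trigger the ENUM error.

```python
# Rough one-off conversion sketch (not part of Basset): copy an existing .h5
# file, casting any bool-typed dataset (which h5py stores as an HDF5 ENUM) to
# uint8 so torch-hdf5 can read it. File names below are placeholders.
import h5py
import numpy as np


def rewrite_bool_as_uint8(src_path, dst_path):
    with h5py.File(src_path, "r") as src, h5py.File(dst_path, "w") as dst:
        def copy(name, obj):
            if not isinstance(obj, h5py.Dataset):
                return
            data = obj[...]
            if data.dtype == np.bool_:
                # Values stay 0/1; only the stored datatype changes.
                data = data.astype("uint8")
            dst.create_dataset(name, data=data)
        src.visititems(copy)


rewrite_bool_as_uint8("er.h5", "er_uint8.h5")
```

After writing the converted file, point basset_train.lua at it instead of the original.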