pytorch-vdsr
MATLAB script `generate_train.m` generates h5 file, PyTorch load error
Hello, I used your MATLAB script generate_train.m to generate the h5 file, but PyTorch raises an error when loading it.
Can you guide me? Thanks.
Namespace(batchSize=128, clip=0.4, cuda=True, lr=0.1, momentum=0.9, nEpochs=50, pretrained='', resume='', seed=0, start_epoch=1, step=10, threads=8, trainSet='/data2/tmps/vdsr-train-data-291.h5', weight_decay=0.0001)
==> Random seed 6133
==> Loading datasets
==> Building model
==> Setting GPU
===> Setting Optimizer
===> Training
Epoch=1, lr=0.1
Traceback (most recent call last):
File "main.py", line 35, in <module>
todo_train(opt)
File "main.py", line 30, in todo_train
training(opt)
File "/data2/slyx-sr/src/train.py", line 71, in training
loss_curr = train(training_data_loader, optimizer, model, criterion, epoch, opt)
File "/data2/slyx-sr/src/train.py", line 93, in train
for iteration, batch in enumerate(training_data_loader, 1):
File "/home/hejian/anaconda3/lib/python3.5/site-packages/torch/utils/data/dataloader.py", line 201, in __next__
return self._process_next_batch(batch)
File "/home/hejian/anaconda3/lib/python3.5/site-packages/torch/utils/data/dataloader.py", line 221, in _process_next_batch
raise batch.exc_type(batch.exc_msg)
OSError: Traceback (most recent call last):
File "/home/hejian/anaconda3/lib/python3.5/site-packages/torch/utils/data/dataloader.py", line 40, in _worker_loop
samples = collate_fn([dataset[i] for i in batch_indices])
File "/home/hejian/anaconda3/lib/python3.5/site-packages/torch/utils/data/dataloader.py", line 40, in <listcomp>
samples = collate_fn([dataset[i] for i in batch_indices])
File "/data2/slyx-sr/src/dataset.py", line 14, in __getitem__
return torch.from_numpy(self.data[index,:,:,:]).float(), torch.from_numpy(self.target[index,:,:,:]).float()
File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper (/home/ilan/minonda/conda-bld/h5py_1496916508360/work/h5py/_objects.c:2846)
File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper (/home/ilan/minonda/conda-bld/h5py_1496916508360/work/h5py/_objects.c:2804)
File "/home/hejian/anaconda3/lib/python3.5/site-packages/h5py/_hl/dataset.py", line 494, in __getitem__
self.id.read(mspace, fspace, arr, mtype, dxpl=self._dxpl)
File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper (/home/ilan/minonda/conda-bld/h5py_1496916508360/work/h5py/_objects.c:2846)
File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper (/home/ilan/minonda/conda-bld/h5py_1496916508360/work/h5py/_objects.c:2804)
File "h5py/h5d.pyx", line 181, in h5py.h5d.DatasetID.read (/home/ilan/minonda/conda-bld/h5py_1496916508360/work/h5py/h5d.c:3413)
File "h5py/_proxy.pyx", line 130, in h5py._proxy.dset_rw (/home/ilan/minonda/conda-bld/h5py_1496916508360/work/h5py/_proxy.c:2008)
File "h5py/_proxy.pyx", line 84, in h5py._proxy.H5PY_H5Dread (/home/ilan/minonda/conda-bld/h5py_1496916508360/work/h5py/_proxy.c:1656)
OSError: Can't read data (Wrong b-tree signature)
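For context, "Wrong b-tree signature" is the error HDF5 reports when reads go through a corrupted or concurrently shared file handle. Here it is most likely triggered because the h5py file is opened once in the Dataset constructor and then read from several DataLoader worker processes at the same time. Based on the traceback, dataset.py looks roughly like the sketch below (the class name and the 'data'/'label' keys are assumptions):

```python
import torch
import torch.utils.data as data
import h5py

class DatasetFromHdf5(data.Dataset):
    """Sketch of the dataset used above: a single h5py handle opened in __init__."""
    def __init__(self, file_path):
        super(DatasetFromHdf5, self).__init__()
        hf = h5py.File(file_path, 'r')
        self.data = hf.get('data')     # h5py Dataset objects, not numpy arrays:
        self.target = hf.get('label')  # every __getitem__ reads from the file

    def __getitem__(self, index):
        # With threads > 1, several worker processes read through this one
        # shared handle concurrently, which HDF5 does not support -> OSError above.
        return torch.from_numpy(self.data[index, :, :, :]).float(), \
               torch.from_numpy(self.target[index, :, :, :]).float()

    def __len__(self):
        return self.data.shape[0]
```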
@flystarhe Please set threads=1
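If it helps, threads is what gets passed to the DataLoader as num_workers, so threads=1 means only one worker process touches the h5py handle (a sketch, assuming the variable names in this repo):

```python
from torch.utils.data import DataLoader

# With --threads 1 a single worker reads the h5 file, so the concurrent
# reads that corrupt the b-tree lookups never happen.
training_data_loader = DataLoader(dataset=train_set,
                                  num_workers=opt.threads,
                                  batch_size=opt.batchSize,
                                  shuffle=True)
```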
@twtygqyy thanks, but ..
Hi @flystarhe, did you find a solution for multi-threaded loading of the hdf5 file?
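A workaround commonly used for multi-worker HDF5 loading (not from the author of this repo, just a sketch of the usual pattern) is to open the file lazily inside __getitem__, so each DataLoader worker ends up with its own h5py handle instead of sharing the one created in the parent process:

```python
import torch
import torch.utils.data as data
import h5py

class LazyHdf5Dataset(data.Dataset):
    """Hypothetical variant: each DataLoader worker opens its own file handle."""
    def __init__(self, file_path):
        super(LazyHdf5Dataset, self).__init__()
        self.file_path = file_path
        self.hf = None
        # Open once just to record the length, then close before workers fork.
        with h5py.File(file_path, 'r') as hf:
            self.length = hf['data'].shape[0]

    def __getitem__(self, index):
        if self.hf is None:  # first access inside this worker process
            self.hf = h5py.File(self.file_path, 'r')
        data = torch.from_numpy(self.hf['data'][index, :, :, :]).float()
        target = torch.from_numpy(self.hf['label'][index, :, :, :]).float()
        return data, target

    def __len__(self):
        return self.length
```

Loading the whole h5 file into memory as numpy arrays in __init__ also works and avoids per-worker handles entirely, at the cost of RAM.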