pytorch-superpoint
Error when I use the downloaded labels magicpoint_synth20_homoAdapt100_coco_f1
load labels from: output/magicpoint_synth20_homoAdapt100_coco_f1/predictions/val
Traceback (most recent call last):
File "train4.py", line 144, in
It seems the exception comes from the PyTorch DataLoader: with shuffle enabled it builds a RandomSampler whose num_samples defaults to the length of the dataset, which here is 0.
Related pull request on the PyTorch repo: https://github.com/pytorch/pytorch/pull/74804
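For reference, the failure can be reproduced without the repo at all: any empty Dataset passed to a DataLoader with shuffle=True hits the same check. The sketch below is only an illustration; the EmptyLabels class is made up here as a stand-in for the COCO dataset when no exported label files are found, and it assumes a PyTorch version where RandomSampler validates num_samples at construction time, as in the traceback above.

```python
# Minimal sketch, assuming a PyTorch version (as in the traceback above) where
# RandomSampler validates num_samples in __init__. "EmptyLabels" is a made-up
# stand-in for the COCO dataset when no exported label files are found.
from torch.utils.data import Dataset, DataLoader

class EmptyLabels(Dataset):
    def __len__(self):
        return 0                  # no labels found -> dataset length 0

    def __getitem__(self, idx):
        raise IndexError(idx)     # never reached, the sampler fails first

# shuffle=True makes DataLoader create a RandomSampler over the dataset;
# num_samples defaults to len(dataset) == 0, which triggers:
#   ValueError: num_samples should be a positive integer value, but got num_samples=0
loader = DataLoader(EmptyLabels(), batch_size=4, shuffle=True)
```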
Make sure that the path you put in the "labels" field of the YAML config is correct. For me it is:

data:
    # name: 'coco'
    dataset: 'Coco' # 'coco'
    labels: logs/magicpoint_synth_homoAdapt_coco/predictions

And now it works.
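A quick way to confirm the path before launching training is to check that the exported prediction folders actually contain files. The snippet below is only a sanity-check sketch: the config filename, the train/val subfolder layout, and the .npz extension are assumptions about how the export step writes its output, so adjust them to your setup.

```python
# Sanity-check sketch (not part of the repo). Assumes the exported pseudo labels
# live under <labels>/<split>/*.npz and that the YAML filename below matches the
# config you pass to train4.py -- adjust both to your own setup.
from pathlib import Path
import yaml

with open("configs/superpoint_coco_train_heatmap.yaml") as f:   # assumed config file
    config = yaml.safe_load(f)

labels_root = Path(config["data"]["labels"])
for split in ("train", "val"):
    split_dir = labels_root / split
    n_files = len(list(split_dir.glob("*.npz"))) if split_dir.is_dir() else 0
    print(f"{split_dir}: {n_files} label files")
    if n_files == 0:
        print("  -> empty: the dataset will have length 0 and the DataLoader "
              "will fail with num_samples=0")
```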
Thank you very much.
load labels from: output/magicpoint_synth20_homoAdapt100_coco_f1/predictions/val
Traceback (most recent call last):
  File "train4.py", line 144, in <module>
    args.func(config, output_dir, args)
  File "train4.py", line 71, in train_joint
    data = dataLoader(config, dataset=task, warp_input=True)
  File "E:\xiefei\pytorch-superpoint-master\utils\loader.py", line 86, in dataLoader
    worker_init_fn=worker_init_fn
  File "E:\tensorflow\anaconda3\envs\pytorch\lib\site-packages\torch\utils\data\dataloader.py", line 268, in __init__
    sampler = RandomSampler(dataset, generator=generator)
  File "E:\tensorflow\anaconda3\envs\pytorch\lib\site-packages\torch\utils\data\sampler.py", line 103, in __init__
    "value, but got num_samples={}".format(self.num_samples))
ValueError: num_samples should be a positive integer value, but got num_samples=0
Hello, could you please tell me where you downloaded the pseudo ground truth labels? I can't open the link provided in the README: https://drive.google.com/drive/folders/1nnn0UbNMFF45nov90PJNnubDyinm2f26?usp=sharing