sparse-to-dense.pytorch
ICRA 2018 "Sparse-to-Dense: Depth Prediction from Sparse Depth Samples and a Single Image" (PyTorch Implementation)
Hi! Besides pretrained models with RGBD modality, could you upload the pretrained model with depth-only modality? Thanks in advance.
RMS on the **NYU Depth v2** dataset using **Ours-200** is **0.230**, while RMS on the **KITTI** dataset using **Ours-200** is **3.851**. Why is the RMS difference between the two datasets so high (0.230 and...
Hi, there is no pose information in the processed NYU data. Could you share the processed NYU training data with pose information? Many thanks!
Hi, I am experimenting with your work here (which is great btw) and I noticed this repo (PyTorch) does not have a license (as opposed to the original torch version)....
Hi, the download speed (13 KB/s) is too low when I use the command `wget http://datasets.lids.mit.edu/sparse-to-dense/data/nyudepthv2.tar.gz`. How can I increase the download speed?
Hi, I have read your implementation and I have a question about how the sparse depth input is generated for the NYU dataset. You generate the sparse input when the ground...
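For context, the paper describes drawing the sparse input uniformly at random from the valid pixels of the ground-truth depth map. A minimal sketch of that sampling (the function and variable names here are my own, not the repo's exact code):

```python
import numpy as np

def dense_to_sparse(depth_gt, num_samples):
    """Randomly keep roughly `num_samples` valid pixels of a dense depth map."""
    valid = depth_gt > 0  # pixels that have a ground-truth reading
    prob = float(num_samples) / max(valid.sum(), 1)
    mask = np.random.uniform(size=depth_gt.shape) < prob
    sparse = np.zeros_like(depth_gt)
    keep = valid & mask
    sparse[keep] = depth_gt[keep]
    return sparse
```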
I would like to know how the loss is calculated for the KITTI dataset, since the depth information comes from LiDAR and is sparse (18k/208k pixels in the paper). Does it only calculate loss...
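For reference, with sparse LiDAR ground truth the loss is typically computed only over pixels that have a valid depth reading. A sketch of such a masked L2 loss (my own illustration, not the repo's exact implementation):

```python
import torch

def masked_mse_loss(pred, target):
    """MSE over pixels with a valid (non-zero) LiDAR depth only."""
    valid = target > 0          # invalid pixels are stored as 0
    diff = (pred - target)[valid]
    return (diff ** 2).mean()   # assumes at least one valid pixel
```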
Hello everyone, I want to implement monocular SLAM. Using only RGB images, can I use the same method? Where would the sparse depth come from?
When creating the train loader, every worker is initialized with its ID as a random seed:

```python
if not args.evaluate:
    train_loader = torch.utils.data.DataLoader(
        train_dataset, batch_size=args.batch_size, shuffle=True,
        num_workers=args.workers, pin_memory=True, sampler=None,
        worker_init_fn=lambda work_id: np.random.seed(work_id))
```
...
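Because the worker ID is constant, each worker replays the same NumPy random stream every epoch, so randomized steps such as sparse-depth sampling repeat across epochs. A common workaround (a sketch of my own, not code from this repo) is to derive the NumPy seed from `torch.initial_seed()`, which PyTorch varies per epoch:

```python
import numpy as np
import torch

def worker_init_fn(worker_id):
    # Inside a DataLoader worker, torch.initial_seed() already combines
    # the loader's per-epoch base seed with the worker id, so this gives
    # each worker a fresh NumPy stream every epoch.
    np.random.seed(torch.initial_seed() % 2**32)
```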
Functions from `scipy.misc` are used in dataloaders/transforms.py. `scipy.misc` has since been removed, so I looked at `skimage.transform.resize` as a replacement, but when I...
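For anyone hitting the same problem: `scipy.misc.imresize` was removed in SciPy 1.3. A rough drop-in using `skimage.transform.resize` might look like the sketch below (my own approximation; note that `imresize` returned `uint8`, whereas skimage returns floats in [0, 1] unless `preserve_range=True` is passed, and for depth maps you would keep floats and skip the `uint8` cast):

```python
import numpy as np
from skimage.transform import resize

def imresize_uint8(img, output_shape, interp='bilinear'):
    """Approximate scipy.misc.imresize; output_shape is (height, width)."""
    order = {'nearest': 0, 'bilinear': 1, 'bicubic': 3}[interp]
    out = resize(img, output_shape, order=order,
                 preserve_range=True, anti_aliasing=False)
    return out.astype(np.uint8)
```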