DCPDN

Hi, can you share the train/val/test raw images, preferably on BaiduYun? As you know, I can't download from Google Drive in China. Thanks a lot.

ghost opened this issue 7 years ago • 9 comments

ghost avatar Jun 02 '18 08:06 ghost

Sorry, a BaiduYun account is hard to get without a Chinese phone number. But you can also generate the samples using 'create_train.py' (please download NYU-Depth V2 from http://horatio.cs.nyu.edu/mit/silberman/nyu_depth_v2/nyu_depth_v2_labeled.mat).
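For context: haze-synthesis pipelines like this one conventionally render hazy images from clear images and depth maps via the atmospheric scattering model I(x) = J(x)·t(x) + A·(1 − t(x)) with t(x) = exp(−β·d(x)). A minimal sketch of that model (the β and A values here are illustrative assumptions, not the values 'create_train.py' necessarily uses):

```python
import numpy as np

def synthesize_haze(clear: np.ndarray, depth: np.ndarray,
                    beta: float = 1.0, A: float = 0.8) -> np.ndarray:
    """Render a hazy image from a clear image and its depth map using
    the atmospheric scattering model:
        I(x) = J(x) * t(x) + A * (1 - t(x)),  t(x) = exp(-beta * d(x)).
    `clear` is HxWx3 in [0, 1]; `depth` is HxW (e.g. from the NYU .mat)."""
    t = np.exp(-beta * depth)[..., None]   # per-pixel transmission map
    return clear * t + A * (1.0 - t)       # blend scene radiance with airlight
```

At zero depth the transmission is 1 and the output equals the clear image; as depth grows, the pixel converges to the airlight A.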

hezhangsprinter avatar Jun 02 '18 22:06 hezhangsprinter

Yes, thanks for the quick reply!

ghost avatar Jun 03 '18 01:06 ghost

I can't download from the link below: http://horatio.cs.nyu.edu/mit/silberman/nyu_depth_v2/nyu_depth_v2_labeled.mat Is any alternative link available?

3togo avatar Aug 25 '18 08:08 3togo

> I can't download from the link below: http://horatio.cs.nyu.edu/mit/silberman/nyu_depth_v2/nyu_depth_v2_labeled.mat Is any alternative link available?

Use Xunlei.

zhaoxin111 avatar Sep 18 '18 09:09 zhaoxin111

Will it be too big for BaiduYun?

3togo avatar Sep 18 '18 09:09 3togo

> Will it be too big for BaiduYun?

2.8G

zhaoxin111 avatar Sep 18 '18 09:09 zhaoxin111

Has anyone managed to download it? Could someone upload the training set to BaiduYun? I can't download it; the connection keeps dropping, and Xunlei doesn't work either.

noobgrow avatar Dec 24 '18 15:12 noobgrow

> Has anyone managed to download it? Could someone upload the training set to BaiduYun? I can't download it; the connection keeps dropping, and Xunlei doesn't work either.

Download 'nyu_depth_v2_labeled.mat' via wget. The pre-generated training set is very large (~80 GB), but generating the training and validation sets yourself via 'create_train.py' is fast.
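Since the complaints above are about the connection dropping mid-download: `wget -c <url>` resumes a partial download instead of restarting it. The resume trick is simply to check how many bytes are already on disk and fetch only the remainder. A minimal sketch of that logic (the `resume_copy` helper is hypothetical, for illustration only):

```python
import os

def resume_copy(src_bytes: bytes, dest_path: str, chunk: int = 4) -> int:
    """Append only the bytes missing from dest_path, mimicking the
    Range-request resume behind `wget -c`. Returns bytes transferred."""
    # Existing file size tells us where the interrupted transfer stopped.
    offset = os.path.getsize(dest_path) if os.path.exists(dest_path) else 0
    with open(dest_path, "ab") as f:
        for i in range(offset, len(src_bytes), chunk):
            f.write(src_bytes[i:i + chunk])   # fetch remaining chunks only
    return len(src_bytes) - offset
```

A real download would send an HTTP `Range: bytes=<offset>-` header instead of slicing a local buffer; `wget -c` does exactly that for you.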

shawnyuen avatar Aug 21 '19 15:08 shawnyuen

Has anybody run the code successfully? I am trying to run it, but I encounter a lot of errors. Can anybody help me, or share their experience of getting it to work?

AdamJupiter avatar Mar 24 '20 15:03 AdamJupiter