DiffusionDepth
Questions about dataset and results
Hi there! Thanks for the great work!
I have some questions about the datasets and my training results.
The first question is about the KITTI dataset. The README says you use the raw portion of KITTI, but on the official website the raw data is split into many per-date archives, and clicking through them one by one to download is quite tedious. Could you provide a script to support one-click downloading? I eventually downloaded all of the dates, which came to roughly 180 GB in total. Could you also tell me which parts of the KITTI raw data you actually used, so that I can download just those?
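For reference, something like the sketch below is what I had in mind. It assumes the public KITTI S3 mirror used by the official raw_data_downloader.sh and a hand-picked list of drive IDs, so please treat it as a guess rather than your intended pipeline:

```python
# Rough sketch of a batch downloader for KITTI raw drives.
# Assumptions: the public KITTI S3 mirror used by the official
# raw_data_downloader.sh, and a user-supplied list of drive IDs.
import os
import urllib.request
import zipfile

BASE_URL = "https://s3.eu-central-1.amazonaws.com/avg-kitti/raw_data"
OUT_DIR = "kitti_raw"

# Fill in the drives actually needed by the data split;
# downloading every date comes to roughly 180 GB.
DRIVES = [
    "2011_09_26_drive_0001",
    "2011_09_26_drive_0002",
]

os.makedirs(OUT_DIR, exist_ok=True)
for drive in DRIVES:
    zip_name = f"{drive}_sync.zip"
    url = f"{BASE_URL}/{drive}/{zip_name}"
    dst = os.path.join(OUT_DIR, zip_name)
    print(f"Downloading {url}")
    urllib.request.urlretrieve(url, dst)   # fetch one drive archive
    with zipfile.ZipFile(dst) as zf:
        zf.extractall(OUT_DIR)             # unpack into date/drive folders
    os.remove(dst)                         # keep only the extracted data
```

If you could share the list of drives (or a split json) you trained on, I could plug it straight into DRIVES instead of downloading everything.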
The second question is about NYU-V2. I trained on it successfully, but the final result is not very good: at epoch 20 the test RMSE is about 0.50. I would like to know whether I used the wrong command. Could you share the exact command for the NYU dataset? Mine was:
--patch_height 340 --patch_width 512 --loss 1.0*L1+1.0*L2+1.0*DDIM --epochs 30 --batch_size 16 --max_depth 10.0 --save NAME_TO_SAVE --model_name Diffusion_DCbase_ --backbone_module swin --backbone_name swin_large_naive_l4w722422k --head_specify DDIMDepthEstimate_Swin_ADDHAHI
Thanks again for open-sourcing this work!