crownconv360depth
360° Depth Estimation from Multiple Fisheye Images with Origami Crown Representation of Icosahedron (IROS2020)
IcoSweepNet using CrownConv
PyTorch implementation of our IROS 2020 paper 360° Depth Estimation from Multiple Fisheye Images with Origami Crown Representation of Icosahedron. The preprint is available on arXiv.
Publication
Ren Komatsu, Hiromitsu Fujii, Yusuke Tamura, Atsushi Yamashita and Hajime Asama, "360° Depth Estimation from Multiple Fisheye Images with Origami Crown Representation of Icosahedron", Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS2020), 2020.
Installation
We recommend using conda to install the dependencies.
First, run the following command to create a virtual environment with the required packages.
conda env create -f environment.yml
# enter virtual env
conda activate crownconv
Next, install PyTorch according to your CUDA version. If you are using CUDA 9.2, run the following command:
conda install pytorch torchvision cudatoolkit=9.2 -c pytorch
You also need an additional library to undistort the fisheye images; install it by running the following command:
pip install git+https://github.com/matsuren/ocamcalib_undistort.git
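Optionally, you can verify that PyTorch was installed with GPU support. This is just a quick sanity check, not part of the repository:
python -c "import torch; print(torch.__version__); print('CUDA available:', torch.cuda.is_available())"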
Dataset
Please download the datasets from the Omnidirectional Stereo Dataset.
We use OmniThings for training and OmniHouse for evaluation.
:exclamation:Attention:exclamation:
For some reason, some filenames in OmniThings are inconsistent. For instance, the first image is named 00001.png in cam1, but 0001.png in cam2, cam3, and cam4. Please rename 0001.png, 0002.png, and 0003.png so that they have five-digit numbers (a small rename sketch is shown below).
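The renaming can also be scripted. The following is a minimal sketch (not part of the repository), assuming the OmniThings images are extracted into per-camera folders cam2, cam3, and cam4 under $DATASETS/omnithings:
import os
from pathlib import Path
# Hypothetical dataset root; adjust to wherever OmniThings is extracted.
root = Path(os.environ.get("DATASETS", ".")) / "omnithings"
for cam in ("cam2", "cam3", "cam4"):
    cam_dir = root / cam
    if not cam_dir.is_dir():
        continue
    for png in sorted(cam_dir.glob("*.png")):
        stem = png.stem
        # Zero-pad short numeric names (e.g. 0001.png) to five digits (00001.png).
        if stem.isdigit() and len(stem) < 5:
            png.rename(cam_dir / f"{int(stem):05d}.png")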
Training
python train.py $DATASETS/omnithings
Type python train.py -h to display other available options.
Evaluation
One of the pretrained models is available here.
python evaluation.py $DATASETS/omnihouse checkpoints/checkpoints_{i}.pth --save_depth
Type python evaluation.py -h to display other available options.
Depth visualization
The depth is estimated on the icosahedron, so the results cannot be viewed directly as a depth map. The easiest way to visualize them is to convert the depths on the icosahedron into an equirectangular image, which can be done by executing the following command.
python visualize_depth.py DEPTH_FILE(.npy)
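If you only want a quick look at a saved depth file before converting it, something like the following works, assuming the --save_depth option stores the per-vertex icosahedral depths as a plain NumPy array:
import sys
import numpy as np
# Load the saved icosahedral depth values and report basic statistics.
depth = np.load(sys.argv[1])
print("shape:", depth.shape)
print("min/max depth:", depth.min(), depth.max())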