MiDaS

Code for robust monocular depth estimation described in "Ranftl et al., Towards Robust Monocular Depth Estimation: Mixing Datasets for Zero-shot Cross-dataset Transfer, TPAMI 2022"

160 MiDaS issues

I'm using [this](https://tfhub.dev/intel/lite-model/midas/v2_1_small/1/lite/1) tflite model and running it on my PC using this script ``` import cv2 import tensorflow as tf import urllib.request import matplotlib.pyplot as plt import numpy as...
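For reference, a minimal sketch of driving that tflite model with the TensorFlow Lite interpreter might look like the following; the file name `model_opt.tflite`, the input image path, and the plain [0, 1] preprocessing are assumptions, not the poster's actual script, and the model card's normalization may differ.

```python
import cv2
import numpy as np
import tensorflow as tf

# Load the MiDaS v2.1 small tflite model (file name assumed).
interpreter = tf.lite.Interpreter(model_path="model_opt.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Read an image, convert BGR -> RGB, resize to the model's input size, scale to [0, 1].
img = cv2.imread("input.jpg")
img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB).astype(np.float32) / 255.0
h, w = input_details[0]["shape"][1:3]
img = cv2.resize(img, (w, h))

# Run inference and fetch the predicted (relative inverse) depth map.
interpreter.set_tensor(input_details[0]["index"], img[np.newaxis, ...])
interpreter.invoke()
depth = interpreter.get_tensor(output_details[0]["index"]).squeeze()

# Min-max normalize for display.
depth_vis = (depth - depth.min()) / (depth.max() - depth.min() + 1e-8)
cv2.imwrite("depth.png", (depth_vis * 255).astype(np.uint8))
```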

Hi, I am new to MiDaS. Can I ask what the depth range of the predicted depth map is? Is it [0, 1]?

Hello guys, great work! The models are extremely useful! Quick question: is it possible to reduce the size of the small model even further with static quantization? Thank you!
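A hedged sketch of what static (post-training) quantization with the TensorFlow Lite converter could look like; the SavedModel path, input size, and the random calibration data are placeholders, not something provided by this repository.

```python
import numpy as np
import tensorflow as tf

def representative_dataset():
    # Yield sample inputs so the converter can calibrate activation ranges;
    # real preprocessed images should replace this random placeholder data.
    for _ in range(100):
        yield [np.random.rand(1, 256, 256, 3).astype(np.float32)]

# Path to a SavedModel export of the small model (assumed).
converter = tf.lite.TFLiteConverter.from_saved_model("midas_small_saved_model")
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]

tflite_quant_model = converter.convert()
with open("midas_small_int8.tflite", "wb") as f:
    f.write(tflite_quant_model)
```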

Hello, has anyone successfully converted the small model to Caffe2 1.6? I tried PyTorch -> ONNX -> Caffe2, but I get an error: > convert-onnx-to-caffe2 midas.onnx --output midas_predict.pb --init-net-output midas_init.pb...
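For context, a minimal sketch of the first step of that pipeline, exporting the small model to ONNX via torch.hub; the opset version and dummy input size are assumptions, and the failing Caffe2 conversion itself is not shown here.

```python
import torch

# Load the small model from the published torch.hub entry point.
model = torch.hub.load("intel-isl/MiDaS", "MiDaS_small")
model.eval()

# Export with a dummy input; opset 11 and 256x256 are assumed values.
dummy = torch.randn(1, 3, 256, 256)
torch.onnx.export(model, dummy, "midas.onnx", opset_version=11,
                  input_names=["input"], output_names=["depth"])
```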

I would like to use the small model for inference on CPUs with the OpenCV DNN module. The larger ONNX model works without any problems, but the smaller one throws...
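For comparison, a hedged sketch of running an ONNX export with OpenCV's DNN module; the file name `model-small.onnx`, the 256x256 input size, and the ImageNet-style normalization are assumptions, not values confirmed in this issue.

```python
import cv2
import numpy as np

net = cv2.dnn.readNetFromONNX("model-small.onnx")

# BGR -> RGB, scale to [0, 1], resize, then standardize (normalization assumed).
img = cv2.cvtColor(cv2.imread("input.jpg"), cv2.COLOR_BGR2RGB).astype(np.float32) / 255.0
img = cv2.resize(img, (256, 256))
img = (img - np.array([0.485, 0.456, 0.406], dtype=np.float32)) \
      / np.array([0.229, 0.224, 0.225], dtype=np.float32)

# blobFromImage here only repacks the already-normalized image into NCHW layout.
blob = cv2.dnn.blobFromImage(img, size=(256, 256))
net.setInput(blob)
depth = net.forward().squeeze()

# Resize back to the original resolution and normalize for visualization.
orig = cv2.imread("input.jpg")
depth = cv2.resize(depth, (orig.shape[1], orig.shape[0]))
depth_vis = (depth - depth.min()) / (depth.max() - depth.min() + 1e-8)
cv2.imwrite("depth.png", (depth_vis * 255).astype(np.uint8))
```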

Good morning, congratulations on your work and thank you for sharing your results. I am currently trying to work with the pretrained models in a TensorFlow application...

I want to train the model on my own dataset using your scale- and shift-invariant loss, but the ground-truth depth is sparse (about 40% of pixels are valid), will...
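One way this is commonly handled is to restrict both the least-squares alignment and the error term to valid pixels via a mask. Below is a hedged, simplified re-implementation of such a masked scale- and shift-invariant loss in PyTorch; it is not the repository's training code (which is not released here) and omits the paper's gradient-matching term.

```python
import torch

def masked_ssi_loss(prediction, target, mask):
    """prediction, target: (B, H, W) inverse-depth maps; mask: (B, H, W),
    nonzero where ground truth is valid (e.g. the ~40% valid pixels)."""
    mask = mask.float()
    B = prediction.shape[0]
    pred = prediction.reshape(B, -1)
    tgt = target.reshape(B, -1)
    m = mask.reshape(B, -1)

    # Closed-form per-image least squares for scale s and shift t over valid pixels:
    # minimize sum_i m_i * (s * pred_i + t - tgt_i)^2
    n = m.sum(dim=1).clamp(min=1.0)
    sum_p = (m * pred).sum(dim=1)
    sum_t = (m * tgt).sum(dim=1)
    sum_pp = (m * pred * pred).sum(dim=1)
    sum_pt = (m * pred * tgt).sum(dim=1)
    det = (n * sum_pp - sum_p ** 2).clamp(min=1e-8)
    s = (n * sum_pt - sum_p * sum_t) / det
    t = (sum_t - s * sum_p) / n

    # Mean absolute error over valid pixels only, after alignment.
    aligned = s.unsqueeze(1) * pred + t.unsqueeze(1)
    return ((aligned - tgt).abs() * m).sum() / m.sum().clamp(min=1.0)
```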

The results are accurate with the large models, but with the TFLite model the predicted region is often a little smaller than the edges of the object. Assuming a sphere with a...

Is it possible to know (or approximate somehow) the normalisation curve of the depth map MiDaS estimates?
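Related to this, since the output is relative inverse depth defined only up to an unknown scale and shift, one common approximation is to fit that scale and shift against a few known metric depths. A hedged sketch (the sample values below are placeholders, not repository data):

```python
import numpy as np

pred = np.array([12.0, 30.0, 55.0])        # MiDaS output at a few pixels (assumed values)
metric_depth = np.array([4.0, 2.0, 1.0])   # known distances in meters (assumed values)

# Fit s, t by least squares in inverse-depth space: s * pred + t ~= 1 / metric_depth.
A = np.stack([pred, np.ones_like(pred)], axis=1)
(s, t), *_ = np.linalg.lstsq(A, 1.0 / metric_depth, rcond=None)

# Approximate metric depth everywhere via the fitted mapping.
approx_depth = 1.0 / (s * pred + t)
print(s, t, approx_depth)
```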

The app module directory is missing `org.tensorflow.lite.examples.classification.env`. After `./gradlew iD`, I cannot find the app on the mobile phone. How can I fix this?