HorizonNet
How to estimate room dimensions from reconstructed layout?
@sunset1995, thank you for the great work.
Is it possible to estimate the dimensions of the room, either from panorama or reconstruction, using known camera parameters, such as camera height or focal length? I would like to know how HorizonNet can recover the actual dimensions of the room in terms of height x width of the walls, for example. Thank you.
Please check layout_viewer.py, which reads the HorizonNet prediction from inference.py in image coordinates and then converts the layout corners to 3D space.
Line 207 contains the layout's xy coordinates.
The variables floor_z and ceil_z on lines 206 and 210 contain the floor/ceiling heights shared by all xy coordinates.
Please note that all xyz values are recovered only up to scale, which means you can multiply all of them by a constant.
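Since everything is up to scale, a known camera height fixes the metric scale: the camera-to-floor distance in the reconstruction must equal the physical camera height. A minimal sketch (the function name `rescale_layout` and the 1.6 m height are illustrative assumptions, not part of the repo):

```python
import numpy as np

def rescale_layout(floor_xy, floor_z, ceil_z, camera_height_m=1.6):
    """Rescale an up-to-scale layout so distances are in meters.

    floor_xy : (N, 2) floor-corner XY positions, arbitrary units
    floor_z  : signed camera-to-floor distance, same arbitrary units
    ceil_z   : signed camera-to-ceiling distance, same units
    """
    scale = camera_height_m / abs(floor_z)  # meters per arbitrary unit
    return floor_xy * scale, floor_z * scale, ceil_z * scale

# Example: camera sits 1.6 units above the floor, ceiling 1.2 units above camera
floor_xy = np.array([[2.0, 1.5], [-2.0, 1.5], [-2.0, -1.5], [2.0, -1.5]])
xy_m, fz_m, cz_m = rescale_layout(floor_xy, floor_z=-1.6, ceil_z=1.2)
room_height = cz_m - fz_m  # wall height in meters
```

With the metric xy corners, wall widths follow from distances between consecutive corners.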
@sunset1995, thank you for the prompt reply. This is all great information. I'm trying to understand the equations on Lines 207-210. Could you share the descriptions of those equations? Are they documented in your paper or appendix?
cor_id holds the positions of the corners on the panorama image, in the following format:
```
x_0 y_ceiling_0
x_0 y_floor_0
x_1 y_ceiling_1
x_1 y_floor_1
...
```
So `cor_id[1::2]` gives you all the floor corners.
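The interleaving above can be demonstrated with a small array (the pixel values here are made up for illustration):

```python
import numpy as np

# Hypothetical cor_id in the interleaved format: even rows are ceiling
# corners, odd rows are floor corners, as (x, y) pixel coordinates.
cor_id = np.array([
    [120,  80], [120, 400],   # corner 0: ceiling, floor
    [500,  70], [500, 420],   # corner 1: ceiling, floor
    [880,  90], [880, 390],   # corner 2: ceiling, floor
])

ceil_corners  = cor_id[0::2]  # every other row starting at 0 -> ceiling
floor_corners = cor_id[1::2]  # every other row starting at 1 -> floor
```

Each ceiling/floor pair shares the same x, since a wall-wall corner is a vertical line in the panorama.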
np_coor2xy projects corners on the image onto an imaginary floor plane whose distance to the camera is floor_z in 3D space.
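The geometry behind that projection can be sketched as follows, assuming the usual equirectangular convention (column maps to longitude, row to latitude); `coor2xy` here is an illustrative re-derivation, not the repo's exact np_coor2xy:

```python
import numpy as np

def coor2xy(coor, z, W, H):
    """Project equirectangular image points onto a horizontal plane.

    coor : (N, 2) pixel coordinates (u, v) on a W x H panorama
    z    : signed height of the plane relative to the camera
    Returns (N, 2) XY positions on the plane, camera at the origin.
    """
    lon = (coor[:, 0] / W - 0.5) * 2 * np.pi  # column -> longitude
    lat = (0.5 - coor[:, 1] / H) * np.pi      # row -> latitude
    r = z / np.tan(lat)                       # horizontal distance to plane
    return np.stack([r * np.cos(lon), r * np.sin(lon)], axis=1)
```

A pixel below the horizon has negative latitude, so with a negative floor_z the distance r comes out positive, as expected.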
With the 3D XYZ positions of all the floor corners, the corresponding ceiling corners share the same 3D XY positions as the floor corners but have a different Z.
So lines 208-210 solve for the remaining unknown ceiling Z position (in 3D space) from the known cor_id[0::2] (ceiling corners on the image) and their corresponding XY coordinates in 3D.
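That last step can be sketched as a small helper (`solve_ceil_z` is a hypothetical name; the repo solves the same quantity inline): each ceiling pixel's latitude gives z = r * tan(lat), where r is the horizontal distance of the matching floor corner from the camera.

```python
import numpy as np

def solve_ceil_z(ceil_coor, floor_xy, H):
    """Solve the shared ceiling height from ceiling pixels + floor XY.

    ceil_coor : (N, 2) ceiling-corner pixels on an equirect panorama of height H
    floor_xy  : (N, 2) matching floor-corner XY positions in 3D
    """
    lat = (0.5 - ceil_coor[:, 1] / H) * np.pi  # image row -> latitude
    r = np.linalg.norm(floor_xy, axis=1)       # shared horizontal radius
    return float(np.mean(r * np.tan(lat)))     # average over all corners
```

Averaging over corners gives a single ceil_z, consistent with the flat-ceiling assumption.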
Thank you @sunset1995