PENet_ICRA2021

How to convert .bin LiDAR files to .png

Open ar99-dev opened this issue 3 years ago • 13 comments

I want to run inference with your model on KITTI .bin files, but your model takes .png files as input. Could you provide the code for this conversion? Thanks in advance.

ar99-dev avatar Jun 16 '21 01:06 ar99-dev

Thanks for your interest! In this project we don't directly take .bin files as input. However, I have a naive and incomplete Python script that converts the .bin files to .png via direct projection. Note that the depth maps projected from LiDAR point clouds are not exactly the same as those in the KITTI depth completion dataset, because Uhrig et al. [3DV 2017] apply ego-motion compensation. I hope this script helps. bin2png.txt

JUGGHM avatar Jun 16 '21 02:06 JUGGHM
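
For readers without access to the attachment, the sketch below illustrates the kind of direct projection described above. It is not the author's bin2png.txt: the function name and arguments are illustrative, and it assumes the calibration matrices (T_velo_to_cam, R_rect, P_rect) have already been loaded from the KITTI raw calibration files (calib_velo_to_cam.txt and calib_cam_to_cam.txt).

```python
import numpy as np
from PIL import Image

def project_bin_to_depth_png(bin_path, png_path, T_velo_to_cam, R_rect, P_rect,
                             img_h, img_w):
    # KITTI .bin: N x 4 float32 (x, y, z, reflectance)
    points = np.fromfile(bin_path, dtype=np.float32).reshape(-1, 4)
    points[:, 3] = 1.0                          # homogeneous coordinates

    # velodyne frame -> rectified camera frame (4 x N)
    cam = R_rect @ T_velo_to_cam @ points.T
    cam = cam[:, cam[2, :] > 0]                 # keep points in front of the camera

    # rectified camera frame -> image plane (3 x N)
    pix = P_rect @ cam
    u = np.round(pix[0, :] / pix[2, :]).astype(int)
    v = np.round(pix[1, :] / pix[2, :]).astype(int)
    depth = cam[2, :]

    # keep projections that fall inside the image
    valid = (u >= 0) & (u < img_w) & (v >= 0) & (v < img_h)
    u, v, depth = u[valid], v[valid], depth[valid]

    # sparse depth map; the KITTI depth completion format stores depth * 256 as uint16 png
    pro_depth = np.zeros((img_h, img_w), dtype=np.float32)
    pro_depth[v, u] = depth
    Image.fromarray((pro_depth * 256.0).astype(np.uint16)).save(png_path)
```

Note that several LiDAR points can land on the same pixel; a more careful implementation would keep the nearest one, whereas this sketch simply overwrites.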

Hi @JUGGHM, I tried the code you provided for converting .bin to .png, but I have a doubt: why does it still need to read depth from a .png? Isn't the code supposed to read depth from the .bin file, project it into camera coordinates, and save that projection as a .png? I would like to convert .bin to .png for depth completion on a custom dataset. Thank you very much.

Laihu08 avatar Jul 19 '21 03:07 Laihu08

I only used the depth file to check whether the projected result is correct; you don't actually need to read any depth maps. This script is not used in PENet, but I think you can modify this incomplete script for your own use. The pro_depth variable is the projected sparse depth as a numpy array, which I think is what you want. You would need to initialize it manually rather than from #pro_depth = np.zeros_like(dc_depth).

JUGGHM avatar Jul 19 '21 11:07 JUGGHM
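
If it helps, one way to follow this suggestion is to size pro_depth from the RGB image instead of from the depth-completion .png. The image path below is a placeholder; dc_depth and pro_depth are the variable names mentioned in this thread.

```python
import numpy as np
from PIL import Image

rgb = Image.open("path/to/your/image.png")              # placeholder path
img_w, img_h = rgb.size
pro_depth = np.zeros((img_h, img_w), dtype=np.float32)  # replaces np.zeros_like(dc_depth)
```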

Yes, exactly. I currently have depth maps in .npy format for 3D object detection, along with the corresponding images. Can I use these .npy files with your code to estimate dense depth maps? I am a little confused about the calibration files, because the calibration files of the depth completion dataset are different from those of the 3D object detection dataset. Can you help me solve this problem?

Laihu08 avatar Jul 19 '21 11:07 Laihu08

In my script I used the calibration matrices of the KITTI raw dataset, and I am not very familiar with the 3D object detection task.

JUGGHM avatar Jul 19 '21 12:07 JUGGHM
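
For the 3D object detection split, the per-frame calibration is stored in a single text file (with P2, R0_rect, and Tr_velo_to_cam entries) rather than in the raw dataset's calib_cam_to_cam.txt and calib_velo_to_cam.txt. Below is a hedged sketch of assembling the same three matrices from such a file; the helper name and file path are illustrative and this is not part of the author's script.

```python
import numpy as np

def load_object_calib(calib_path):
    # parse a KITTI object detection calib file, e.g. training/calib/000000.txt
    mats = {}
    with open(calib_path) as f:
        for line in f:
            if ":" not in line:
                continue
            key, vals = line.split(":", 1)
            mats[key.strip()] = np.array(vals.split(), dtype=np.float32)

    P2 = mats["P2"].reshape(3, 4)                   # camera 2 projection matrix

    R_rect = np.eye(4, dtype=np.float32)            # pad R0_rect to 4 x 4
    R_rect[:3, :3] = mats["R0_rect"].reshape(3, 3)

    T_velo_to_cam = np.eye(4, dtype=np.float32)     # pad Tr_velo_to_cam to 4 x 4
    T_velo_to_cam[:3, :4] = mats["Tr_velo_to_cam"].reshape(3, 4)

    return P2, R_rect, T_velo_to_cam
```

These matrices play the same roles as the raw-dataset ones, so they could be passed straight into the projection sketch shown earlier in this thread.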

Oh okay, thank you very much for your quick replies. Can we change the hyperparameter values in the loss function while training? It mentions (0, 0, 0), and I am curious whether it will still work if I change them.

Laihu08 avatar Jul 19 '21 12:07 Laihu08

It will work, but I think the performance will change. These values correspond to the intermediate supervision described in the implementation details of the paper.

JUGGHM avatar Jul 19 '21 12:07 JUGGHM

I reviewed the code and found that the 0, 0, 0 initialization appears only in the loss terms; it is just there to create the variables.

JUGGHM avatar Jul 19 '21 12:07 JUGGHM

Oh, maybe I misunderstood the logic there. Then which values did you assign as the hyperparameters of your loss function? Thank you!

Laihu08 avatar Jul 19 '21 12:07 Laihu08

w_st1 and w_st2

JUGGHM avatar Jul 19 '21 12:07 JUGGHM

Can you explain more about these hyperparameters? Thank you.

Laihu08 avatar Jul 19 '21 13:07 Laihu08

As mentioned above, changing them will work, but I think the performance will change. They correspond to the intermediate supervision described in the implementation details of the paper.

JUGGHM avatar Jul 19 '21 13:07 JUGGHM
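
To make the role of these weights concrete, here is a minimal sketch of how intermediate-supervision weights such as w_st1 and w_st2 are typically combined with the loss on the final refined depth. The function, tensor shapes, and weight values are illustrative and do not reproduce the exact schedule used in the PENet training code.

```python
import torch
import torch.nn as nn

def combined_loss(criterion, st1_pred, st2_pred, final_pred, gt, w_st1, w_st2):
    # the two intermediate predictions get their own weighted loss terms,
    # added to the loss on the final refined depth
    return (w_st1 * criterion(st1_pred, gt)
            + w_st2 * criterion(st2_pred, gt)
            + criterion(final_pred, gt))

# illustrative usage with made-up weights and random tensors
criterion = nn.MSELoss()
st1, st2, final, gt = [torch.rand(1, 1, 352, 1216) for _ in range(4)]
loss = combined_loss(criterion, st1, st2, final, gt, w_st1=0.2, w_st2=0.2)
```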

@Laihu08 Have you solved this problem using the author's script? I want to convert the KITTI object dataset into depth maps too.

Senwang98 avatar Jun 22 '22 15:06 Senwang98