
How to check the generated ground truth and mesh

sanjaysswami opened this issue 6 years ago · 10 comments

@F2Wang I was trying to project the generated ground truth back onto the image, but somehow it is not correct. Can you please tell me how I can project the ground truth back onto the RGB image and check whether it was generated correctly? Or, if you have any other idea for cross-checking it, could you please share it?

I am trying to generate a real dataset and use it to train DenseFusion.

sanjaysswami avatar Sep 26 '19 11:09 sanjaysswami

Hi, this should be generated automatically after running createlabelfiles; you can inspect it by running inspectMasks.py.

F2Wang avatar Sep 26 '19 15:09 F2Wang

@F2Wang

  1. I tried inspect.py and it works, but it only uses the mask and RGB images. I wanted to check whether the ground truth files generated as transform.npy are correct. I am using cv2.projectPoints(U, V, X, Y, Z), where U: model points from the .ply file, V: 3×3 rotation matrix, X: 1×3 translation vector, Y: 3×3 camera intrinsic matrix, and Z: 1×5 distortion coefficients, if any. Then I project the image points returned by this function back onto the RGB image. Unfortunately, it works only for the first couple of frames, not for the rest. I really want to know how I can check whether the ground truth is correct or not (see the projection sketch after this list).

  2. Can I instead use my own .ply file for my object, and use this .ply file to generate the mask and everything else?
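A minimal sketch of the projection check described in item 1, assuming a 4×4 object-to-camera pose saved as transforms/0.npy, a registered mesh named registeredScene.ply, frames under JPEGImages/, and placeholder intrinsics; all of these names and values are assumptions and may differ from your setup. Note that cv2.projectPoints expects a rotation vector, so the 3×3 rotation matrix is converted with cv2.Rodrigues first.

```python
import numpy as np
import cv2
import open3d as o3d

# Load the model vertices (N x 3) from the registered .ply (file name assumed)
mesh = o3d.io.read_triangle_mesh("registeredScene.ply")
model_points = np.asarray(mesh.vertices, dtype=np.float64)

# Assumed 4x4 object-to-camera transform saved for this frame
T = np.load("transforms/0.npy")
R, t = T[:3, :3], T[:3, 3]

# Placeholder camera intrinsics -- substitute your own calibration
K = np.array([[615.0, 0.0, 320.0],
              [0.0, 615.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # no distortion assumed

# cv2.projectPoints expects a rotation *vector*, so convert the 3x3 matrix
rvec, _ = cv2.Rodrigues(R)
img_pts, _ = cv2.projectPoints(model_points, rvec, t, K, dist)

# Overlay the projected points on the RGB frame and save for visual inspection
img = cv2.imread("JPEGImages/0.jpg")
for u, v in img_pts.reshape(-1, 2).astype(int):
    if 0 <= u < img.shape[1] and 0 <= v < img.shape[0]:
        cv2.circle(img, (int(u), int(v)), 1, (0, 255, 0), -1)
cv2.imwrite("projection_check_0.jpg", img)
```

If the saved pose is correct, the green points should lie on the object in every frame, not just the first few.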

sanjaysswami avatar Oct 01 '19 14:10 sanjaysswami

@sanjaysswami can this repo create the rotation and translation matrices?

YanqingWu avatar Oct 08 '19 02:10 YanqingWu

@sanjaysswami can this repo create the rotation and translation matrices?

Yes. The ground truth pose is always given in terms of a rotation matrix and a translation vector.

sanjaysswami avatar Oct 08 '19 08:10 sanjaysswami

@sanjaysswami But the rotation and translation matrices in this repo are not equal to the PnP results. I tried solvePnP using the 2D points in the image and the corresponding 3D points, but the result is not the same as the .npy saved by this repo.

ghoshaw avatar Nov 15 '19 05:11 ghoshaw

@sanjaysswami But the rotation and translation matrices in this repo are not equal to the PnP results. I tried solvePnP using the 2D points in the image and the corresponding 3D points, but the result is not the same as the .npy saved by this repo.

Maybe the result needs to be transposed (np.T). I projected those results onto the 8 corners, which are the object's control points, and found the result was not correct, but I finally got the correct result after transposing the result matrix.
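A sketch of the transpose check described above: project the 8 bounding-box corners with the saved rotation as-is and with its transpose, then overlay both on the frame to see which one actually lands on the object. The paths, bounding-box extents, and intrinsics below are assumptions for illustration only.

```python
import numpy as np
import cv2

T = np.load("transforms/0.npy")           # assumed 4x4 object-to-camera pose
R, t = T[:3, :3], T[:3, 3]

# Axis-aligned bounding-box corners in model coordinates (extents made up here)
mins, maxs = np.array([-0.05, -0.05, 0.0]), np.array([0.05, 0.05, 0.1])
corners = np.array([[x, y, z] for x in (mins[0], maxs[0])
                              for y in (mins[1], maxs[1])
                              for z in (mins[2], maxs[2])])

K = np.array([[615.0, 0.0, 320.0],        # substitute your calibrated intrinsics
              [0.0, 615.0, 240.0],
              [0.0, 0.0, 1.0]])

img = cv2.imread("JPEGImages/0.jpg")
for label, rot, color in (("R", R, (0, 255, 0)), ("R.T", R.T, (0, 0, 255))):
    rvec, _ = cv2.Rodrigues(rot)          # projectPoints needs a rotation vector
    pts, _ = cv2.projectPoints(corners, rvec, t, K, np.zeros(5))
    for u, v in pts.reshape(-1, 2).astype(int):
        cv2.circle(img, (int(u), int(v)), 4, color, -1)
cv2.imwrite("corner_check_0.jpg", img)    # green = R, red = R.T
```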

zhangxiaodi avatar Jan 09 '20 10:01 zhangxiaodi

Hi @sanjaysswami, were you able to run DenseFusion using the dataset created with this tool? Also, did you make a new object dataset for it?

Thanks!

nihar0602 avatar Apr 10 '21 23:04 nihar0602

Hi @sanjaysswami, were you able to run DenseFusion using the dataset created with this tool? Also, did you make a new object dataset for it?

Thanks!

I have tried; the answer is yes, but the success rate is very low, and DenseFusion performs poorly on the generated dataset.

YanqingWu avatar Apr 16 '21 04:04 YanqingWu

@YanqingWu

Thank you very much for responding to my query. Have you tried running it on SingleShotPose, as suggested by the author?

Thanks again!

nihar0602 avatar Apr 22 '21 14:04 nihar0602

@YanqingWu

Thank you very much for responding to my query. Have you tried running it on SingleShotPose, as suggested by the author?

Thanks again!

Sorry, I did not try SingleShotPose.

YanqingWu avatar Apr 23 '21 01:04 YanqingWu