ObjectDatasetTools
How to check the generated ground truth and mesh
@F2Wang I was trying to project the generated ground truth back, but somehow it is not correct. Could you please tell me how I can project the ground truth back onto the RGB image to check whether it was generated correctly? Or, if you have any other idea for cross-checking, could you please share it?
I am trying to generate a real dataset and use it to train DenseFusion.
Hi, this should be automatically generated after running createlabelfiles; you can inspect it by running inspectMasks.py.
@F2Wang
- I tried inspectMasks.py and it works, but it uses only the masks and RGB images. I want to check whether the ground-truth files generated as transform.npy are correct. I am using cv2.projectPoints(objectPoints, rvec, tvec, cameraMatrix, distCoeffs), where objectPoints are the model points from the .ply file, rvec/tvec encode the 3×3 rotation matrix and 1×3 translation vector, cameraMatrix is the 3×3 camera intrinsic matrix, and distCoeffs is the 1×5 distortion vector, if any. I then project the image points produced by this function back onto the RGB image. Unfortunately, it works only for the first couple of frames, not for the remaining ones. I really want to know how to check whether the ground truth is correct.
- Can I use my own .ply file for my object instead, and use that .ply file to generate the masks and everything else?
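The reprojection check described above can be sketched in plain NumPy. This is a minimal pinhole model with zero distortion, which should then agree with cv2.projectPoints; the assumption that the saved pose is a 4×4 object-to-camera transform is mine, so check the actual shape of your transform.npy:

```python
import numpy as np

def project(model_points, T, K):
    """Project Nx3 model points through a 4x4 pose T and 3x3 intrinsics K.
    Assumption: T maps object coordinates to camera coordinates."""
    R, t = T[:3, :3], T[:3, 3]
    cam = model_points @ R.T + t      # object frame -> camera frame
    uv = cam @ K.T                    # apply intrinsics
    return uv[:, :2] / uv[:, 2:3]     # perspective divide -> pixel coords
```

If the projected points land on the object only in some frames, overlaying them on each RGB frame with this function is a quick way to see at which frame the stored poses start to drift.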
@sanjaysswami can this repo create the rotation and translation matrices?
Yes. The ground-truth pose is always given in terms of a rotation matrix and a translation vector.
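For reference, a sketch of splitting such a pose into R and t, assuming the label stores a 4×4 homogeneous transform (a common convention, but verify against your own .npy files):

```python
import numpy as np

def split_pose(T):
    """Split a 4x4 homogeneous transform into rotation R and translation t.
    Assumption: the saved .npy label stores the pose in this 4x4 form."""
    R, t = T[:3, :3], T[:3, 3]
    # Sanity check: a proper rotation matrix is orthonormal with det +1.
    assert np.allclose(R @ R.T, np.eye(3), atol=1e-6)
    assert np.isclose(np.linalg.det(R), 1.0, atol=1e-6)
    return R, t
```

The orthonormality check is a cheap first test: if it fails, the array you loaded is not a rotation at all (wrong slice, wrong file, or a scaled transform).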
@sanjaysswami But the values of the rotation and translation matrices in this repo are not equal to the PnP results. I tried solvePnP using the 2D image points and the 3D points, but the result is not the same as the .npy saved by this repo.
Maybe the result needs to be transposed (np.T). I transformed those results into the 8 corners that serve as the object's control points and found the result was not correct, but I finally got the correct result after transposing the result matrix.
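The fix described above can be sketched as follows. The corner construction from the mesh extents and the transposed-rotation comparison are my assumptions based on this comment; transposing R inverts the rotation, which helps when the stored pose is camera-to-object rather than object-to-camera:

```python
import numpy as np

def bbox_corners(model_points):
    """The 8 corners of the axis-aligned bounding box of Nx3 model points."""
    mn, mx = model_points.min(axis=0), model_points.max(axis=0)
    return np.array([[x, y, z]
                     for x in (mn[0], mx[0])
                     for y in (mn[1], mx[1])
                     for z in (mn[2], mx[2])])

def transform(points, R, t):
    """Apply rotation R and translation t to Nx3 points."""
    return points @ R.T + t

# If the corners land in the wrong place with the stored R, compare with
# the transposed rotation before concluding the labels are wrong:
#   corners_cam = transform(bbox_corners(pts), R.T, t)
```

Projecting both variants onto the image and seeing which one hugs the object is a quick way to determine which convention the saved matrices use.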
Hi @sanjaysswami, were you able to run DenseFusion using the dataset created via this? Also, did you make a new object dataset for it?
Thanks!
I have tried; the answer is yes. But the success rate is very low, and DenseFusion works poorly on the generated dataset.
@YanqingWu
Thank you very much for responding to my query. Have you tried running it on SingleShotPose, as suggested by the author?
Thanks again!
Sorry, I did not try SingleShotPose.