
How to get predicted target point from result and source point

Open binhmuc opened this issue 5 years ago • 9 comments

Thanks for your source code! Could you tell me how to get the target points from the pretrained model and the source points? I looked at "eval_pf.py", but it looks like you get the source points from the target points...

binhmuc avatar Jul 01 '19 17:07 binhmuc

Also, your paper says "A keypoint is considered to be matched correctly if its predicted location is within a distance of α · max(h, w) of the target keypoint position". So I don't understand why your code compares against the source points instead of the target points.

binhmuc avatar Jul 01 '19 17:07 binhmuc

Hi, this is just a matter of terminology. For consistency with the ProposalFlow paper, which uses inverse warping, we also transform the points from the second image (target) to the first image (source). In the paper we explain it the opposite way because it's more natural. But remember, "source" and "target" are just two names. Sorry for the confusion.
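
To make the convention concrete, here is a minimal sketch (not the repo's exact API; the function and variable names below are made up) of applying an estimated 2x3 affine theta to keypoints, which in the code's convention maps points annotated in the target image into the source image:

```python
import torch

def aff_transform_points(theta, points):
    """theta: (B, 2, 3) affine in normalized [-1, 1] coordinates,
    points: (B, 2, N) point coordinates -> (B, 2, N) transformed points."""
    B, _, N = points.shape
    ones = torch.ones(B, 1, N)
    points_h = torch.cat([points, ones], dim=1)  # homogeneous coordinates, (B, 3, N)
    return torch.bmm(theta, points_h)            # (B, 2, N)

# Keypoints annotated in image B (target) -> predicted locations in image A (source)
theta = torch.tensor([[[1.0, 0.0, 0.1],
                       [0.0, 1.0, -0.2]]])       # hypothetical estimate
pts_target = torch.rand(1, 2, 5) * 2 - 1         # normalized coordinates
pts_in_source = aff_transform_points(theta, pts_target)
```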

ignacio-rocco avatar Jul 01 '19 21:07 ignacio-rocco

Thanks for your reply :) So does it mean that I can just swap "source points" and "target points" in the code and get the natural result? That seems too weird to me... because in the code you warp the source images -> target images, and use the resulting theta as the inverse warping... So could you tell me how to get the target points from the source points and the resulting theta? Thank you!

binhmuc avatar Jul 02 '19 00:07 binhmuc

Please see the explanations about inverse warping here:

https://www.cs.unc.edu/~lazebnik/research/fall08/lec08_faces.pdf

this should help you understand!
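
Concretely, the inverse-warping idea can be illustrated with standard PyTorch ops (this is only an illustration, not the repo's exact code): the sampling grid built for the output, target-shaped image stores, for every target pixel, the source coordinate to read from, so the estimated theta is naturally a target -> source mapping even though the rendered result is the source image warped into the target frame.

```python
import torch
import torch.nn.functional as F

theta = torch.tensor([[[1.0, 0.0, 0.1],
                       [0.0, 1.0, -0.2]]])    # (1, 2, 3), hypothetical affine estimate
source = torch.rand(1, 3, 240, 240)           # batch with one source image

# For each pixel of the target-sized output, compute where to sample in the source.
grid = F.affine_grid(theta, size=[1, 3, 240, 240], align_corners=False)  # (1, H, W, 2)
warped = F.grid_sample(source, grid, align_corners=False)

# grid[0, y, x] is the normalized source-image location used to fill target pixel (x, y):
# exactly the target -> source direction used when evaluating keypoints in the code.
```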

ignacio-rocco avatar Jul 02 '19 09:07 ignacio-rocco

So do you understand what he means? I'm also confused about this point.

lixiaolusunshine avatar Sep 17 '19 12:09 lixiaolusunshine

@lixiaolusunshine Yes, I understood him. Clearly, the paper says the mapping goes from source points to target points, but in the source code it is exactly the inverse.

binhmuc avatar Sep 17 '19 12:09 binhmuc

So in his paper he gets the estimated inverse affine parameters from the feature regression layer, then uses this inverse mapping to warp the source image into the target image?

lixiaolusunshine avatar Sep 18 '19 01:09 lixiaolusunshine

@lixiaolusunshine Sorry, I can't quite follow your meaning. In his paper it is very clear: use the geometric matching model to estimate a set of parameters, then parameters => warp => loss. The only difference is how he compares the results. In the paper he compares against the target points, but in the code we never compute target points from the parameters, only source points.
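
In code terms, the check being discussed is roughly the following (a rough sketch; the helper and variable names are made up, not taken from eval_pf.py):

```python
import torch

def pck(pred_points, gt_points, h, w, alpha=0.1):
    """pred_points, gt_points: (2, N) pixel coordinates of predicted and
    annotated keypoints; returns the fraction matched within alpha * max(h, w)."""
    dist = torch.norm(pred_points - gt_points, dim=0)   # (N,) Euclidean distances
    return (dist <= alpha * max(h, w)).float().mean()

# In the code's convention, pred_points comes from transforming the *target*
# keypoints with theta and gt_points are the annotated *source* keypoints;
# the paper words the same comparison in the source -> target direction.
```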

binhmuc avatar Sep 18 '19 01:09 binhmuc

@binhmuc Thanks for your issue.

The link about the inverse points method is broken. If you know how to do the inverse point transformation, I would like to know. The owner of this source code does not appear to be replying at this time.

Please see the explanations about inverse warping here:

https://www.cs.unc.edu/~lazebnik/research/fall08/lec08_faces.pdf

this should help you understand!

tkrtr avatar Jan 26 '24 00:01 tkrtr
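
For anyone else hitting the broken link: assuming theta is the usual 2x3 affine in normalized coordinates that maps target points to source points, one way to get the opposite (source -> target) direction is to invert the transform. This is only a sketch, and it only applies to the affine model; a TPS transform has no simple closed-form inverse.

```python
import torch

def invert_affine(theta):
    """theta: (B, 2, 3) affine -> (B, 2, 3) inverse affine."""
    B = theta.shape[0]
    bottom = torch.tensor([0.0, 0.0, 1.0]).view(1, 1, 3).expand(B, 1, 3)
    full = torch.cat([theta, bottom], dim=1)   # (B, 3, 3) homogeneous matrices
    return torch.inverse(full)[:, :2, :]       # drop the [0, 0, 1] row again

theta = torch.tensor([[[1.0, 0.0, 0.1],
                       [0.0, 1.0, -0.2]]])
theta_inv = invert_affine(theta)
# Applying theta_inv to points in the source image gives their predicted
# locations in the target image.
```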