epipolar-transformers
Epipolar Transformers (best paper award, CVPR 2020 workshop)
Hi, thanks for your work! When I run your code with four views, the results look good, but when I run it with two views the results look bad! I...
Hi, thanks for your awesome work. I noticed that the InterHand2.6M dataset used in the paper has 257K 2D hands in the training set; however, the latest released InterHand2.6M has 528K...
Hi. I am trying to load your provided pre-trained models "resnet50-19c8e357.pth" and "pose_resnet_4.5_pixels_human36m.pth" for testing, but it failed. 2021-07-08 10:47:24,038 checkpointer INFO: Loading checkpoint from datasets/resnet50-19c8e357.pth Traceback (most recent call...
Hi, thanks for open-sourcing the code. Could you please share the config (and checkpoint, if possible) for training on the InterHand dataset? Thank you.
I mean that if you use 3 or 4 views in the epipolar sampling, e.g. fusing features one more time with the 3rd view, you might get better results. I'm...
As the title shows, I am having great difficulty visualizing the results. Even though visualize_human.py is provided, how can I visualize the predictions generated by the tester?