Z-Z-J
I met the same problem; have you solved it yet?
> I ran the script > I solved this problem: `sudo apt install ffmpeg`
@NuochengTian Hi, could you provide the H36M preprocessing script? Thanks!!
When 3D pose GT is taken as input, Tab. 4 reports 29.0 MPJPE and 23.0 PA-MPJPE on H36M, but Tab. 13 reports 13.9 MPJPE and 9.9 PA-MPJPE on H36M....
I attempted to train MeshNet with GT pose as input on H36M (using the SMPL skeleton). On the training set we observe MPVE: 6.3 mm and MPJPE: 11.9 mm, but in...
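For reference, here is a minimal sketch of how the MPJPE and MPVE numbers quoted above are typically computed (this is not the repository's evaluation code; array shapes and function names are assumptions for illustration):

```python
import numpy as np

def mpjpe(pred_joints, gt_joints):
    # Mean Per-Joint Position Error: average Euclidean distance between
    # predicted and ground-truth 3D joints, in the same units (e.g. mm).
    # pred_joints, gt_joints: (N, J, 3) arrays.
    return np.mean(np.linalg.norm(pred_joints - gt_joints, axis=-1))

def mpve(pred_verts, gt_verts):
    # Mean Per-Vertex Error: the same distance averaged over SMPL mesh vertices.
    # pred_verts, gt_verts: (N, V, 3) arrays.
    return np.mean(np.linalg.norm(pred_verts - gt_verts, axis=-1))
```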
Thank you for your reply! Can you provide the pre-trained model (MeshNet trained only with 3D GT)?
In the data-processing step (run.py, line 114), they remove the global offset but keep the trajectory in the first position, because previous works such as VideoPose3D need to...
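A minimal sketch of what that preprocessing step usually looks like (hypothetical function and variable names, not the actual run.py code): every joint is made root-relative so the global offset is removed, while the root's global trajectory is kept in position 0 so it can still be learned or predicted separately, as in VideoPose3D.

```python
import numpy as np

def remove_global_offset_keep_trajectory(poses):
    # poses: (T, J, 3) world-space 3D joint positions; joint 0 is the root (pelvis).
    root = poses[:, :1, :].copy()   # (T, 1, 3) global trajectory of the root
    rel = poses - root              # remove global offset: all joints root-relative
    # Keep the global trajectory in the first position instead of zeroing it out.
    rel[:, 0, :] = root[:, 0, :]
    return rel
```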
> > > > Hello and thanks for the answer. Do you also know why they aren't excluding this output completely? Wouldn't it be helpful to estimate the 3D keypoints...