ECON
How to use ECON output to train SCANimate #51
Thanks again for your work! I still have two questions about how to modify the output of ECON so that it can be used for training SCANimate.
My first question is: how can I transform the body_pose parameter of SMPL-X into the pose parameter of SMPL? I am confused about what exactly those 24 SMPL joints are; more specifically, how the 23 body joints of SMPL relate to the 21 in the SMPL-X model.
# SMPL-X (~.npy) produced by ECON
betas : torch.Size([1, 200])
body_pose : torch.Size([1, 21, 3])
global_orient : torch.Size([1, 1, 3])
transl : torch.Size([1, 3])
expression : torch.Size([1, 50])
jaw_pose : torch.Size([1, 1, 3])
left_hand_pose : torch.Size([1, 15, 3])
right_hand_pose : torch.Size([1, 15, 3])
scale : torch.Size([1, 1])
# SMPL (~.npz) needed by SCANimate
transl (3,)
pose (72,)
v_cano (6890, 3)
v_posed (6890, 3)
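For reference, assuming v_cano / v_posed mean the SMPL body mesh in SCANimate's canonical pose and in the estimated pose (check what canonical pose SCANimate actually expects; it may be an A-pose rather than the zero pose), a rough sketch with the smplx Python package could look like the following. The model folder, the 72-dim pose, and the betas are placeholders, and ECON's 200 SMPL-X betas do not carry over to SMPL's shape space directly.
# a hedged sketch: produce v_cano / v_posed with the smplx package
import numpy as np
import smplx
import torch

smpl = smplx.create("models", model_type="smpl", gender="neutral")  # placeholder model folder
pose = torch.zeros(1, 72)     # replace with the converted 72-dim axis-angle pose
transl = torch.zeros(1, 3)    # replace with the estimated translation
betas = torch.zeros(1, 10)    # SMPL betas; not the 200-dim SMPL-X betas ECON saves

v_cano = smpl(betas=betas).vertices[0].detach().numpy()   # (6890, 3), canonical pose
v_posed = smpl(betas=betas, global_orient=pose[:, :3],
               body_pose=pose[:, 3:], transl=transl).vertices[0].detach().numpy()  # (6890, 3), posed

np.savez("smpl_scanimate.npz", transl=transl[0].numpy(), pose=pose[0].numpy(),
         v_cano=v_cano, v_posed=v_posed)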
In addition, I also have some trouble removing the invisible faces. I have already checked the query_color function and tried to find the invisible parts, but I could only remove the invisible vertices. What should I do next to further remove the invisible faces?
# I tried to remove the invisible vertices this way
vert = vert[visibility == 1.0]
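A possible next step (a sketch, assuming vert, faces, and visibility are NumPy arrays of shape (V, 3), (F, 3), and (V,); convert torch tensors with .cpu().numpy() first) is to keep only the faces whose three vertices are all visible and then remap their indices to the compacted vertex array:
import numpy as np

vis = visibility == 1.0                        # per-vertex boolean mask, shape (V,)
face_mask = vis[faces].all(axis=1)             # a face survives only if all 3 vertices are visible
faces_kept = faces[face_mask]

new_index = np.full(len(vert), -1, dtype=np.int64)   # old vertex index -> new vertex index
new_index[vis] = np.arange(vis.sum())
vert_kept = vert[vis]                                # same as vert[visibility == 1.0] above
faces_kept = new_index[faces_kept]                   # faces re-indexed into vert_kept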
Looking forward to any helpful suggestions 😄
For this question, I have found that the 23 SMPL body joints = the same 21 body_pose joints as in SMPL-X plus 2 for the hands. Is that correct? Can I just set both hand joints to (0, 0, 0), or how can I get the pose parameters for them?
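For reference, SMPL's 72-dim pose is 24 joints × 3: the global_orient (pelvis), the 21 body joints that SMPL-X's body_pose also covers, and SMPL's two hand joints (22/23), which SMPL-X replaces with per-finger articulation, so zeroing them is a common approximation. A minimal conversion sketch, assuming the ECON .npy holds a dict of tensors with the shapes listed above (the file name and the to_np helper are hypothetical):
import numpy as np
import torch

param = np.load("econ_smplx_param.npy", allow_pickle=True).item()   # hypothetical file name

def to_np(t):
    # the dict is assumed to hold torch tensors (see the shapes above)
    return t.detach().cpu().numpy() if torch.is_tensor(t) else np.asarray(t)

global_orient = to_np(param["global_orient"]).reshape(1, 3)   # SMPL joint 0 (pelvis)
body_pose     = to_np(param["body_pose"]).reshape(21, 3)      # SMPL joints 1..21
hand_joints   = np.zeros((2, 3))                              # SMPL joints 22/23, zeroed as a first try

pose   = np.concatenate([global_orient, body_pose, hand_joints], axis=0).reshape(72)
transl = to_np(param["transl"]).reshape(3)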
For the hand pose params, I set them by transferring the SMPL-X model to the SMPL model and computing their values from the generated .pkl.
Update (23/8/9): However, the result is very strange. There seem to be some problems with the params; I tried to normalize them, but they still work poorly...
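To see what the transfer actually produced for the hands (and whether the rest of the pose looks sane), one option is to inspect the generated .pkl directly; the file name and key names below are guesses and depend on the transfer script that was used:
import pickle
import numpy as np

with open("smpl_transfer_output.pkl", "rb") as f:    # hypothetical path
    data = pickle.load(f)
print({k: np.shape(v) for k, v in data.items()})     # see what the transfer script saved

# If a 72-dim axis-angle pose is present, SMPL's two hand joints are the last two
# entries: pose[66:69] (left hand) and pose[69:72] (right hand).
pose = np.asarray(data["pose"]).reshape(-1)          # "pose" is an assumed key name
print("left hand:", pose[66:69], "right hand:", pose[69:72])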