Inverse dynamics from body MoCap data
Hi,
I'm trying to obtain the forces (both external contact forces and motor torques) from human body MoCap data (e.g. Human3.6M). I tried the "deepmimic" and "humanoidMotionCapture.py" examples. "deepmimic" gives me a realistic animation from the Human3.6M data, while "humanoidMotionCapture.py" has a humanoid whose root position is fixed and which never contacts the ground. But I'm struggling to modify either of them to solve my task.
I am pretty new to Bullet and only have a basic understanding of the inverse dynamics process. So I wonder: (1) can we really obtain a reasonable force estimate from MoCap data? (2) Is there any existing work that has already done this? Would you mind pointing out some references?
Thanks, Yufei
@zhangy76, the mocap file contains only the time stamp, the root link pos/orn, and the joint angles. It is like a video playback.
DeepMimic uses the joint angles at a specific time stamp ("in phase", according to the paper) as the reference that the simulation learns to match.
humanoid_stable_pd.py
and humanoid_pose_interpolator.py
use this playback and compute the difference between the simulated and mocap joint angles to compute a reward.
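The idea behind that reward can be sketched as follows. This is a simplified, hypothetical version of the pose-matching term (the actual files use quaternion differences and per-joint weights; the scale factor here loosely follows the pose-reward term in the DeepMimic paper):

```python
import math

def pose_reward(sim_angles, mocap_angles, scale=2.0):
    """exp(-scale * sum of squared joint-angle differences).

    Hypothetical simplification: real humanoid_stable_pd.py compares
    quaternions per joint, not scalar angles.
    """
    err = sum((s - m) ** 2 for s, m in zip(sim_angles, mocap_angles))
    return math.exp(-scale * err)

# Identical poses give the maximum reward of 1.0; any deviation decays it.
print(pose_reward([0.1, -0.3], [0.1, -0.3]))  # -> 1.0
print(pose_reward([0.1, -0.3], [0.5, -0.3]) < 1.0)  # -> True
```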
(1) In order to get the joint torques, both applied and experienced, you need to query them during the run-time of the policy. Contact forces can only be computed when there is actual contact; the mocap data itself contains no contact information, and even if you force some contact between the ground and the humanoid during playback, the resulting forces might not be reliable.