
wrong rotation of the predicted actions

Open yl-wang996 opened this issue 10 months ago • 9 comments

As shown in the following figure, I tried to visualize the predicted actions in rviz. In this case, each prediction generates 16 waypoints and the first 8 are used. But I found that the rotation part of the action is incorrect; there seems to be a fixed offset in the rotation. The action is taken directly from here.

So my question is: is this rotation correct as shown in the image below, and if not, what could be the problem?

image

yl-wang996 avatar Apr 25 '24 14:04 yl-wang996
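For reference, a minimal sketch of such a visualization with rospy, assuming each predicted action is laid out as [x, y, z, ax, ay, az, gripper] with the rotation stored as an axis-angle vector (the topic name, frame id, and action layout are assumptions, not taken from the repo):

```python
import numpy as np
import rospy
from geometry_msgs.msg import Pose, PoseArray
from scipy.spatial.transform import Rotation

def actions_to_pose_array(actions, frame_id="base_link"):
    """Convert an (N, 7) array of [x, y, z, ax, ay, az, gripper] actions
    into a PoseArray for rviz. The rotation part is treated as an
    axis-angle vector (rotation vector), not as Euler angles."""
    msg = PoseArray()
    msg.header.frame_id = frame_id
    msg.header.stamp = rospy.Time.now()
    for a in actions:
        pose = Pose()
        pose.position.x, pose.position.y, pose.position.z = a[:3]
        # scipy expects a rotation vector (axis * angle) here
        qx, qy, qz, qw = Rotation.from_rotvec(a[3:6]).as_quat()
        pose.orientation.x, pose.orientation.y = qx, qy
        pose.orientation.z, pose.orientation.w = qz, qw
        msg.poses.append(pose)
    return msg

if __name__ == "__main__":
    rospy.init_node("action_viz")
    pub = rospy.Publisher("/predicted_actions", PoseArray, queue_size=1)
    rate = rospy.Rate(10)
    while not rospy.is_shutdown():
        actions = np.zeros((16, 7))  # placeholder; use the policy's 16 waypoints
        pub.publish(actions_to_pose_array(actions))
        rate.sleep()
```

Adding a "PoseArray" display in rviz with the matching fixed frame then shows each waypoint as an arrow.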

When I visualize the trajectories of the TCP, the rotation seems correct. image

yl-wang996 avatar Apr 25 '24 14:04 yl-wang996

And then I found that the actions in the demonstration dataset are also not vertical. Is that correct? image

yl-wang996 avatar Apr 25 '24 15:04 yl-wang996
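One way to sanity-check the demonstrations directly, as a hedged sketch: it assumes a robomimic-style hdf5 where `data/demo_0/actions` holds absolute pose actions with an axis-angle orientation in columns 3:6 (the file path and layout are placeholders):

```python
import h5py
import numpy as np
from scipy.spatial.transform import Rotation

with h5py.File("path/to/dataset.hdf5", "r") as f:
    actions = f["data/demo_0/actions"][:]          # shape (T, action_dim)

rotvecs = actions[:, 3:6]                          # assumed axis-angle columns
z_axes = Rotation.from_rotvec(rotvecs).apply([0.0, 0.0, 1.0])

# Angle between the tool z-axis and the vertical line (up or down), in degrees
tilt = np.degrees(np.arccos(np.clip(np.abs(z_axes[:, 2]), 0.0, 1.0)))
print("mean tilt from vertical: %.1f deg" % tilt.mean())
```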

Hello, I am a beginner. How do I use rviz for visualization?

2292685327 avatar May 07 '24 03:05 2292685327

> Hello, I am a beginner. How do I use rviz for visualization?

@2292685327 Basically, you just need to know how to use ROS, and for rviz you can refer to here

yl-wang996 avatar May 07 '24 09:05 yl-wang996
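For a first rviz test, one simple option is to publish the TCP trajectory as a nav_msgs/Path and add a "Path" display in rviz; a small sketch (topic name, frame id, and the points are placeholders):

```python
import rospy
from nav_msgs.msg import Path
from geometry_msgs.msg import PoseStamped

rospy.init_node("tcp_path_viz")
pub = rospy.Publisher("/tcp_path", Path, queue_size=1, latch=True)

path = Path()
path.header.frame_id = "base_link"
for x, y, z in [(0.4, 0.0, 0.3), (0.4, 0.05, 0.3), (0.4, 0.1, 0.3)]:  # dummy points
    ps = PoseStamped()
    ps.header.frame_id = path.header.frame_id
    ps.pose.position.x, ps.pose.position.y, ps.pose.position.z = x, y, z
    ps.pose.orientation.w = 1.0  # identity orientation
    path.poses.append(ps)

pub.publish(path)  # latched, so rviz can pick it up after subscribing
rospy.spin()
```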

> Hello, I am a beginner. How do I use rviz for visualization?
>
> @2292685327 Basically, you just need to know how to use ROS, and for rviz you can refer to here

Thank you for your prompt reply. I have another question. Are the robot files and required target files for the simulation environment already provided in the source code, or do I need to create them myself?

2292685327 avatar May 08 '24 10:05 2292685327

@2292685327 Sorry for the late reply. robomimic relies on robosuite, so the environment is built by robosuite. Alternatively, you can find the MuJoCo XML file in the attributes of the dataset file (***.h5py), but the paths inside this XML file are wrong.

yl-wang996 avatar May 14 '24 18:05 yl-wang996
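If I remember the robomimic dataset layout correctly, the environment metadata sits in the `env_args` attribute of the `data` group and the per-demo MJCF XML in the demo group's `model_file` attribute; a hedged sketch for extracting it (check the attribute names on your own file):

```python
import json
import h5py

with h5py.File("path/to/dataset.hdf5", "r") as f:
    # Environment metadata (env name, robosuite kwargs) stored as a JSON string
    env_args = json.loads(f["data"].attrs["env_args"])
    print(env_args["env_name"])

    # Per-demo MuJoCo model; asset paths inside it may need to be fixed
    # to point at your local robosuite installation.
    xml_string = f["data/demo_0"].attrs["model_file"]
    with open("demo_0_model.xml", "w") as out:
        out.write(xml_string)
```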

> As shown in the following figure, I tried to visualize the predicted actions in rviz. In this case, each prediction generates 16 waypoints and the first 8 are used. But I found that the rotation part of the action is incorrect; there seems to be a fixed offset in the rotation. The action is taken directly from here.
>
> So my question is: is this rotation correct as shown in the image below, and if not, what could be the problem?
>
> image

Hi, did you solve this problem? For my dataset I get similar results: small errors in position, but large errors in rotation. I found that they use rotation_6d in the model, not a 3x3 rotation matrix. I have tried 'abs'/'rel'/'relative' as the action representation, but all of them give the wrong rotation.

cynthia-you avatar Aug 26 '24 02:08 cynthia-you
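For context, the 6D rotation representation (Zhou et al.) keeps two 3-vectors of the rotation matrix and recovers the rest by Gram-Schmidt. A small numpy sketch of the conversion, independent of the repo's own code (the row/column convention varies between implementations; this uses rows, as in pytorch3d-style code):

```python
import numpy as np

def rotation_6d_to_matrix(d6):
    """Recover a 3x3 rotation matrix from a 6D rotation vector
    (two 3-vectors) via Gram-Schmidt orthonormalization."""
    a1, a2 = d6[:3], d6[3:]
    b1 = a1 / np.linalg.norm(a1)
    b2 = a2 - np.dot(b1, a2) * b1
    b2 = b2 / np.linalg.norm(b2)
    b3 = np.cross(b1, b2)
    return np.stack([b1, b2, b3], axis=0)

# Identity rotation encoded as its first two rows
print(rotation_6d_to_matrix(np.array([1., 0., 0., 0., 1., 0.])))
```

Whichever representation the model trains with, the predicted rotations still have to be decoded back to the dataset's original format before being drawn, which is consistent with the axis-angle fix mentioned in the next comment.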

@cynthia-you I haven't tried it, but I guess using the rotation axis scaled by the angle magnitude, i.e. axis-angle, could resolve this problem. I accidentally interpreted the values as Euler angles when drawing in rviz, which caused my mistake.

yl-wang996 avatar Aug 27 '24 19:08 yl-wang996
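To make that concrete, a quick scipy check: interpreting the same rotation triple as Euler angles instead of a rotation vector gives a noticeably different orientation, which would look like a roughly constant offset in rviz (the numbers are just an example):

```python
import numpy as np
from scipy.spatial.transform import Rotation

rot_part = np.array([2.2, -1.1, 0.4])  # example rotation triple from an action

r_axis_angle = Rotation.from_rotvec(rot_part)   # correct: axis * angle
r_euler = Rotation.from_euler("xyz", rot_part)  # wrong: same numbers as Euler angles

# Angular difference between the two interpretations
offset = (r_axis_angle.inv() * r_euler).magnitude()
print("offset: %.1f deg" % np.degrees(offset))
```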

> @2292685327 Sorry for the late reply. robomimic relies on robosuite, so the environment is built by robosuite. Alternatively, you can find the MuJoCo XML file in the attributes of the dataset file (***.h5py), but the paths inside this XML file are wrong.

@yl-wang996 Thank you for your response. Is your visualization result based on the depth information from a global camera? Also, when visualizing through RViz, what specific information from the source code is needed for the camera and trajectory data, respectively? I look forward to your reply.

wudier483 avatar Sep 19 '24 07:09 wudier483