
I am a student currently studying. If I don’t have enough money to build a physical entity, can I achieve all the functions through simulation?

Open Sunday377 opened this issue 2 months ago • 6 comments

Sunday377 avatar Sep 18 '25 04:09 Sunday377

Definitely. Right now the Maniskill simulation has all the teleop code and sensory feedback, and we are actively working on the Maniskill hab environment for RL mobile manipulation and VLA.

However, I do suggest you buy/build an SO101 arm (<$200) first to get hands-on experience with the hardware. The arm can also later be used for XLeRobot with the same LeRobot system.
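
For anyone who wants to poke at the simulation directly, here is a minimal, untested sketch of creating the XLeRobot ManiSkill environment with camera and proprioceptive feedback. The env and robot IDs are the ones used in the commands later in this thread; option names may differ between ManiSkill/XLeRobot versions, so treat this as an illustration rather than the canonical entry point.

```python
# Minimal sketch (untested): create the ManiSkill scene with the XLeRobot robot
# and sensory feedback. Env/robot IDs are taken from the commands later in this
# thread; option names may differ across ManiSkill/XLeRobot versions.
import gymnasium as gym
import mani_skill.envs  # noqa: F401  (registers the ManiSkill environments)

env = gym.make(
    "ReplicaCAD_SceneManipulation-v1",
    robot_uids="xlerobot",             # assumes the XLeRobot agent is registered
    obs_mode="rgbd",                   # camera images + proprioception
    control_mode="pd_joint_delta_pos",
    render_mode="human",
)
obs, _ = env.reset(seed=0)
print(env.action_space)                # joint-delta action space used for teleop
env.close()
```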

Vector-Wangel avatar Sep 18 '25 05:09 Vector-Wangel

I don't see any docs describing how to do dataset generation in sim. Is it possible with the current code? I'd like the following flow, which would be quite useful:

1) Generate datasets in sim.
2) Train on a GPU.
3) Run inference in sim with the model trained in step 2.
4) Deploy to a physical SO101 robot and hope the performance is similar to step 3.

Will the current code be able to do steps 1-3?
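
Concretely, I imagine steps 1-3 looking roughly like this on the command line (script names and flags are placeholders, not verified against the repos):

```bash
# Rough outline only -- script names and flags are placeholders; check the
# XLeRobot examples and the LeRobot docs for the real entry points.

# 1) teleoperate in the ManiSkill sim and record a LeRobot-format dataset
python simulation/Maniskill/examples/<record_script>.py

# 2) train a policy on a GPU from the recorded dataset (LeRobot training script;
#    flag syntax depends on the installed LeRobot version)
python -m lerobot.scripts.train --dataset.repo_id=<my_dataset> --policy.type=act

# 3) roll the trained checkpoint out in the same sim env before touching hardware
python -m lerobot.scripts.eval --policy.path=<path_to_checkpoint>
```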

Thanks,

jcl2023 avatar Sep 19 '25 01:09 jcl2023

Yes, I have just quickly written some code that records the simulation in the latest LeRobot dataset v3 format. You can also change the controller or modify the recording format yourself.
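
For reference, a stripped-down, untested sketch of what such a recording loop might look like. The LeRobot dataset API (import path and the `create`/`add_frame`/`save_episode` signatures) changes between versions, so treat these names as assumptions and follow the actual script in `simulation/Maniskill/examples`.

```python
# Untested sketch of recording sim steps into a LeRobot dataset. The LeRobot
# import path and method signatures vary between versions -- treat them as
# assumptions and refer to the actual recording script in the repo.
import gymnasium as gym
import mani_skill.envs  # noqa: F401  (registers the ManiSkill environments)
from lerobot.common.datasets.lerobot_dataset import LeRobotDataset  # path may differ

env = gym.make("ReplicaCAD_SceneManipulation-v1", robot_uids="xlerobot",
               obs_mode="rgbd", control_mode="pd_joint_delta_pos")
obs, _ = env.reset(seed=0)

# feature shapes are illustrative (single-arm setup: 9-dim qpos, 8-dim action)
dataset = LeRobotDataset.create(
    repo_id="local/maniskill_teleoperation",
    fps=30,
    features={
        "observation.state": {"dtype": "float32", "shape": (9,), "names": None},
        "action": {"dtype": "float32", "shape": (8,), "names": None},
    },
)

for _ in range(300):
    action = env.action_space.sample()  # stand-in for the keyboard teleop input
    next_obs, reward, terminated, truncated, info = env.step(action)
    dataset.add_frame(
        {"observation.state": obs["agent"]["qpos"].squeeze(0), "action": action},
        task="teleop demo",
    )
    obs = next_obs
    if terminated or truncated:
        break

dataset.save_episode()
```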

However, note that even today there is no practical open-source VLA model for mobile manipulation, so I also suggest starting with a single arm. Leisaac is also a good choice for training a VLA in Isaac Sim on a single arm.

Vector-Wangel avatar Sep 19 '25 02:09 Vector-Wangel

> so I also suggest starting with a single arm

A single arm should be OK. I will test your new sim sample code over the weekend. Do you have instructions on how to render a target and place it on the table during dataset collection?
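
In case it helps, in ManiSkill a target object is usually added when the scene is loaded; a rough, untested sketch of that pattern is below (names, sizes, and positions are illustrative, and builder API details may differ between ManiSkill/SAPIEN versions):

```python
# Rough, untested sketch of adding a cube "target" to a ManiSkill scene, e.g.
# inside a custom env's scene-loading code. Names and positions are illustrative;
# builder API details may differ between ManiSkill/SAPIEN versions.
import sapien

def add_target_cube(scene, half_size=0.02):
    builder = scene.create_actor_builder()
    builder.add_box_collision(half_size=[half_size] * 3)
    builder.add_box_visual(
        half_size=[half_size] * 3,
        material=sapien.render.RenderMaterial(base_color=[1.0, 0.1, 0.1, 1.0]),
    )
    # place it on the table surface; adjust x/y/z for your scene
    builder.initial_pose = sapien.Pose(p=[0.3, 0.0, half_size])
    return builder.build(name="target_cube")
```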

jcl2023 avatar Sep 19 '25 04:09 jcl2023

@Vector-Wangel

I ran the following command and got an error message:

```
python3 demo_ctrl_ee_keyboard_record_dataset.py
```

```
/home/fjh/miniconda3/envs/xlerobot/lib/python3.11/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you.
  import pynvml  # type: ignore[import]
Initialized LeRobot dataset recorder at: datasets/maniskill_teleoperation
Observation space Dict('agent': Dict('qpos': Box(-inf, inf, (1, 9), float32), 'qvel': Box(-inf, inf, (1, 9), float32)), 'extra': Dict('tcp_pose': Box(-inf, inf, (1, 7), float32)), 'sensor_param': Dict('base_camera': Dict('extrinsic_cv': Box(-inf, inf, (1, 3, 4), float32), 'cam2world_gl': Box(-inf, inf, (1, 4, 4), float32), 'intrinsic_cv': Box(-inf, inf, (1, 3, 3), float32))), 'sensor_data': Dict('base_camera': Dict('Segmentation': Box(-2147483648, 2147483647, (1, 128, 128, 4), int32), 'Position': Box(-inf, inf, (1, 128, 128, 4), float32), 'Color': Box(-inf, inf, (1, 128, 128, 4), float32))))
Action space Box(-1.0, 1.0, (8,), float32)
Control mode pd_joint_delta_pos
Reward mode normalized_dense
Traceback (most recent call last):
  File "/home/fjh/Downloads/XLeRobot/simulation/Maniskill/examples/demo_ctrl_ee_keyboard_record_dataset.py", line 834, in <module>
    main(parsed_args)
  File "/home/fjh/Downloads/XLeRobot/simulation/Maniskill/examples/demo_ctrl_ee_keyboard_record_dataset.py", line 460, in main
    target_joints[11] = 1.57
    ~~~~~~~~~~~~~^^^^
IndexError: index 11 is out of bounds for axis 0 with size 8
```

jcl2023 avatar Sep 19 '25 23:09 jcl2023

The dual-arm command below runs OK:

```
python -m mani_skill.examples.demo_ctrl_action_ee_keyboard -e "ReplicaCAD_SceneManipulation-v1" -r "xlerobot" --render-mode="human" --shader="rt-fast" -c "pd_joint_delta_pos_dual_arm"
```
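
That is consistent with the traceback above: the single-arm `pd_joint_delta_pos` controller only exposes 8 action dimensions, while the recording script writes to joint index 11, which presumably only exists in the dual-arm configuration. A hypothetical guard (not the repo's actual fix) would be something like:

```python
# Hypothetical guard, not the repo's actual fix: only touch joint index 11 when
# the controller actually exposes that many joints (e.g. the dual-arm mode).
# The index 11 and the value 1.57 come from the traceback above; the rest is assumed.
import numpy as np

target_joints = np.zeros(8, dtype=np.float32)  # single-arm controller: 8 dims

if target_joints.shape[0] > 11:
    target_joints[11] = 1.57   # dual-arm mode: pre-set the extra joint
# otherwise skip the assignment, since the single-arm controller has no index 11
```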

jcl2023 avatar Sep 20 '25 00:09 jcl2023