[Contribute] Web/App Remote Control
📋 Issue Description
Create a web/mobile app interface for remote robot control that automatically connects to the robot's onboard computer and enables control beyond the local network (internet-based remote operation).
🎯 What We Need
Core Features:
- Auto-discovery and connection to robot's onboard computer
- Remote control beyond the local network (internet access)
- Real-time video streaming from robot cameras
- Basic keyboard/button control interface
- Optional: Connection encryption for security
Control Interface:
- Dual-arm joint control
- Base movement (forward/backward/rotate)
- Gripper control
- Emergency stop
- Camera feed switching
🔧 Expected Approach
Option 1: Web-based Solution
- WebRTC for low-latency video streaming
- WebSocket for real-time control commands
- Progressive Web App (PWA) for mobile compatibility
- Tunneling service (ngrok, tailscale, or custom solution)
Option 2: Mobile App
- React Native/Flutter for cross-platform
- Custom networking protocol or REST API
- Built-in camera streaming
Connection Architecture
[Mobile/Web Client] <-> [Internet/VPN] <-> [Robot Onboard Computer] <-> [XLeRobot Hardware]
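For illustration, one possible shape for the control messages a client could send over the command channel is sketched below; the field names and value ranges are assumptions for this issue, not a fixed protocol.

```python
# Hypothetical command schema for the client -> robot control channel.
# Field names and ranges are illustrative assumptions, not a fixed protocol.
from typing import List, Literal

from pydantic import BaseModel, Field


class BaseMove(BaseModel):
    type: Literal["base_move"] = "base_move"
    linear: float = Field(0.0, ge=-1.0, le=1.0)   # forward/backward, normalized
    angular: float = Field(0.0, ge=-1.0, le=1.0)  # rotation, normalized


class JointCommand(BaseModel):
    type: Literal["joint"] = "joint"
    arm: Literal["left", "right"]
    positions: List[float]                        # target joint angles in radians


class GripperCommand(BaseModel):
    type: Literal["gripper"] = "gripper"
    arm: Literal["left", "right"]
    open_fraction: float = Field(..., ge=0.0, le=1.0)


class EmergencyStop(BaseModel):
    type: Literal["estop"] = "estop"
```

Validating incoming commands against a schema like this on the robot side keeps malformed or out-of-range inputs away from the hardware, and an `estop` message can be handled before anything else in the queue.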
📝 Acceptance Criteria
Minimum Requirements:
- [ ] Web/app interface with basic robot controls
- [ ] Real-time camera feed (acceptable latency <500ms)
- [ ] Works over internet (not just local network)
- [ ] Auto-connect to robot when available
- [ ] Emergency stop functionality
- [ ] Basic setup documentation
Bonus Points:
- [ ] Encrypted connection
- [ ] Multiple camera switching
- [ ] Connection status indicators
- [ ] Mobile app version
- [ ] User authentication
💡 Technical Considerations
Networking Options:
- WebRTC for peer-to-peer connection
- VPN tunnel (WireGuard/OpenVPN)
- Reverse proxy with tunneling (ngrok, cloudflare tunnel)
- Custom signaling server
Security Options:
- TLS/SSL encryption
- Token-based authentication
- Rate limiting for safety
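As a rough sketch of the last two points, a FastAPI WebSocket endpoint could check a shared token on connect and drop commands that exceed a safety rate. `ROBOT_TOKEN`, the `/control` path, the limit, and `handle_command` are all placeholders; TLS would be terminated by the tunnel or a reverse proxy in front of this.

```python
# Sketch: token check plus a crude per-second rate limit on the control channel.
# All names and limits here are placeholders, not part of any existing API.
import time

from fastapi import FastAPI, WebSocket, status

app = FastAPI()
ROBOT_TOKEN = "change-me"      # placeholder shared secret
MAX_CMDS_PER_SEC = 20          # illustrative safety limit


def handle_command(cmd: dict) -> None:
    """Placeholder: forward a validated command to the robot control layer."""
    print("would execute:", cmd)


@app.websocket("/control")
async def control(ws: WebSocket):
    # Reject clients that do not present the expected token.
    if ws.query_params.get("token") != ROBOT_TOKEN:
        await ws.close(code=status.WS_1008_POLICY_VIOLATION)
        return
    await ws.accept()
    window_start, count = time.monotonic(), 0
    while True:
        cmd = await ws.receive_json()
        now = time.monotonic()
        if now - window_start >= 1.0:          # reset the one-second window
            window_start, count = now, 0
        count += 1
        if count > MAX_CMDS_PER_SEC:
            continue                           # drop excess commands instead of acting on them
        handle_command(cmd)
```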
🤝 Propose Your Approach
Comment with your implementation plan! Include:
- Web app vs mobile app preference
- Networking/tunneling solution choice
- Video streaming approach
- Security considerations
- Timeline estimate
Example:
I'd like to work on this! My approach:
- Web-based solution using React + WebRTC
- Use tailscale for secure tunneling
- WebSocket for control commands
- Start with single camera feed, expand later
- Timeline: ~4 weeks
This will unlock remote operation capabilities, making XLeRobot accessible from anywhere in the world and preparing the foundation for future LLM/VLM integration!
Plan:
Robot Side:
- FastAPI: server on the robot
- python-socketio: receive control commands and broadcast robot status via WebSockets
- aiortc: low-latency video from the camera using WebRTC
- Tailscale: tunneling
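A minimal sketch of how this robot-side stack could be wired together; the `cmd`/`status` event names and the `RobotStub` driver are assumptions standing in for the real XLeRobot control code.

```python
# Sketch of the robot-side server: FastAPI mounted next to a python-socketio
# AsyncServer. Event names and the hardware driver are assumptions.
import socketio
from fastapi import FastAPI


class RobotStub:
    """Placeholder for the real XLeRobot hardware/sim driver."""

    def stop_all(self):
        print("EMERGENCY STOP")

    def set_base_velocity(self, linear, angular):
        print("base velocity:", linear, angular)

    def state(self):
        return {"ok": True}


robot = RobotStub()
sio = socketio.AsyncServer(async_mode="asgi", cors_allowed_origins="*")
fastapi_app = FastAPI()
app = socketio.ASGIApp(sio, other_asgi_app=fastapi_app)  # run with: uvicorn server:app


@sio.on("cmd")
async def on_cmd(sid, data):
    # `data` is expected to follow the command schema sketched in the issue body.
    if data.get("type") == "estop":
        robot.stop_all()
    elif data.get("type") == "base_move":
        robot.set_base_velocity(data["linear"], data["angular"])
    await sio.emit("status", robot.state(), to=sid)  # send robot status back
```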
User Side (Web-based):
- React: UI
- socket.io-client: send user commands
- Browser WebRTC API: receive and display live video
- nipplejs: joystick control
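Until the React UI exists, a small python-socketio client can stand in for it and exercise the server end to end; the address and event names below just mirror the assumptions in the server sketch above.

```python
# Quick stand-in for the React client: send one base-move command and print
# any status updates. The host address is a placeholder (e.g. a Tailscale IP).
import asyncio

import socketio

HOST = "http://100.x.y.z:8000"   # placeholder robot address on the tailnet
sio = socketio.AsyncClient()


@sio.on("status")
async def on_status(data):
    print("robot status:", data)


async def main():
    await sio.connect(HOST)
    await sio.emit("cmd", {"type": "base_move", "linear": 0.3, "angular": 0.0})
    await asyncio.sleep(1.0)     # give the server a moment to answer
    await sio.disconnect()


asyncio.run(main())
```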
Estimated Timeline:
- 9.8: Start implementation.
- 9.15: Deliver a minimum viable demo, including the web application and robot-side code, running in MuJoCo.
- 9.16: Buy a robot (hopefully the official robot is released before 9.16 🤞).
- 9.20 (not sure how long delivery will take): Test the program on the real robot.
- Later: If all stages are successful, migrate the application to mobile with React Native.
I have partly finished the live-streaming framework.
The connection between the robot-side server and the web application has been worked out.
I will start designing the UI for the application later today.
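For reference, the signaling half of that live-streaming setup could look roughly like this on the robot side: the browser sends an SDP offer over Socket.IO and aiortc answers with the camera track. The `webrtc_offer` event name and the `/dev/video0` device are assumptions, and `sio` is the AsyncServer from the plan above.

```python
# Sketch: WebRTC offer/answer over Socket.IO, answered by aiortc with a camera
# track. Event name and camera device are assumptions; `sio` is the server above.
from aiortc import RTCPeerConnection, RTCSessionDescription
from aiortc.contrib.media import MediaPlayer

pcs = set()  # keep peer connections referenced so they are not garbage-collected


@sio.on("webrtc_offer")
async def on_webrtc_offer(sid, data):
    pc = RTCPeerConnection()
    pcs.add(pc)

    # Stream the robot camera; /dev/video0 is a placeholder device path.
    player = MediaPlayer("/dev/video0", format="v4l2",
                         options={"video_size": "640x480", "framerate": "30"})
    pc.addTrack(player.video)

    await pc.setRemoteDescription(
        RTCSessionDescription(sdp=data["sdp"], type=data["type"])
    )
    answer = await pc.createAnswer()
    await pc.setLocalDescription(answer)

    # The returned dict is delivered to the browser as the Socket.IO ack,
    # where it is passed to RTCPeerConnection.setRemoteDescription().
    return {"sdp": pc.localDescription.sdp, "type": pc.localDescription.type}
```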
Thanks so much for taking on this task! We should be releasing the worldwide purchase link in 2 days. Which region are you located in? I am in the US. I purchased the motors from the manufacturer wowrobo before and they were delivered within 5 days.
Thanks for replying.
I am currently in China, so it might be a little bit slow if you ship the robot from the US.
Luckily, I have a Linux device so I can run ManiSkill on it and test the remote control from another computer.
Designed this simple UI with joystick-like control.
The web application can now control the robot in the MuJoCo simulator with simple movement commands.
Later I will add joint-control support and test the tunneling on different devices.
I am happy to take any suggestions you have @Vector-Wangel :)
Actually we released the assembly kit for China first lol. https://e.tb.cn/h.SZFbBgZABZ8zRPe?tk=ba514rTBRjQ You can check it out here. It should arrive within one week. And I look forward to seeing this plugged into ManiSkill!
This is fantastic! The UI looks really nice, with all the functions needed. I think we can first test it out on ManiSkill and then on the real robot. After that I can directly merge your pull request and make it the official web UI for XLeRobot.
Btw you can also refer to the remote control code to see how we can do whole-body control on this platform.
Oh, I saw the purchase link. I will place the order after I complete the ManiSkill web control.
(Attached GIF: the web UI driving the simulated robot with the joystick controls.)
Yes, I will post the code once I have verified it with Tailscale and ManiSkill.
Have you considered using Phosphobot for this? It seems to have a number of relevant features.