This package enables a UR5 arm with a Robotiq 140 gripper to be used with ros_control and MoveIt!. An OpenNI Kinect camera plugin is used to capture the scene in the Gazebo simulation.
pick-and-place-with-icl-ur5-robotiq-gripper
Prerequisites:
- Ubuntu 18.04 LTS
- ROS Melodic
- MoveIt!
- OpenNI Kinect Package (Camera)
Getting Started:
1. Create a new catkin workspace:
mkdir -p ur5_robotiq_ws/src
2. Clone this repository into ur5_robotiq_ws/src:
cd ur5_robotiq_ws/src
git clone https://github.com/khs-sm/pick-and-place-with-icl-ur5-robotiq-gripper.git
3. Make sure the packages are laid out in the directory structure shown below:
ur5_robotiq_ws
│
└───src
    │   find-object
    │   gazebo-pkgs
    │   general-message-pkgs
    │   ......
    │   README.md
4. Install the dependencies with rosdep and build the project with catkin_make:
cd ur5_robotiq_ws
(or cd .. if you are still inside ur5_robotiq_ws/src)
rosdep install --from-paths src --ignore-src -r -y
catkin_make
Source the setup file before launching:
source devel/setup.bash
Gazebo Simulation
1. Launch the UR5 Robotiq Gripper:
roslaunch icl_ur5_setup_gazebo icl_ur5_gripper.launch
2. Enable ROS control of the simulation from MoveIt!:
roslaunch icl_ur5_setup_moveit_config ur5_gripper_moveit_planning_execution.launch sim:=true
3. Launch Rviz with MoveIt:
roslaunch icl_ur5_setup_moveit_config moveit_rviz.launch config:=true
4. Test the Simulation:
rosrun scripts test_grasp.py
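
test_grasp.py runs a scripted pick motion on the simulated arm. If you want to adapt or debug it, the rough shape of such a script with moveit_commander is sketched below. This is an illustration rather than the repository's exact script: the group names "manipulator" and "gripper", the named gripper target "close", and the pose values are assumptions, so check the MoveIt! configuration in this repository for the real names.

#!/usr/bin/env python
# Minimal sketch of a scripted pick motion using moveit_commander.
# Group names, the "close" named target, and pose values are assumptions.
import sys
import rospy
import moveit_commander
from geometry_msgs.msg import Pose

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("simple_grasp_demo")

arm = moveit_commander.MoveGroupCommander("manipulator")   # assumed group name
gripper = moveit_commander.MoveGroupCommander("gripper")   # assumed group name

# Move the arm to a pose above the object (example values).
target = Pose()
target.position.x = 0.4
target.position.y = 0.0
target.position.z = 0.3
target.orientation.w = 1.0
arm.set_pose_target(target)
arm.go(wait=True)
arm.stop()
arm.clear_pose_targets()

# Close the gripper using a named target defined in the SRDF (assumed).
gripper.set_named_target("close")
gripper.go(wait=True)

moveit_commander.roscpp_shutdown()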
(OPTIONAL) Launch Object Detection with find_object_2d:
roslaunch find-object start_find_object_3d_session.launch
(OPTIONAL) Apply Object Detection to Motion Planning:
rosrun scripts vision_grasp.py
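
vision_grasp.py feeds the find_object_2d detection into motion planning. As an illustration of the idea (not the repository's exact script): find_object_2d publishes one TF frame per detected object, which can be looked up relative to the robot base and used as a pre-grasp target. The frame names "base_link" and "object_1" and the group name "manipulator" below are assumptions.

#!/usr/bin/env python
# Minimal sketch of using a find_object_2d detection as a MoveIt! target.
# Frame and group names are assumptions; adjust them to this setup.
import sys
import rospy
import tf
import moveit_commander

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("vision_grasp_demo")

arm = moveit_commander.MoveGroupCommander("manipulator")   # assumed group name
listener = tf.TransformListener()

# Wait for the detected object's TF frame and read its position.
listener.waitForTransform("base_link", "object_1", rospy.Time(0), rospy.Duration(10.0))
(trans, rot) = listener.lookupTransform("base_link", "object_1", rospy.Time(0))

# Plan to a pre-grasp pose a few centimetres above the detected object.
pre_grasp = arm.get_current_pose().pose
pre_grasp.position.x = trans[0]
pre_grasp.position.y = trans[1]
pre_grasp.position.z = trans[2] + 0.10
arm.set_pose_target(pre_grasp)
arm.go(wait=True)
arm.stop()
arm.clear_pose_targets()

moveit_commander.roscpp_shutdown()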
Debug
Check here.