This repository contains the code for the Vision-Based Manipulation project, developed as part of WPI's Vision-Based Manipulation course. The project uses top-surface information from an RGBD camera to find grasping points.
This project is developed and tested with ROS2 Humble and Gazebo version 11.10.2.
The central focus of the repository is to find stable grasping points for a parallel-jaw gripper, given the point cloud data of the top surface of the object.
For ROS2 Humble installation on Ubuntu 22.04, refer to the following link: Install ROS2 Humble
- Install Gazebo and its supplementary files:
sudo sh -c 'echo "deb http://packages.osrfoundation.org/gazebo/ubuntu-stable `lsb_release -cs` main" > /etc/apt/sources.list.d/gazebo-stable.list'
wget https://packages.osrfoundation.org/gazebo.key -O - | sudo apt-key add -
- Update the apt repository:
sudo apt-get update
- Install Gazebo:
sudo apt-get install gazebo libgazebo-dev
- Install Gazebo ROS2 packages:
sudo apt install ros-humble-gazebo-ros-pkgs
- Install PCL and its ROS2 packages:
sudo apt-get install ros-humble-pcl-*
sudo apt install libpcl-dev
# Clone the repository into the src folder
git clone https://github.com/hrishikesh-st/vbm_project.git
# Build the packages
cd vbm_project
colcon build
# Source the build files
source install/setup.bash
NOTE: You will need to run this command in every new terminal you open, or you can add it to your .bashrc file.
IMPORTANT: You need to open four additional terminal sessions to run the entire project end-to-end.
- Using the same terminal session in which you sourced the build files, launch the simulation environment:
ros2 launch vbm_project_env simulation.launch.py
NOTE: This will spawn a camera, a table, and the desired object on the table in the Gazebo environment.
- Spawn a second terminal session.
- Spawn the transform_point_cloud.py node from the vbm_project_grasping package.
- The transform_point_cloud node publishes the transformed point cloud to the /transformed_pointcloud_data topic (sketched below).
ros2 run vbm_project_grasping transform_point_cloud.py
NOTE: In the current version, the transformation is hardcoded as a transformation matrix; the future scope is to utilise the tf2 library.
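The following is only a rough illustration of this step, not the exact contents of transform_point_cloud.py: it subscribes to a raw camera cloud topic, applies a fixed 4x4 homogeneous transform with NumPy, and republishes on /transformed_pointcloud_data. The input topic name /camera/points and the matrix values are placeholder assumptions.

```python
# Illustrative sketch only (not the repository's exact node).
import numpy as np
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import PointCloud2
from sensor_msgs_py import point_cloud2


class TransformPointCloudSketch(Node):
    def __init__(self):
        super().__init__('transform_point_cloud_sketch')
        # Hypothetical hardcoded camera-to-world transform (placeholder values).
        self.T = np.array([[1.0,  0.0,  0.0, 0.0],
                           [0.0, -1.0,  0.0, 0.0],
                           [0.0,  0.0, -1.0, 1.0],
                           [0.0,  0.0,  0.0, 1.0]])
        # '/camera/points' is an assumed input topic name.
        self.sub = self.create_subscription(PointCloud2, '/camera/points', self.callback, 10)
        self.pub = self.create_publisher(PointCloud2, '/transformed_pointcloud_data', 10)

    def callback(self, msg):
        # Read xyz points, append a homogeneous 1, and apply the transform.
        pts = np.array([[p[0], p[1], p[2]] for p in
                        point_cloud2.read_points(msg, field_names=('x', 'y', 'z'), skip_nans=True)])
        if pts.size == 0:
            return
        homog = np.hstack([pts, np.ones((pts.shape[0], 1))])
        transformed = (self.T @ homog.T).T[:, :3]
        # A real node would also update header.frame_id to the target frame.
        self.pub.publish(point_cloud2.create_cloud_xyz32(msg.header, transformed.tolist()))


def main():
    rclpy.init()
    rclpy.spin(TransformPointCloudSketch())


if __name__ == '__main__':
    main()
```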
- Spawn a third terminal session.
- Spawn the processPointCloud C++ node from the vbm_project_grasping package.
- The processPointCloud node downsamples the data, removes the major plane, and extracts the point cloud of the object of interest (see the sketch below).
- This node subscribes to the /transformed_pointcloud_data topic and publishes the processed point cloud to the /processed_pointcloud_data topic.
ros2 run vbm_project_grasping processPointCloud
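The processing node itself is written in C++ with PCL. Purely to illustrate the downsample / plane-removal / extraction sequence in Python, the sketch below uses Open3D, which is not a dependency of this repository; the voxel size and RANSAC thresholds are placeholder values.

```python
# Illustrative only: the repository's processPointCloud node is C++/PCL.
import numpy as np
import open3d as o3d


def process_cloud(points_xyz: np.ndarray) -> np.ndarray:
    cloud = o3d.geometry.PointCloud()
    cloud.points = o3d.utility.Vector3dVector(points_xyz)

    # 1. Downsample with a voxel grid (voxel size is a placeholder).
    cloud = cloud.voxel_down_sample(voxel_size=0.005)

    # 2. Remove the major plane (the table top) via RANSAC plane segmentation.
    _, inliers = cloud.segment_plane(distance_threshold=0.005,
                                     ransac_n=3,
                                     num_iterations=1000)
    remaining = cloud.select_by_index(inliers, invert=True)

    # 3. The points left after removing the plane are treated as the object of interest.
    return np.asarray(remaining.points)
```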
- Spawn a fourth terminal session.
- Spawn the grasp_synthesis_node.py node from the vbm_project_grasping package:
ros2 run vbm_project_grasping grasp_synthesis_node.py
- The following operations are performed on the processed point cloud data (illustrated in the sketch after this list):
- Distance-thresholds the processed point cloud from the topmost point in the cloud to extract the top surface of the object.
- Flattens this top surface into a 2D point cloud and runs the heuristic grasp point detection algorithm.
- Maps the detected 2D grasp points back to the 3D point cloud.
- This node subscribes to the /processed_pointcloud_data topic and publishes to two topics:
- /thresholded_pointcloud: to visualize the top surface of the object.
- /grasping_points: to visualize the detected top-surface grasping points of the object.
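The exact heuristic is implemented in grasp_synthesis_node.py. The sketch below only illustrates the threshold, flatten, detect, and map-back flow described above; the PCA-based antipodal pair selection used here is a stand-in for the repository's algorithm, and the distance threshold is a placeholder.

```python
# Illustrative sketch of the grasp-synthesis flow (not the repository's heuristic).
import numpy as np


def synthesize_grasp(points_xyz: np.ndarray, top_band: float = 0.01):
    # 1. Keep only points within `top_band` metres of the topmost point (the top surface).
    z_max = points_xyz[:, 2].max()
    top = points_xyz[points_xyz[:, 2] > z_max - top_band]

    # 2. Flatten the top surface to 2D by dropping the z coordinate.
    flat = top[:, :2]

    # 3. Placeholder heuristic: project onto the minor principal axis and take the
    #    two extreme points, i.e. a narrow antipodal pair for a parallel-jaw gripper.
    centred = flat - flat.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    minor_axis = vt[-1]                      # direction of least spread
    proj = centred @ minor_axis
    i, j = int(np.argmin(proj)), int(np.argmax(proj))

    # 4. Query the selected indices back into the 3D top-surface cloud.
    return top[i], top[j]
```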
- Spawn a fifth terminal session.
- Spawn RViz:
ros2 run rviz2 rviz2
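- In RViz, the pipeline output can be inspected by adding displays for the topics listed above, for example /processed_pointcloud_data, /thresholded_pointcloud and /grasping_points.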