Grasp Pose Detection (GPD)

1) Overview

This package detects 6-DOF grasp poses for a 2-finger grasp (e.g. a parallel jaw gripper) in 3D point clouds.

Grasp pose detection consists of three steps: sampling a large number of grasp candidates, classifying each candidate as a viable grasp or not, and clustering the viable grasps that are geometrically similar.

The reference for this package is: High precision grasp pose detection in dense clutter.

UR5 demo

2) Requirements

  1. PCL 1.7 or later
  2. Eigen 3.0 or later
  3. ROS Indigo
  4. Caffe

3) Prerequisites

The following instructions have been tested on Ubuntu 14.04. Similar instructions should work for other Linux distributions that support ROS.

  1. Install Caffe (Instructions). Follow the CMake Build instructions. Notice: the Boost version required by Caffe (1.55) conflicts with the one installed as a dependency of the ROS Indigo Debian packages (1.54), so you need to check out an older version of Caffe that still worked with Boost 1.54. When you clone Caffe, use the command below instead.

    git clone https://github.com/BVLC/caffe.git && cd caffe && git checkout 923e7e8b6337f610115ae28859408bc392d13136
    
  2. Install ROS Indigo (Instructions).

  3. Clone the grasp_pose_generator repository into some folder:

    $ cd <location_of_your_workspace>
    $ git clone https://github.com/atenpas/gpg.git
    
  4. Build and install the grasp_pose_generator:

    $ cd gpg
    $ mkdir build && cd build
    $ cmake ..
    $ make
    $ sudo make install
    

4) Compilation

  1. Clone this repository.

    $ cd <location_of_your_workspace/src>
    $ git clone https://github.com/atenpas/gpd.git
    
  2. Build your catkin workspace.

    $ cd <location_of_your_workspace>
    $ catkin_make
    

5) Generate Grasps for a Point Cloud File

Launch the grasp pose detection on an example point cloud:

    roslaunch gpd tutorial0.launch

Within the GUI that appears, press r to center the view, and q to quit the GUI and load the next visualization. The output should look similar to the screenshot shown below.

rviz screenshot

6) Tutorials

  1. Detect Grasps With an RGBD Camera
  2. Detect Grasps on a Specific Object

7) Parameters

Brief explanations of parameters are given in launch/classify_candidates_file_15_channels.launch for using PCD files. For use on a robot, see launch/ur5_15_channels.launch.

8) Views

rviz screenshot

You can use this package with a single depth sensor or with two depth sensors. The package comes with Caffe weight files for both options, located in gpd/caffe/15channels. For a single sensor, use single_view_15_channels.caffemodel; for two depth sensors, use two_views_15_channels_[angle], where [angle] is the angle between the two sensor views, as illustrated in the picture below. In the two-views setting, you need to register the two point clouds together before sending them to GPD.

rviz screenshot

To switch between one and two sensor views, change the parameter trained_file in the launch file launch/caffe/ur5_15channels.launch.
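For example, selecting the single-view weights might look like the following inside the launch file. This is a sketch: the node name and surrounding layout are assumptions, and the exact structure of the shipped launch file may differ; only the trained_file parameter and the weight-file path come from this package.

```xml
<launch>
  <node name="detect_grasps" pkg="gpd" type="detect_grasps" output="screen">
    <!-- Path to the Caffe weight file; swap in two_views_15_channels_[angle]
         for the two-sensor setup. -->
    <param name="trained_file"
           value="$(find gpd)/caffe/15channels/single_view_15_channels.caffemodel" />
  </node>
</launch>
```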

9) Input Channels for Neural Network

The package comes with weight files for two different input representations for the neural network that decides whether a grasp is viable: 3 channels or 15 channels. The default is 15 channels. The 3-channel representation runs faster at the cost of some grasp quality. For more details, please see the reference below.

10) Citation

If you like this package and use it in your own work, please cite our paper:

[1] Marcus Gualtieri, Andreas ten Pas, Kate Saenko, Robert Platt. High Precision Grasp Pose Detection in Dense Clutter. IROS 2016, pp. 598-605.
