
# Grasping Benchmarks

VGN Benchmark environment for grasping in cluttered environments.

## About The Project


This repo contains an easy-to-use, modular implementation of the benchmark environment for grasping in cluttered environments introduced by Breyer et al. [1]. Note that we incorporate the slight modifications made by Huang et al. [2], which allow the network to skip a viewpoint and acquire a new one.
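The viewpoint-skip mechanism can be sketched as a simple loop: the network either commits to a grasp from the current observation or requests another viewpoint. The sketch below is a minimal illustration only; all names (`predict_grasp`, `evaluate_scene`, the dict-based observations, the confidence threshold) are hypothetical and do not reflect the actual benchmark API.

```python
# Minimal sketch of a multi-view evaluation loop with a "skip" action,
# in the spirit of Huang et al. [2]. All names here are hypothetical.

SKIP = None  # sentinel: the network asks for another viewpoint


def predict_grasp(observation, confidence_threshold=0.8):
    """Hypothetical network call: return a grasp, or SKIP if unsure."""
    grasp = observation["best_grasp"]
    confidence = observation["confidence"]
    return grasp if confidence >= confidence_threshold else SKIP


def evaluate_scene(views, max_views=4):
    """Feed in viewpoints one by one until the network commits to a grasp."""
    for n, view in enumerate(views, start=1):
        grasp = predict_grasp(view)
        if grasp is not SKIP or n == max_views:
            return grasp, n  # grasp chosen (or forced on the last view)
    return SKIP, len(views)


# Example: the network skips two low-confidence views, commits on the third.
views = [
    {"best_grasp": "g1", "confidence": 0.4},
    {"best_grasp": "g2", "confidence": 0.6},
    {"best_grasp": "g3", "confidence": 0.9},
]
grasp, n_views = evaluate_scene(views)
print(grasp, n_views)  # → g3 3
```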

## Getting Started

To get a local copy up and running, follow the steps below.

### Prerequisites

### Installation

#### Installing the benchmark environment

```sh
# Clone the repo

# Install the requirements
pip install -r requirements.txt

# Install the package
pip install -e .

# Download the datasets and checkpoints
python scripts/download_data.py
```

(back to top)

## Usage

### Benchmark different networks

#### ICG-Net

```sh
# Evaluate in the packed scene
python scripts/test_icgnet.py --scene packed --object-set packed/test

# Evaluate in the pile scene
python scripts/test_icgnet.py --scene pile --object-set pile/test
```

#### Edge-Grasp Network

```sh
# Evaluate in the packed scene
python scripts/test_edge.py --method edge-vn --scene packed --object-set packed/test

# Evaluate in the pile scene
python scripts/test_edge.py --method edge-vn --scene pile --object-set pile/test
```

#### VN-Edge-Grasp Network

```sh
# Evaluate in the packed scene
python scripts/test_edge.py --method edge-vn --scene packed --object-set packed/test

# Evaluate in the pile scene
python scripts/test_edge.py --method edge-vn --scene pile --object-set pile/test
```

#### GIGA Network

```sh
# Evaluate in the packed scene
python scripts/test_giga.py --scene packed --object-set packed/test

# Evaluate in the pile scene
python scripts/test_giga.py --scene pile --object-set pile/test
```

More usage examples will be added soon.

(back to top)

## License

Distributed under the BSD-2 License. See LICENSE.txt for more information.

(back to top)

## Citing

If you use this code in your research, please cite the following paper:

```bibtex
@article{zurbrugg2024icgnet,
  title={ICGNet: A Unified Approach for Instance-Centric Grasping},
  author={Zurbr{\"u}gg, Ren{\'e} and Liu, Yifan and Engelmann, Francis and Kumar, Suryansh and Hutter, Marco and Patil, Vaishakh and Yu, Fisher},
  journal={arXiv preprint arXiv:2401.09939},
  year={2024}
}
```

Also consider citing the original work by Breyer et al., which introduced the benchmark environment:

```bibtex
@inproceedings{breyer2020volumetric,
  title={Volumetric Grasping Network: Real-time 6 DOF Grasp Detection in Clutter},
  author={Breyer, Michel and Chung, Jen Jen and Ott, Lionel and Siegwart, Roland and Nieto, Juan},
  booktitle={Conference on Robot Learning},
  year={2020},
}
```