MultiGripperGrasp: A Dataset for Robotic Grasping from Parallel Jaw Grippers to Dexterous Hands

Luis Felipe Casas*, Ninad Khargonkar*, Balakrishnan Prabhakaran, Yu Xiang
The University of Texas at Dallas
IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2024
*equal contribution

Abstract

We introduce MultiGripperGrasp, a large-scale dataset for robotic grasping. Our dataset contains 30.4M grasps from 11 grippers for 345 objects. The grippers range from two-finger parallel-jaw grippers to five-finger dexterous hands, including a human hand. All grasps in the dataset are verified in Isaac Sim and classified as successful or unsuccessful. Additionally, the object fall-off time of each grasp is recorded as a grasp quality measure. Furthermore, the grippers in our dataset are aligned according to the orientation and position of their palms, allowing us to transfer grasps from one gripper to another; this transfer significantly increases the number of successful grasps for each gripper. Our dataset is useful for studying generalized grasp planning and grasp transfer across different grippers.
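
Because every gripper shares a common palm frame, transferring a grasp amounts to preserving the palm's world pose while swapping the gripper-specific base-to-palm offset. The snippet below is a minimal sketch of this idea, assuming each gripper comes with a fixed 4x4 homogeneous base-to-palm transform; the function and argument names are illustrative, not the dataset's actual API.

import numpy as np

def transfer_grasp(T_world_src_base, T_src_base_palm, T_dst_base_palm):
    """Map a grasp pose from a source gripper to a target gripper.

    All arguments are 4x4 homogeneous transforms:
      T_world_src_base : pose of the source gripper's base in the world
      T_src_base_palm  : source gripper's base -> shared palm frame
      T_dst_base_palm  : target gripper's base -> shared palm frame
    The world pose of the palm is held fixed, and the target gripper's
    base pose is recovered from it.
    """
    T_world_palm = T_world_src_base @ T_src_base_palm
    return T_world_palm @ np.linalg.inv(T_dst_base_palm)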

Code

Grasp Ranking: Isaac Sim

Python code for grasp ranking and grasp transfer using Isaac Sim.
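
The ranking itself reduces to simulating each grasp under a disturbance and ordering the grasps by how long the object stays held. Below is a hedged sketch of that bookkeeping, independent of the Isaac Sim API; simulate_grasp is a hypothetical stand-in for the physics rollout.

def rank_grasps(grasps, simulate_grasp, max_time=3.0):
    """Rank candidate grasps by object fall-off time.

    simulate_grasp is a placeholder: it should close the gripper on
    the object in simulation, apply a disturbance (e.g. gravity after
    lifting), and return the time in seconds until the object falls,
    capped at max_time.
    """
    scored = [(simulate_grasp(g), g) for g in grasps]
    # A grasp counts as successful if it holds for the full horizon.
    successful = [g for t, g in scored if t >= max_time]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return scored, successful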

Grasp Candidates: GraspIt!-based generation

Python code for candidate grasp generation using the GraspIt! Python interface.
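
For orientation, candidate generation through a GraspIt! Python interface typically follows the pattern below. This sketch assumes the ROS graspit_commander package, and the robot and object names are placeholders rather than the exact assets used in this repository.

from graspit_commander import GraspitCommander

# Load a gripper and an object into the GraspIt! world
# (both names are placeholders for the repository's actual assets).
GraspitCommander.clearWorld()
GraspitCommander.importRobot('fetch_gripper')
GraspitCommander.importGraspableBody('mug')

# Run the simulated-annealing planner and inspect the returned grasps.
result = GraspitCommander.planGrasps(max_steps=70000)
for grasp in result.grasps:
    print(grasp.pose, grasp.epsilon_quality)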

MultiGripperGrasp Dataset

Dataset: Grasp Files + Object 3D Models + USD

Box.com link (no login required) for the overall dataset folder. It includes the 3D object models and a grasp file for each (object, gripper) pair, including the negative grasps. Each pair's grasps are saved in a .json file with the grasp poses, fall-off times, and other details. See the README in the folder for more information.
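
Reading a grasp file is a standard JSON load. The sketch below filters one file's grasps by fall-off time; the file name and field names (grasps, pose, fall_time) are assumptions standing in for the actual schema, which is documented in the folder's README.

import json

# File and field names are illustrative; see the dataset README
# for the actual schema of each (object, gripper) grasp file.
with open('franka_panda-mug.json') as f:
    data = json.load(f)

# Keep only grasps that held the object for at least 2 seconds.
good = [g for g in data['grasps'] if g['fall_time'] >= 2.0]
print(f"{len(good)} / {len(data['grasps'])} grasps pass the threshold")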

Citation (BibTeX)

Please cite MultiGripperGrasp if it helps your research:
@inproceedings{casas2024multigrippergrasp,
  title={MultiGripperGrasp: A Dataset for Robotic Grasping from Parallel Jaw Grippers to Dexterous Hands},
  author={Casas, Luis Felipe and Khargonkar, Ninad and Prabhakaran, Balakrishnan and Xiang, Yu},
  booktitle={IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  year={2024}
}

Contact

Send any comments or questions to Luis Felipe Casas: Luis.CasasMurillo@UTDallas.edu
or Ninad Khargonkar: ninadarun.khargonkar@utdallas.edu

Acknowledgements

This work was supported by the Sony Research Award Program.