We introduce a large-scale dataset named MultiGripperGrasp for robotic grasping. Our dataset contains 30.4M grasps from 11 grippers for 345 objects. These grippers range from two-finger to five-finger grippers, including a human hand. All grasps in the dataset are verified in Isaac Sim and classified as successful or unsuccessful. Additionally, the object fall-off time for each grasp is recorded as a grasp quality measure. Furthermore, the grippers in our dataset are aligned according to the orientation and position of their palms, allowing us to transfer grasps from one gripper to another. Grasp transfer significantly increases the number of successful grasps for each gripper in the dataset. Our dataset is useful for studying generalized grasp planning and grasp transfer across different grippers.
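Because the grippers are aligned by palm orientation and position, transferring a grasp amounts to re-expressing the same palm pose for a different gripper. The sketch below illustrates the idea with homogeneous transforms; the function and the per-gripper palm-offset transforms are hypothetical names for illustration, not the dataset's actual API.

```python
import numpy as np

def transfer_grasp(grasp_pose_a, palm_offset_a, palm_offset_b):
    """Transfer a grasp from gripper A to gripper B via a shared palm frame.

    grasp_pose_a:  4x4 pose of gripper A's base in the object frame.
    palm_offset_a: 4x4 transform from gripper A's base frame to its palm frame.
    palm_offset_b: 4x4 transform from gripper B's base frame to its palm frame.
    (Both offsets are assumptions for this sketch; in the dataset the palms
    are pre-aligned, so such offsets are known per gripper.)
    """
    # Express the grasp in the common palm frame shared by all grippers.
    palm_pose = grasp_pose_a @ palm_offset_a
    # Map the palm pose back to gripper B's base frame.
    return palm_pose @ np.linalg.inv(palm_offset_b)
```

With identical palm offsets the transferred pose equals the original, which is a quick sanity check for the frame conventions.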
Python code for grasp ranking and grasp transfer using Isaac Sim.
Python code for candidate grasp generation using the GraspIt! Python interface.
Box.com link (no login required) for the overall dataset folder. It includes the 3D models and the grasp file for each (object, gripper) pair, including the negative grasps. Each pair's grasps are saved in a .json file with the grasp poses, fall-off times, and other details. Check out the README in the folder for more details.
@inproceedings{casas2024multigrippergrasp,
title={MultiGripperGrasp: A Dataset for Robotic Grasping from Parallel Jaw Grippers to Dexterous Hands},
author={Casas, Luis Felipe and Khargonkar, Ninad and Prabhakaran, Balakrishnan and Xiang, Yu},
booktitle={IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
year={2024}
}
Send any comments or questions to Luis Felipe Casas: Luis.CasasMurillo@UTDallas.edu
or Ninad Khargonkar: ninadarun.khargonkar@utdallas.edu
This work was supported by the Sony Research Award Program.