This year’s Cloth Track of the 9th RGMC focuses on grasp point localisation on hanging cloth items. Teams can build their perception system in the months leading up to the competition, but performance will be evaluated on a dual-arm robot setup live May 13-17 at ICRA 2024 in Yokohama, Japan! You provide the grasp location and approach direction, and we handle the execution on our robots. Are you as enthusiastic about robotic laundry as we are? Register for the Cloth Track here and help make it happen!
Regrasping garments in the air is a popular cloth unfolding strategy. To get started, it’s common to grasp the lowest point. However, better grasp points are needed to unfold the garment completely, such as the shoulders of a shirt or the waist of shorts. The objective of the competition is to localise those grasp points in colour and depth images of garments held in the air by one gripper. Besides the grasp location, you will also have to provide an approach direction. We will evaluate your predicted grasp by executing it on a dual UR5e setup live at ICRA. The cloth items that must be grasped will be towels, shirts and shorts. We will provide an image dataset of representative cloth items for preparation, but the evaluation will also include unseen garments. Additional data collection by participants is allowed and even encouraged.
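The lowest-point strategy mentioned above can be sketched in a few lines. This is a minimal illustrative baseline, not part of the competition tooling: it assumes an upright camera (so a larger row index is lower in the scene) and a binary cloth segmentation mask; the function name and interface are hypothetical.

```python
import numpy as np

def lowest_point_pixel(cloth_mask: np.ndarray) -> tuple[int, int]:
    """Return (row, col) of the lowest cloth pixel in an upright camera image.

    With the camera upright, a larger row index is lower in the scene. The
    depth image would then be used to back-project this pixel to a 3D grasp.
    """
    rows, cols = np.nonzero(cloth_mask)
    i = int(np.argmax(rows))  # largest row index = lowest visible point
    return int(rows[i]), int(cols[i])
```

Beating this baseline, by localising semantic points such as shoulders or waistbands instead, is exactly what the competition asks for.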
Participants do not have to worry about the initial lifting of the cloth, grasp execution, camera calibration, etc. We will take care of all that with our open-source codebase airo-mono! At the competition, we will send you an image and you just send us a grasp back in a to-be-specified JSON format. Together with the dataset, we will also provide a GitHub repository with notebooks and tools to load, visualise and explore the data. Don’t hesitate to contact us if you have any questions!
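Since the exact JSON format is still to be specified, the following is purely an illustration of the kind of message exchange described above; every key name is a hypothetical placeholder, not the competition schema.

```python
import json

# Hypothetical grasp reply: a 3D grasp location plus an approach direction.
# Key names, units and reference frame are placeholders, not the real format.
grasp_message = {
    "position": [0.05, -0.12, 0.60],        # grasp point in metres (assumed camera frame)
    "approach_direction": [0.0, 0.0, 1.0],  # unit vector pointing toward the grasp point
}
print(json.dumps(grasp_message))
```

Whatever the final schema looks like, the two pieces of information it must carry are the ones named in the rules: the grasp location and the approach direction.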
- Victor-Louis De Gusseme, PhD candidate (firstname.lastname@example.org)
- Francis wyffels, PhD, professor of Robotics & AI (Francis.wyffels@UGent.be)