ICRA 2024 - Cloth Competition

This year’s Cloth Track of the 9th Robotic Grasping and Manipulation Competition (RGMC) focuses on grasp point localisation on hanging cloth items. Teams can build their perception system in the months leading up to the competition, but performance will be evaluated live on a dual-arm robot setup on May 13-15 at ICRA 2024 in Yokohama, Japan! You provide the grasp, and we handle the execution on our robots.

We’re thrilled to have 16 dedicated teams registered and actively preparing for the cloth manipulation challenge!

Left: Dual UR5e setup we will provide at ICRA 2024. Right: Grasp point localisation on a hanging shirt.

The Challenge

Regrasping garments in the air is a popular cloth unfolding strategy. A common starting point is to grasp the garment’s lowest point, but unfolding it completely requires better grasp points, such as the shoulders of a shirt or the waist of shorts. The objective of the competition is to localise good grasp poses in colour and depth images of garments held in the air by a single gripper. We will evaluate your predicted grasp by executing it live on a dual UR5e setup at ICRA. The cloth items to be grasped will be towels, shirts, and shorts. We will provide an image dataset of representative cloth items for preparation, but the evaluation will also include unseen garments.
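To make the lowest-point baseline concrete, here is a minimal sketch that deprojects a depth image with the pinhole camera model and returns the pixel whose 3D point lies furthest along gravity. The function name, the intrinsics tuple, and the assumption that gravity points along the camera’s +y axis are all illustrative choices, not part of the competition interface.

```python
import numpy as np


def lowest_point_grasp(depth, intrinsics, gravity_dir=np.array([0.0, 1.0, 0.0])):
    """Baseline grasp: the pixel whose 3D point lies lowest along gravity.

    depth: (H, W) array of depth values in metres, 0 where invalid.
    intrinsics: pinhole parameters (fx, fy, cx, cy) - an assumed layout.
    gravity_dir: unit gravity vector in the camera frame (+y assumed here).
    Returns the pixel (u, v) and its 3D point in the camera frame.
    """
    fx, fy, cx, cy = intrinsics
    h, w = depth.shape
    us, vs = np.meshgrid(np.arange(w), np.arange(h))

    # Deproject every pixel with the pinhole model.
    x = (us - cx) / fx * depth
    y = (vs - cy) / fy * depth
    points = np.stack([x, y, depth], axis=-1)

    # Height along gravity; invalid pixels can never be selected.
    heights = points @ gravity_dir
    heights[depth <= 0] = -np.inf

    v, u = np.unravel_index(np.argmax(heights), heights.shape)
    return (u, v), points[v, u]
```

A real entry would of course replace this geometric heuristic with a learned predictor on the colour and depth images, but the sketch shows the coordinate conventions involved in turning a pixel into a 3D grasp point.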

Practicalities

Participants do not have to worry about the initial lifting of the cloth, grasp execution, camera calibration, etc. We will take care of all that with our open-source codebase airo-mono! At the competition, we will send you an image and you simply send us back a grasp in a to-be-specified JSON format. Alongside the dataset, we will also provide a GitHub repository with notebooks and tools to load, visualise, and explore the data. Don’t hesitate to contact us if you have any questions!

Teams

RICloth

  • Yuhan Jin
  • Dimitrios Rakovitis

UOS-Robotics

  • Uihun Sagong
  • JungHyun Choi
  • JeongHyun Park
  • Dongwoo Lee
  • Yeongmin Kim

Ewha Glab

  • Hyojeong Yu
  • Minseo Kwon

Biomedical Sensors & Signals Group

  • Yixin Deng
  • Keru Wang
  • Qiang Wang

Team MIT

  • Neha Sunil
  • Megha Tippur

AI&ROBOT LAB

  • Chongkun Xia
  • Kai Mo
  • Yanzhao Yu
  • Keith Lin
  • Binqiang Ma

Shibata Lab

  • Kakeru Yamasaki
  • Tomohiro Shibata
  • Krati Saxena
  • Takumi Kajiwara
  • Yuki Nakadera

AIS Shinshu

  • Solvi Arnold
  • Kimitoshi Yamazaki
  • Daisuke Tanaka
  • Keisuke Onda
  • Akihisa Ishikawa
  • Yusuke Kuribayashi
  • Naoki Hiratsuka

Team Greater Bay

  • Fang Wan
  • Linhan Yang
  • Haoran Sun
  • Ning Guo
  • Chaoyang Song
  • Jia Pan
  • Lei Yang
  • Zeqing Zhang

Team Ljubljana

  • Domen Tabernik
  • Andrej Gams
  • Peter Nimac
  • Matej Urbas
  • Jon Muhovič
  • Danijel Skočaj
  • Matija Mavsar

SCUT-ROBOT

  • Supeng Diao
  • Yang Cong
  • Yu Ren
  • Ronghan Chen
  • Jiawei Weng
  • Jiayue Liu

Air-jnu

  • Giwan Lee
  • Jiyoung Choi
  • Jeongil Choi
  • Geon Kim
  • Phayuth Yonrith

CASIA-YIT-HAIZHICHEN

  • Xiaodong Zhang
  • Bin Hu
  • Xinjie Liu
  • Tao Cheng
  • Rurui Xue

Samsung Research China – Beijing (SRC-B)

  • Yixiang Jin
  • Dingzhe Li
  • Yong A
  • Jun Shi
  • Yong Yang

Soft Robotics Lab ETHZ

  • Hehui Zheng
  • Barnabas Gavin Cangan

3C1S

  • Carlos Mateo-Agullo
  • Naveed Abbasi
  • Michael Ibrahim
