At AIRO we value open-source hardware, software and datasets. This page offers a selection of the tools and datasets we have released.
Datasets
Folding dataset
Our cloth folding dataset provides 8.5 hours of demonstrations from multiple perspectives, leading to 1,000 folding samples of different types of textiles. The demonstrations were recorded in multiple public places, in different conditions, with a diverse set of people. The dataset consists of anonymized RGB images, depth frames, skeleton keypoint trajectories, and object labels, and is available on this GitHub repository.
Video dataset of human demonstrations of folding clothing for robotic folding
General-purpose clothes-folding robots do not yet exist owing to the deformable nature of textiles, making it hard to engineer manipulation pipelines or learn this task. In order to accelerate research for the learning of the robotic clothes folding task, we introduce a video dataset of human folding demonstrations. In total, we provide 8.5 hours of demonstrations from multiple perspectives leading to 1,000 folding samples of different types of textiles. The demonstrations are recorded in multiple public places, in different conditions with a diverse set of people. Our dataset consists of anonymized RGB images, depth frames, skeleton keypoint trajectories, and object labels. In this article, we describe our recording setup, the data format, and utility scripts, which can be accessed at https://adverley.github.io/folding-demonstrations.
Hyperspectral camera dataset
This is a close-range hyperspectral camera dataset of strawberry with high temporal resolution, paired with eco-physiological data of one leaf. It is fully available on Zenodo.
Limitations of snapshot hyperspectral cameras to monitor plant response dynamics in stress-free conditions
Pieters, Olivier,
De Swaef, Tom,
Lootens, Peter,
Stock, Michiel,
Roldán-Ruiz, Isabel,
and wyffels, Francis
Plants’ dynamic eco-physiological responses are vital to their productivity in continuously fluctuating conditions, such as those in agricultural fields. However, it is currently still very difficult to capture these responses at the field scale for phenotyping purposes. Advanced hyperspectral imaging tools are increasingly used in phenotyping, and have been applied to detect changes in plants in response to a specific treatment or phenological state, or to monitor their growth and development. Phenotyping has to evolve towards capturing dynamic behaviour under more subtle fluctuations in environmental conditions, without the presence of clear treatments or stresses. Therefore, we investigated the potential of hyperspectral imaging to capture dynamic behaviour of plants in stress-free conditions at a temporal resolution of seconds. Two growth chamber experiments were set up, in which strawberry plants and four different background materials, serving as controls, were monitored by a snapshot hyperspectral camera in variable conditions of light, temperature and relative humidity. The sampling period was set to three seconds, triggering image acquisition and gas exchange measurements. Different background materials were used to assess the influence of the environment and the camera in both experiments. To separate the plant and background data, static masks were determined. Two datasets were created, which encompass both experiments. One dataset was constructed after averaging over the entire mask to acquire one value per spectral band. These values were then used to calculate a set of vegetation indices. The other dataset used spatial subsampling to retain spatial information. From both datasets, linear models were constructed using ridge regression, which estimated the measured eco-physiological and environmental data.
Leaf temperature and vapour pressure deficit based on leaf temperature are the two main eco-physiological characteristics that could be predicted successfully. Stomatal conductance, photosynthesis and transpiration rate show less promising results. We suspect that limited variation, and low spectral resolution and range are the main causes of the inability of the models to extract meaningful predictions. Furthermore, the models that were only trained on background data also showed good predictive performance. This is probably because the main drivers for good performing eco-physiological variables are temperature and incident light intensity. Environmental characteristics that have good performance are photosynthetically active radiation and air temperature. Current hyperspectral sensing technologies are not yet able to uncover most plant dynamic eco-physiological responses when plants are cultivated in stress-free conditions.
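The modelling step described above, estimating an eco-physiological variable from band-averaged spectra with ridge regression, can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' pipeline; the number of bands, the noise level, and the target variable are placeholders.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for band-averaged reflectance: 500 samples, 16 spectral bands.
X = rng.normal(size=(500, 16))

# Synthetic "leaf temperature" driven linearly by a few bands plus noise,
# mimicking the kind of relationship the linear models exploit.
true_w = np.zeros(16)
true_w[[2, 7, 11]] = [1.5, -0.8, 0.6]
y = X @ true_w + 20.0 + 0.1 * rng.normal(size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = Ridge(alpha=1.0).fit(X_train, y_train)
r2 = model.score(X_test, y_test)
print(f"held-out R^2: {r2:.3f}")
```

The ridge penalty (`alpha`) stabilises the fit when spectral bands are highly correlated, which is typically the case for hyperspectral data.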
DIMO: Dataset of Industrial Metal Objects
The DIMO dataset is a diverse set of industrial metal objects. These objects are symmetric, textureless and highly reflective, leading to challenging conditions not captured in existing datasets. Our 6D object pose estimation dataset contains both real-world and synthetic images. Real-world data is obtained by recording multi-view images of scenes with varying object shapes, materials, carriers, compositions and lighting conditions. Our dataset contains 31,200 images of 600 real-world scenes and 553,800 images of 42,600 synthetic scenes, stored in a unified format. The close correspondence between synthetic and real-world data, and controlled variations, will facilitate sim-to-real research. Our dataset's size and challenging nature will facilitate research on various computer vision tasks involving reflective materials. The full dataset and labeling tool are available on this GitHub repository.
Sim-to-real dataset of industrial metal objects
De Roovere, Peter,
Moonen, Steven,
Michiels, Nick,
and wyffels, Francis
We present a diverse dataset of industrial metal objects with unique characteristics such as symmetry, texturelessness, and high reflectiveness. These features introduce challenging conditions that are not captured in existing datasets. Our dataset comprises both real-world and synthetic multi-view RGB images with 6D object pose labels. Real-world data were obtained by recording multi-view images of scenes with varying object shapes, materials, carriers, compositions, and lighting conditions. This resulted in over 30,000 real-world images. We introduce a new public tool that enables the quick annotation of 6D object pose labels in multi-view images. This tool was used to provide 6D object pose labels for all real-world images. Synthetic data were generated by carefully simulating real-world conditions and varying them in a controlled and realistic way. This resulted in over 500,000 synthetic images. The close correspondence between synthetic and real-world data and controlled variations will facilitate sim-to-real research. Our focus on industrial conditions and objects will facilitate research on computer vision tasks, such as 6D object pose estimation, which are relevant for many industrial applications, such as machine tending. The dataset and accompanying resources are available on the project website.
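A 6D object pose label, as used in datasets like this one, combines a 3D rotation and a 3D translation. The snippet below is a generic illustration of assembling such a pose and applying it to model points; it does not reflect DIMO's actual file format.

```python
import numpy as np

def pose_matrix(R, t):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation matrix
    and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def transform_points(T, points):
    """Map Nx3 model-frame points into the camera frame using pose T."""
    homo = np.hstack([points, np.ones((points.shape[0], 1))])
    return (homo @ T.T)[:, :3]

# Example pose: 90-degree rotation about z, translated 0.5 m along x.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
t = np.array([0.5, 0.0, 0.0])
T = pose_matrix(R, t)

corner = np.array([[0.1, 0.0, 0.0]])
print(transform_points(T, corner))  # the point lands at (0.5, 0.1, 0.0)
```

For symmetric objects such as the cylinders and rings in this dataset, several distinct rotations map the model onto itself, which is why symmetry-aware evaluation matters for 6D pose estimation.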
Almost-Ready-To-Fold (aRTF) Clothes Dataset
We have created a dataset of almost-flattened clothes in which we mimic the output of existing robotic unfolding systems by applying small deformations to the clothes. The dataset contains a total of 1,896 images of 100 different garments in 14 different real-world scenes. We refer to this dataset as the aRTF Clothes dataset, for almost-ready-to-fold clothes. There have already been some efforts to collect datasets for robotic cloth manipulation, but to the best of our knowledge, this dataset is the first to provide annotated real-world images of almost-flattened cloth items in household settings. The dataset is available on this GitHub repository.
Learning keypoints for robotic cloth manipulation using synthetic data
Assistive robots should be able to wash, fold or iron clothes. However, due to the variety, deformability and self-occlusions of clothes, creating robot systems for cloth manipulation is challenging. Synthetic data is a promising direction to improve generalization, but the sim-to-real gap limits its effectiveness. To advance the use of synthetic data for cloth manipulation tasks such as robotic folding, we present a synthetic data pipeline to train keypoint detectors for almost-flattened cloth items. To evaluate its performance, we have also collected a real-world dataset. We train detectors for both T-shirts, towels and shorts and obtain an average precision of 64% and an average keypoint distance of 18 pixels. Fine-tuning on real-world data improves performance to 74% mAP and an average distance of only 9 pixels. Furthermore, we describe failure modes of the keypoint detectors and compare different approaches to obtain cloth meshes and materials. We also quantify the remaining sim-to-real gap and argue that further improvements to the fidelity of cloth assets will be required to further reduce this gap. The code, dataset and trained models are available here.
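The average keypoint distance reported above is a straightforward pixel-space metric. A minimal sketch of how such a number can be computed is shown below; it assumes predicted and ground-truth keypoints are already matched one-to-one, which is a simplification of a real evaluation pipeline.

```python
import numpy as np

def mean_keypoint_distance(pred, gt):
    """Mean Euclidean pixel distance between matched predicted and
    ground-truth keypoints, both given as arrays of shape (N, 2)."""
    return float(np.linalg.norm(pred - gt, axis=1).mean())

# Two keypoints: one prediction off by a (3, 4) pixel offset, one exact.
pred = np.array([[100.0, 50.0], [200.0, 80.0]])
gt = np.array([[103.0, 54.0], [200.0, 80.0]])
print(mean_keypoint_distance(pred, gt))  # -> 2.5
```

Average precision, the other metric quoted, additionally requires a distance threshold to decide whether a detected keypoint counts as a true positive.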
Cloth Manipulation Competition Dataset and Code
Regrasping garments in the air is a popular cloth unfolding strategy. To get started, it’s common to grasp the lowest point. However, better grasp points are needed to unfold the garment completely, such as the shoulders of a shirt or the waist of shorts. The objective of the ICRA 2024 cloth manipulation competition was to localise good grasp poses in colour and depth images of garments held in the air by one gripper. We evaluated the predicted grasps by executing them on a dual UR5e setup live at ICRA. The cloth items to be grasped were towels, shirts and shorts. The dataset, code and details are available on this page.
The ICRA 2024 cloth competition : benchmarking robotic cloth unfolding
De Gusseme, Victor-Louis,
Proesmans, Remko,
and wyffels, Francis
In 40th Anniversary of the IEEE International Conference on Robotics and Automation (ICRA@40), Abstracts
2024
Unfolding cloth in the air is a fundamental task in robotic cloth manipulation, essential for various subsequent tasks like folding, hanging, or even assisted dressing. It has been extensively studied, yet, until now a standardized performance comparison across many different labs and methods has never been done. The ICRA 2024 Cloth Competition marks a significant milestone by offering a head-to-head evaluation of 11 diverse teams on a shared real-world robotic platform. This unprecedented event not only sets a new benchmark for the field but also fosters collaboration and innovation, while providing a comprehensive dataset to drive future research towards more robust and generalizable solutions in robotic cloth manipulation.
Glasses-in-the-Wild Dataset
The Glasses-in-the-Wild dataset is a collection of 1,000 RGB images of transparent and partially filled glasses, captured in diverse real-world environments. It was crowdsourced from 11 participants and includes 93 unique glass types across 60 different scenes with varying backgrounds, lighting conditions, reflections, occlusions, and distractors.
Each image is annotated with bounding boxes and semantically meaningful keypoints, including the rim, base, and liquid level, to facilitate the training of models for transparent object detection and liquid level estimation. The dataset contains a broad distribution of liquid levels: 24.3% of glasses are empty, while 75.7% contain liquid, with an average fill of 48%.
This dataset complements existing transparent object datasets by providing a wider variety of glass shapes, sizes, colors, and real-world conditions, supporting robust training for robotic perception systems and other computer vision applications involving transparent containers.
The dataset is available on Zenodo.
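With rim, base, and liquid-level keypoints annotated, a rough fill estimate can be derived from their vertical image positions. The snippet below is a simplified illustration that assumes an upright glass and ignores perspective and glass shape; it is not the dataset's official annotation or evaluation code.

```python
def fill_fraction(rim_y, base_y, liquid_y):
    """Estimate the fill level of an upright glass from the pixel
    y-coordinates (image origin at the top) of the rim, the base,
    and the liquid surface."""
    height = base_y - rim_y
    if height <= 0:
        raise ValueError("rim must be above base in image coordinates")
    fraction = (base_y - liquid_y) / height
    return min(max(fraction, 0.0), 1.0)  # clamp to [0, 1]

# Example: rim at y=120, base at y=320, liquid surface at y=224.
print(fill_fraction(120, 320, 224))  # -> 0.48
```

A real fill-level estimator would also need to account for tilted glasses, conical or curved glass profiles, and refraction at the liquid surface.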
Software
AIRO-MONO Repo
The AIRO-MONO Repo provides ready-to-use Python packages to accelerate the development of robotic manipulation systems. The goals of this repo are to:
facilitate our research and development in robotic perception and control;
facilitate the creation of demos/applications such as the folding competitions, by providing either wrappers to existing libraries or implementations of common functionalities and operations.
Pytorch-Hebbian
A lightweight framework for Hebbian learning based on PyTorch Ignite. It can be found on GitHub and was presented at the Beyond Backpropagation NeurIPS 2020 workshop.
PyTorch-Hebbian : facilitating local learning in a deep learning framework
Talloen, Jules,
Vandesompele, Alexander,
and Dambre, Joni
In NeurIPS 2020 Beyond backpropagation Workshop
2020
Gloxinia
Gloxinia is a new modular data logging platform, optimised for real-time physiological measurements. The system is composed of modular sensor boards that are interconnected depending upon the needs, with the potential to scale to hundreds of sensors in a distributed sensor system. Two different types of sensor boards were designed: one for single-ended measurements and one for lock-in amplifier based measurements, named Sylvatica and Planalta respectively. Both hardware and software have been open-sourced on GitHub.
Gloxinia—an open-source sensing platform to monitor the dynamic responses of plants
Pieters, Olivier,
De Swaef, Tom,
Lootens, Peter,
Stock, Michiel,
Roldán-Ruiz, Isabel,
and wyffels, Francis
The study of the dynamic responses of plants to short-term environmental changes is becoming increasingly important in basic plant science, phenotyping, breeding, crop management, and modelling. These short-term variations are crucial in plant adaptation to new environments and, consequently, in plant fitness and productivity. Scalable, versatile, accurate, and low-cost data-logging solutions are necessary to advance these fields and complement existing sensing platforms such as high-throughput phenotyping. However, current data logging and sensing platforms do not meet the requirements to monitor these responses. Therefore, a new modular data logging platform was designed, named Gloxinia. Different sensor boards are interconnected depending upon the needs, with the potential to scale to hundreds of sensors in a distributed sensor system. To demonstrate the architecture, two sensor boards were designed—one for single-ended measurements and one for lock-in amplifier based measurements, named Sylvatica and Planalta, respectively. To evaluate the performance of the system in small setups, a small-scale trial was conducted in a growth chamber. Expected plant dynamics were successfully captured, indicating proper operation of the system. Though a large scale trial was not performed, we expect the system to scale very well to larger setups. Additionally, the platform is open-source, enabling other users to easily build upon our work and perform application-specific optimisations.
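The lock-in amplifier approach used by the Planalta board recovers small signals at a known modulation frequency from a noisy background. A minimal software illustration of the principle is shown below; this is digital lock-in demodulation on a synthetic signal, not the board's firmware, and all frequencies and amplitudes are arbitrary.

```python
import numpy as np

fs = 10_000.0     # sampling rate (Hz)
f_ref = 1_000.0   # reference (modulation) frequency (Hz)
t = np.arange(0, 10.0, 1 / fs)

# Weak 0.05-amplitude signal at the reference frequency, buried in noise
# that is 20x stronger.
rng = np.random.default_rng(1)
signal = 0.05 * np.sin(2 * np.pi * f_ref * t) + rng.normal(scale=1.0, size=t.size)

# Lock-in demodulation: mix with in-phase and quadrature references,
# then low-pass filter (here simply averaging over the whole record).
i_comp = np.mean(signal * np.sin(2 * np.pi * f_ref * t))
q_comp = np.mean(signal * np.cos(2 * np.pi * f_ref * t))
recovered = 2 * np.hypot(i_comp, q_comp)
print(f"recovered amplitude: {recovered:.3f}")  # close to the true 0.05
```

Because noise away from the reference frequency averages out in the mixing step, the technique can measure signals far below the broadband noise floor, which is what makes it attractive for sensitive plant electrophysiology measurements.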
MIRRA
MIRRA is a modular and cost-effective microclimate monitoring system for real-time remote applications that require low data rates. MIRRA is modular, enabling the use of different sensors (e.g., air and soil temperature, soil moisture and radiation) depending upon the application, and uses an innovative node system highly suitable for remote locations. The entire system is open source and can be constructed and programmed from the design and code files, hosted on GitHub.
MIRRA : a modular and cost-effective microclimate monitoring system for real-time remote applications
Pieters, Olivier,
Deprost, Emiel,
Van Der Donckt, Jonas,
Brosens, Lore,
Sanczuk, Pieter,
Vangansbeke, Pieter,
De Swaef, Tom,
De Frenne, Pieter,
and wyffels, Francis
Monitoring climate change, and its impacts on ecological, agricultural, and other societal systems, is often based on temperature data derived from official weather stations. Yet, these data do not capture most microclimates, influenced by soil, vegetation and topography, operating at spatial scales relevant to the majority of organisms on Earth. Detecting and attributing climate change impacts with confidence and certainty will only be possible by a better quantification of temperature changes in forests, croplands, mountains, shrublands, and other remote habitats. There is an urgent need for a novel, miniature and simple device filling the gap between low-cost devices with manual data download (no instantaneous data) and high-end, expensive weather stations with real-time data access. Here, we develop an integrative real-time monitoring system for microclimate measurements: MIRRA (Microclimate Instrument for Real-time Remote Applications) to tackle this problem. The goal of this platform is the design of a miniature and simple instrument for near instantaneous, long-term and remote measurements of microclimates. To that end, we optimised power consumption and transfer data using a cellular uplink. MIRRA is modular, enabling the use of different sensors (e.g., air and soil temperature, soil moisture and radiation) depending upon the application, and uses an innovative node system highly suitable for remote locations. Data from separate sensor modules are wirelessly sent to a gateway, thus avoiding the drawbacks of cables. With this sensor technology for the long-term, low-cost, real-time and remote sensing of microclimates, we lay the foundation and open a wide range of possibilities to map microclimates in different ecosystems, feeding a next generation of models. MIRRA is, however, not limited to microclimate monitoring thanks to its modular and wireless design. 
Within limits, it is suitable for any application requiring real-time data logging of power-efficient sensors over long periods of time. We compare the performance of this system to a reference system in real-world conditions in the field, indicating excellent correlation with data collected by established data loggers. This proof-of-concept forms an important foundation to creating the next version of MIRRA, fit for large scale deployment and possible commercialisation. In conclusion, we developed a novel wireless cost-effective sensor system for microclimates.
Smart Textile
The Smart Textile is a wireless, integrated system for instrumentation of cloths. Its sensor matrix detects local pressure variations, mapping them to 8-bit values that can be read using Bluetooth Low Energy. This data can be used to distinguish different states, e.g. folds, of the Smart Textile and thus of any larger cloth it is sewn onto. As such, a new modality for state observation in textile manipulation tasks is obtained. With an output rate of 50+ Hz and a battery life of 15+ hours, the Smart Textile system is aptly suited for in-vivo cobot learning. Both hardware and software are freely available on GitHub.
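Reading the sensor matrix over Bluetooth Low Energy ultimately yields a byte string of 8-bit pressure values per notification. The parsing step might look like the sketch below; the payload layout and the matrix size are assumptions for illustration, not the actual Smart Textile protocol.

```python
import numpy as np

ROWS, COLS = 4, 4  # assumed sensor matrix size for this illustration

def parse_pressure_frame(payload: bytes) -> np.ndarray:
    """Decode one BLE notification payload of unsigned 8-bit pressure
    readings, assumed row-major, into a ROWS x COLS matrix."""
    if len(payload) != ROWS * COLS:
        raise ValueError(f"expected {ROWS * COLS} bytes, got {len(payload)}")
    return np.frombuffer(payload, dtype=np.uint8).reshape(ROWS, COLS)

# Example frame: pressure values increasing from 0 to 15, row by row.
frame = parse_pressure_frame(bytes(range(16)))
print(frame[3, 3])  # -> 15
```

In a live setup, a BLE client would subscribe to the textile's notification characteristic and feed each incoming payload through a parser like this before classifying the cloth state.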
Modular piezoresistive smart textile for state estimation of cloths
Proesmans, Remko,
Verleysen, Andreas,
Vleugels, Robbe,
Veske-Lepp, Paula,
De Gusseme, Victor-Louis,
and wyffels, Francis
Smart textiles have found numerous applications ranging from health monitoring to smart homes. Their main allure is their flexibility, which allows for seamless integration of sensing in everyday objects like clothing. The application domain also includes robotics; smart textiles have been used to improve human-robot interaction, to solve the problem of state estimation of soft robots, and for state estimation to enable learning of robotic manipulation of textiles. The latter application provides an alternative to computationally expensive vision-based pipelines and we believe it is the key to accelerate robotic learning of textile manipulation. Current smart textiles, however, maintain wired connections to external units, which impedes robotic manipulation, and lack modularity to facilitate state estimation of large cloths. In this work, we propose an open-source, fully wireless, highly flexible, light, and modular version of a piezoresistive smart textile. Its output stability was experimentally quantified and determined to be sufficient for classification tasks. Its functionality as a state sensor for larger cloths was also verified in a classification task where two of the smart textiles were sewn onto a piece of clothing of which three states are defined. The modular smart textile system was able to recognize these states with average per-class F1-scores ranging from 85.7 to 94.6% with a basic linear classifier.
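Per-class F1-scores with a basic linear classifier, as used in the evaluation above, can be computed along these lines. The snippet uses synthetic features and scikit-learn as a generic sketch, not the paper's exact setup; the number of states and taxels are placeholders.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for textile sensor readouts: 3 cloth states,
# each with a different mean activation pattern over 32 taxels.
n_per_class, n_features = 200, 32
centers = rng.normal(scale=2.0, size=(3, n_features))
X = np.vstack([c + rng.normal(size=(n_per_class, n_features)) for c in centers])
y = np.repeat([0, 1, 2], n_per_class)

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
per_class_f1 = f1_score(y_test, clf.predict(X_test), average=None)
print(per_class_f1)  # one F1-score per cloth state
```

Reporting F1 per class, rather than overall accuracy, reveals whether particular cloth states are systematically confused with one another.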
Halberd
The Halberd coupling is a replacement of the Robotiq I/O Coupling, enhanced with a Nina B301 microcontroller. It was created for seamless integration of tactile sensors on Robotiq grippers to enhance the manipulation capabilities of our cobots: no more external power or data cables! Halberd features extensive GPIO, several communication interfaces, and safety circuitry similar to that of the Robotiq I/O Coupling itself. It is Arduino compatible and programmable over a micro USB interface. Both hardware and software are freely available on GitHub.
Seamless integration of tactile sensors for cobots
The development of tactile sensing is expected to enhance robotic systems in handling complex objects like deformables or reflective materials. However, readily available industrial grippers generally lack tactile feedback, which has led researchers to develop their own tactile sensors, resulting in a wide range of sensor hardware. Reading data from these sensors poses an integration challenge: either external wires must be routed along the robotic arm, or a wireless processing unit has to be fixed to the robot, increasing its size. We have developed a microcontroller-based sensor readout solution that seamlessly integrates with Robotiq grippers. Our Arduino compatible design takes away a major part of the integration complexity of tactile sensors and can serve as a valuable accelerator of research in the field. Design files and installation instructions can be found at https://github.com/RemkoPr/airo-halberd.