At AIRO we value open-source hardware, software and datasets. This page offers a selection of the tools and datasets we have released.


Folding dataset

In our unique cloth folding dataset, we provide 8.5 hours of demonstrations from multiple perspectives, amounting to 1,000 folding samples of different types of textiles. The demonstrations were recorded in multiple public places, under varying conditions, and with a diverse set of people. Our dataset consists of anonymized RGB images, depth frames, skeleton keypoint trajectories, and object labels. The dataset is available on this GitHub repository.

  1. Video dataset of human demonstrations of folding clothing for robotic folding
    Verleysen, Andreas, Biondina, Matthijs, and wyffels, Francis
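To show how such a sample might be consumed, here is a minimal loader sketch. The directory layout and file names (`rgb/`, `depth/`, `keypoints.json`, `label.txt`) are illustrative assumptions, not the repository's actual structure; consult the GitHub repository for the real layout.

```python
import json
from pathlib import Path

def load_folding_sample(sample_dir):
    """Load one folding demonstration from a sample directory.

    Assumes a hypothetical layout: per-frame RGB and depth images, a JSON
    file of skeleton keypoint trajectories, and a textile label file.
    """
    sample_dir = Path(sample_dir)
    rgb_frames = sorted(sample_dir.glob("rgb/*.png"))      # RGB frame paths
    depth_frames = sorted(sample_dir.glob("depth/*.png"))  # depth frame paths
    keypoints = json.loads((sample_dir / "keypoints.json").read_text())
    label = (sample_dir / "label.txt").read_text().strip()
    return {"rgb": rgb_frames, "depth": depth_frames,
            "keypoints": keypoints, "label": label}
```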

Hyperspectral camera dataset

This dataset contains close-range hyperspectral camera recordings of strawberry at high temporal resolution, together with eco-physiological data of one leaf. It is fully available on Zenodo.

  1. Limitations of snapshot hyperspectral cameras to monitor plant response dynamics in stress-free conditions
    Pieters, Olivier, De Swaef, Tom, Lootens, Peter, Stock, Michiel, Roldán-Ruiz, Isabel, and wyffels, Francis

DIMO: Dataset of Industrial Metal Objects

The DIMO dataset is a diverse set of industrial metal objects. These objects are symmetric, textureless and highly reflective, leading to challenging conditions not captured in existing datasets. Our 6D object pose estimation dataset contains both real-world and synthetic images. Real-world data is obtained by recording multi-view images of scenes with varying object shapes, materials, carriers, compositions and lighting conditions. Our dataset contains 31,200 images of 600 real-world scenes and 553,800 images of 42,600 synthetic scenes, stored in a unified format. The close correspondence between synthetic and real-world data, and the controlled variations, will facilitate sim-to-real research. The dataset's size and challenging nature will support research on various computer vision tasks involving reflective materials. The full dataset and labeling tool are available on this GitHub repository.
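Because the objects are symmetric, a naive rotation error overpenalizes poses that are visually indistinguishable from the ground truth. A minimal sketch of a symmetry-aware rotation error, assuming the object's symmetries are given as a discrete list of 3x3 rotation matrices (how DIMO actually encodes symmetry annotations may differ):

```python
import numpy as np

def rot_angle(R):
    """Geodesic angle (radians) of a rotation matrix."""
    return np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))

def symmetry_aware_rotation_error(R_est, R_gt, symmetries):
    """Smallest rotation error over an object's discrete symmetry group.

    `symmetries` is a list of 3x3 rotations mapping the object onto
    itself; the identity should be included so an exact match scores 0.
    """
    return min(rot_angle(R_est.T @ (R_gt @ S)) for S in symmetries)
```

For an object with a 180-degree symmetry about its z-axis, an estimate flipped by exactly that rotation scores zero error instead of pi.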



PyTorch-Hebbian

PyTorch-Hebbian is a lightweight framework for Hebbian learning based on PyTorch Ignite. It can be found on GitHub and was presented at the NeurIPS 2020 Beyond Backpropagation workshop.

  1. PyTorch-Hebbian: facilitating local learning in a deep learning framework
    Talloen, Jules, Vandesompele, Alexander, and Dambre, Joni
    In NeurIPS 2020 Beyond backpropagation Workshop 2020
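Hebbian rules update weights from local pre- and postsynaptic activity alone, with no backpropagated gradients. A minimal NumPy sketch of one such rule (Oja's rule, a normalised Hebbian update); this illustrates the principle only and is not the PyTorch-Hebbian API:

```python
import numpy as np

def oja_update(w, x, lr=0.01):
    """One step of Oja's rule: dw = lr * y * (x - y * w), where
    y = w.x is the postsynaptic activity. The -y*w term keeps the
    weight norm bounded; the update is purely local."""
    y = w @ x
    return w + lr * y * (x - y * w)
```

Trained on zero-mean data, the weight vector converges to the unit-norm principal direction of the inputs, which is why Oja's rule is often used as a local analogue of PCA.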



Gloxinia

Gloxinia is a new modular data logging platform, optimised for real-time physiological measurements. The system is composed of modular sensor boards that are interconnected according to the application's needs, with the potential to scale to hundreds of sensors in a distributed sensor system. Two types of sensor boards were designed: Sylvatica, for single-ended measurements, and Planalta, for lock-in amplifier based measurements. Both hardware and software have been open-sourced on GitHub.

  1. Gloxinia—an open-source sensing platform to monitor the dynamic responses of plants
    Pieters, Olivier, De Swaef, Tom, Lootens, Peter, Stock, Michiel, Roldán-Ruiz, Isabel, and wyffels, Francis
    SENSORS 2020
Gloxinia Dicio board PCB image.
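The lock-in amplifier principle behind the Planalta board can be sketched in a few lines: demodulate the measured signal with in-phase and quadrature references at the excitation frequency, then low-pass filter (here simply a mean). This NumPy sketch illustrates the technique only and is not the board's firmware:

```python
import numpy as np

def lockin_amplitude(signal, fs, f_ref):
    """Recover the amplitude of a sinusoid at f_ref buried in noise.

    Multiplying by cos/sin references shifts the component at f_ref to
    DC, where averaging rejects the broadband noise."""
    t = np.arange(len(signal)) / fs
    x = np.mean(signal * np.cos(2 * np.pi * f_ref * t))  # in-phase
    y = np.mean(signal * np.sin(2 * np.pi * f_ref * t))  # quadrature
    return 2.0 * np.hypot(x, y)
```

Averaging over an integer number of reference cycles makes the estimate insensitive to the signal's phase.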


MIRRA

MIRRA is a modular and cost-effective microclimate monitoring system for real-time remote applications that require low data rates. Its modularity enables the use of different sensors (e.g., air and soil temperature, soil moisture and radiation) depending on the application, and its innovative node system is highly suitable for remote locations. The entire system is open source and can be constructed and programmed from the design and code files hosted on GitHub.

  1. MIRRA: a modular and cost-effective microclimate monitoring system for real-time remote applications
    Pieters, Olivier, Deprost, Emiel, Van Der Donckt, Jonas, Brosens, Lore, Sanczuk, Pieter, Vangansbeke, Pieter, De Swaef, Tom, De Frenne, Pieter, and wyffels, Francis
    SENSORS 2021
Image of the MIRRA sensor system in a forest.
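On low-data-rate radio links, sensor readings are typically packed into compact binary payloads before transmission. A hypothetical sketch of such packing; the field layout and scaling are illustrative assumptions, not MIRRA's actual wire format:

```python
import struct

def pack_readings(node_id, air_temp_c, soil_temp_c, soil_moisture_pct):
    """Pack one measurement round into a compact 6-byte payload.

    Assumed layout: node id (uint8), temperatures as signed
    centi-degrees (int16, little-endian), moisture in 0.5 % steps
    (uint8). Integer scaling avoids sending 4-byte floats."""
    return struct.pack("<BhhB",
                       node_id,
                       round(air_temp_c * 100),
                       round(soil_temp_c * 100),
                       round(soil_moisture_pct * 2))
```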

Smart Textile

The Smart Textile is a wireless, integrated system for the instrumentation of cloths. Its sensor matrix detects local pressure variations, mapping them to 8-bit values that can be read out over Bluetooth Low Energy. This data can be used to distinguish different states (e.g., folds) of the Smart Textile, and thus of any larger cloth it is sewn onto, providing a new modality for state observation in textile manipulation tasks. With an output rate of 50+ Hz and a battery life of 15+ hours, the Smart Textile system is well suited for in-vivo cobot learning. Both hardware and software are freely available on GitHub.

  1. Modular piezoresistive smart textile for state estimation of cloths
    Proesmans, Remko, Verleysen, Andreas, Vleugels, Robbe, Veske, Paula, De Gusseme, Victor-Louis, and wyffels, Francis
    SENSORS 2022
Robot folding a piece of smart cloth with sensor readout visualisation.
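On the receiving side, each BLE notification carrying the 8-bit pressure values can be decoded into a matrix. This sketch assumes a hypothetical payload of rows x cols raw bytes in row-major order; check the GitHub repository for the real characteristic layout:

```python
def decode_pressure_frame(payload, rows, cols):
    """Decode one BLE notification into a row-major pressure matrix.

    Each byte is one taxel's 8-bit pressure value (assumed layout)."""
    if len(payload) != rows * cols:
        raise ValueError("unexpected payload length")
    return [list(payload[r * cols:(r + 1) * cols]) for r in range(rows)]
```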