I am a professor at Ghent University. My main field since 2008 has been machine learning, with a focus on deep learning and neural networks, though I am particularly interested in cross-disciplinary, innovative research. Before switching to machine learning, I spent quite a long time in the field of digital electronics, studying the interconnection complexity of such systems, as well as in the rather specific field of (physical) reservoir computing, which aims to maximally exploit the natural dynamics of untrained recurrent neural networks or physical dynamical systems for computation. My research in that area was mostly applied to photonic systems and robotics. Through my recent involvement in the Human Brain Project, I became acquainted with the field of computational neuroscience, and in particular with models of computation and learning in the brain.
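To make the reservoir computing idea concrete, here is a minimal sketch of an echo state network: a recurrent network whose input and recurrent weights stay fixed and random, so that only a linear readout is trained. All names, sizes, and parameter values below are illustrative choices, not a reference implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: one-step-ahead prediction of a sine wave.
T = 500
u = np.sin(0.2 * np.arange(T + 1))
inputs, targets = u[:-1], u[1:]

# Fixed random reservoir: these weights are never trained.
n_res = 100
W_in = rng.uniform(-0.5, 0.5, size=n_res)        # input weights (untrained)
W = rng.normal(0.0, 1.0, size=(n_res, n_res))    # recurrent weights (untrained)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

# Drive the reservoir with the input and record its states.
states = np.zeros((T, n_res))
x = np.zeros(n_res)
for t in range(T):
    x = np.tanh(W @ x + W_in * inputs[t])
    states[t] = x

# Train only the linear readout, here with ridge regression.
washout = 50                      # discard the initial transient
S, y = states[washout:], targets[washout:]
ridge = 1e-6
W_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_res), S.T @ y)

pred = S @ W_out
mse = np.mean((pred - y) ** 2)
print(f"readout MSE: {mse:.2e}")
```

The point of the paradigm is visible in the structure: all the rich nonlinear dynamics live in the fixed reservoir (which in physical reservoir computing is replaced by, e.g., a photonic or mechanical system), and learning reduces to a cheap linear regression on observed states.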
In my current research, I combine experience from these different fields, while also exploring new ones. I am most interested in using brain- and biologically inspired computing to improve the efficiency of neural networks and artificial learning systems. Two types of efficiency are targeted: data efficiency during learning and power efficiency during inference. In simple terms: I want to find ways to solve machine learning tasks using smaller or more power-efficient neural networks, trained with less labeled data. Our brain is more efficient in both respects, and I am convinced there is much more we can learn from it and transfer to artificial neural systems.
In practice, my current research is organised around several sub-topics and sub-goals:
- the exploration of resource-efficiency within the current framework of deep learning (using backpropagation), mostly addressing image processing tasks - projects Hyperscales (FWO), Creative (icon) and PhD grant(s)
- the introduction of a sense of ‘embodiment’ in learning to understand sign language, a very human and highly embodied task - project SignON (EC) and PhD grant(s)
- the introduction and evaluation of biologically inspired unsupervised or semi-supervised learning into traditional (deep) neural network architectures - project SmartNets (EC) and PhD grant(s)
- the development of novel computing paradigms that more efficiently exploit physical hardware properties and dynamics for computations on analog sensor data - project PostDigital (EC) and PhD grants