Brain-inspired and unconventional computing
Biological systems perform highly complex computations very efficiently using noisy, analogue components whose individual parameters are usually not well controlled. This way of computing is radically different from the way we build computing systems: by carefully engineering digital building blocks and assembling them into large systems under the assumption that the behaviour of each building block is highly accurate and reliable.
In this research track, we try to mimic biology in several ways to improve computational efficiency. In part, our work builds upon the field of reservoir computing, which originated as a brain-inspired approach to harness the power of recurrent neural networks without the difficulties of training them. It treats the RNN as a pool of spatio-temporal kernels that serve as features for simple linear regression. By extension, this kernel view can also be applied to physical nonlinear dynamical systems.
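The idea can be sketched with a minimal echo state network in NumPy: a fixed random recurrent network is driven by the input, its states are treated as features, and only a linear readout is trained by ridge regression. All sizes, the regularisation constant, and the toy prediction task below are illustrative assumptions, not a specific system used in this research track.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions): 1 input, 100 reservoir units, 500 time steps.
n_in, n_res, n_steps = 1, 100, 500

# Fixed random input and recurrent weights; only the readout will be trained.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    x = np.zeros(n_res)
    states = np.empty((len(u), n_res))
    for t, u_t in enumerate(u):
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
        states[t] = x
    return states

# Toy task (an assumption for illustration): predict a sine wave 5 steps ahead.
u = np.sin(np.linspace(0, 20 * np.pi, n_steps))
y = np.roll(u, -5)

X = run_reservoir(u)          # reservoir states as spatio-temporal features
ridge = 1e-6                  # small ridge term for a stable linear readout
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
y_hat = X @ W_out
```

Because the reservoir itself is never trained, the same recipe carries over to physical substrates: the fixed random network is replaced by a photonic, memristive, or mechanical system, and only the linear readout is fitted.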
From this perspective, physical reservoir computing is a way to optimally exploit the dynamics that naturally occur in physical systems for specific computational tasks. We apply this paradigm to compute with photonic and memristive systems as a possible route towards robust nanoscale building blocks for general-purpose analogue computing. We also target mechanical systems by exploiting physical body dynamics for closed-loop gait and motor pattern generation in compliant robotics, and we use this approach as a stepping stone towards a better understanding of how biological brains learn to control biological bodies.
Finally, this research topic also builds bridges between machine learning, hardware optimisation and learning in the brain: we apply concepts of biological learning in software neural networks, we apply optimisation techniques from machine learning and biological learning to analogue hardware, and we use knowledge from machine learning to better understand learning in the brain.
Staff
Joni Dambre, Francis wyffels
Researchers
Jeroen Burms, Jonas Degrave, Matthias Freiberger, Gabriel Urbain, Alexander Vandesompele
Projects
- IAP project photonics@be - Towards Smart Photonics in 2020
- Neurorobotics subproject of the H2020 Flagship Human Brain Project
- H2020 project PHRESCO - PHotonic Reservoir Computing
- Personal PhD grants from FWO and IWT
Key publications
- “Trainable hardware for dynamical computing using error backpropagation through physical media”, M. Hermans et al., Nature Communications, 2015
- “Reward-modulated Hebbian plasticity as leverage for partially embodied control in compliant robotics”, J. Burms et al., Frontiers in Neurorobotics, 2015
- “Photonic delay systems as machine learning implementations”, M. Hermans et al., Journal of Machine Learning Research, 2015
- “Optoelectronic systems trained with backpropagation through time”, M. Hermans et al., IEEE Transactions on Neural Networks and Learning Systems, 2015
- “Memristor models for machine learning”, J. P. Carbajal et al., Neural Computation, 2015
- “Frequency modulation of large oscillatory neural networks”, F. wyffels et al., Biological Cybernetics, 2014
- “Automated design of complex dynamic systems”, M. Hermans et al., PLOS ONE, 2014
- “Experimental demonstration of reservoir computing on a silicon photonics chip”, K. Vandoorne et al., Nature Communications, 2014