CONBOTS: CONnected through roBOTS

Physically coupling humans to boost handwriting and music learning (2019-2023)


UGent project link: CONBOTS

This project aims to design a new class of robots, the CONBOTS, which physically couple people to facilitate learning and improve efficiency during the training of new motor skills, and to test them in education scenarios. The ultimate goal of CONBOTS is to spread the benefits of connected robotics to education by promoting an innovative approach to training children, in particular in handwriting and music learning.

To do this, CONBOTS will develop an open, multimodal and modular platform to boost learning in different scenarios by empowering human interaction. The newly designed platform will allow physical forces to be exchanged between different users, combining four enabling technologies: i) compact dedicated robotic haptic devices (an end-effector robot and exoskeletons) to exchange forces with the upper limbs; ii) an interactive controller based on physical communication, integrating differential Game Theory (GT) and an algorithm to identify a partner's goals; iii) a bi-directional user interface capable of conveying information to the user (an Augmented Reality interactive environment and application-driven serious games) and capturing information from the user (wearable sensors and instrumented objects); iv) machine learning and data science techniques capable of identifying the users' physical, mental and emotional status and adapting the platform's behaviour accordingly, while guaranteeing encryption and privacy of subjects' information.
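The project's actual game-theoretic controller and goal-identification algorithm are not detailed here, but the core idea of identifying a partner's goal from physical interaction can be illustrated with a minimal, hypothetical sketch. It assumes the human acts like a PD controller toward a hidden target position; the robot then recovers that target from the observed states and forces, and applies an assisting force toward its estimate. All names, gains, and the PD assumption are illustrative, not the project's actual implementation.

```python
import numpy as np

def estimate_goal(positions, velocities, forces, kp=8.0, kd=2.0):
    """Least-squares style estimate of the human's hidden target x*.

    Assumes the human force follows f = kp*(x* - x) - kd*v,
    so each sample gives x* = (f + kp*x + kd*v) / kp; we average.
    """
    x = np.asarray(positions)
    v = np.asarray(velocities)
    f = np.asarray(forces)
    return float(np.mean((f + kp * x + kd * v) / kp))

def simulate(target=1.0, steps=400, dt=0.01):
    """1-D point mass moved jointly by a 'human' PD controller and a
    robot partner that assists toward its running goal estimate."""
    m, kp, kd = 1.0, 8.0, 2.0
    x, v = 0.0, 0.0
    xs, vs, fs = [], [], []
    for _ in range(steps):
        f_human = kp * (target - x) - kd * v          # human effort (hidden goal)
        goal_hat = estimate_goal(xs, vs, fs) if xs else 0.0
        f_robot = 0.5 * kp * (goal_hat - x)           # assistance toward estimate
        xs.append(x); vs.append(v); fs.append(f_human)
        a = (f_human + f_robot) / m
        v += a * dt                                    # semi-implicit Euler step
        x += v * dt
    return x, estimate_goal(xs, vs, fs)
```

Because the simulated human exactly matches the assumed PD model, the goal estimate converges to the true target; with a real partner, the same scheme would only approximate the goal, and the GT formulation would additionally negotiate how much force each agent contributes.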

CONBOTS builds on recent neuroscientific findings showing the benefits of physical interaction when performing motor tasks together: the human central nervous system is able to infer a partner's motor control and use it to improve its own task performance and motor learning. This capability will be implemented on an innovative modular robotic platform, integrating wearable technology and machine learning techniques to give rise to novel human-human and human-robot interaction paradigms applied in two different learning contexts:

a) Children in their first two years of primary school who are learning to handwrite. In this scenario, we will integrate a robotic system that provides forces at the pen/pencil, a head-mounted display for AR games, and wearable sensors and instrumented objects for recording user performance. The CONBOTS platform will therefore offer an augmented learning experience that enriches learners' haptic, auditory, and visual feedback.

b) Children/beginners who are learning to play the violin or drums. This second scenario aims to exploit the potential of the CONBOTS platform to augment sensorimotor skills during exercises for beginner drum and violin players. The effective combination of a wearable exoskeleton, sensors, and an AR system for displaying serious games will enable musicians to exchange forces, thus facilitating the acquisition of the basic sensorimotor skills that ground musicians' technical ability.

In these scenarios, the CONBOTS will physically connect users with each other in different modalities of one-on-one interaction: i) Trainer-Learner, where a learner (child) is coupled with a trainer (teacher); ii) Learner-Learner, where two children interact together; iii) Robotic Trainer-Learner, where the learner plays alone but receives force feedback from a controller that emulates human behaviour.

Promotors: Marc Leman, Pieter-Jan Maes