Radar-based hand-gesture recognition

UX
HCI
AI
VR
Radar sensor 
User interface
Machine learning
Gesture recognition

In this project, imec explores mid-air hand gestures, captured via radar signals, as a way to control a user interface. To this end, a 140 GHz radar sensor was developed.

At imec-mict-UGent, we are building an experimental setup and feedback system for standardized collection of hand-gesture data in the form of range-Doppler maps. The collected radar data will be used to train a gesture classifier. To ensure an intuitive user experience and clear visual feedback, we use a custom VR experience in which participants turn a virtual knob.
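
As a rough illustration of the classification step, the sketch below shows a small convolutional network that maps single-channel range-Doppler maps to gesture classes. The input resolution (64x64), the number of gesture classes (5), and the architecture are illustrative assumptions, not specifications from the project.

    # Minimal sketch, assuming 64x64 range-Doppler maps and 5 gesture classes;
    # this is not the project's actual model.
    import torch
    import torch.nn as nn

    class RangeDopplerGestureNet(nn.Module):
        def __init__(self, num_classes: int = 5):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1),  # one range-Doppler map per sample
                nn.ReLU(),
                nn.MaxPool2d(2),                             # 64x64 -> 32x32
                nn.Conv2d(16, 32, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.MaxPool2d(2),                             # 32x32 -> 16x16
            )
            self.classifier = nn.Sequential(
                nn.Flatten(),
                nn.Linear(32 * 16 * 16, 64),
                nn.ReLU(),
                nn.Linear(64, num_classes),                  # logits over gesture classes
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.classifier(self.features(x))

    # Example forward pass on a dummy batch of 8 range-Doppler maps.
    model = RangeDopplerGestureNet()
    dummy = torch.randn(8, 1, 64, 64)
    logits = model(dummy)                  # shape: (8, 5)
    predictions = logits.argmax(dim=1)     # predicted gesture index per sample

In practice, the classifier could also take a sequence of range-Doppler maps over time as input; the single-frame version above is kept only to show the basic data flow from radar map to gesture label.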

Partners: imec, IDLab-UA

Duration: 6 months

Contact: Klaas Bombeke