Art and Science Interaction Lab

The Art and Science Interaction Lab is a highly flexible and modular research facility for creating, analysing and testing experiences and interactions in virtual contexts, as well as for conducting research on next-generation immersive technologies. It brings together the expertise and creativity of engineers, performers, designers and scientists to create solutions and experiences that shape people's lives. The lab is equipped with state-of-the-art visual, auditory and user-tracking equipment, fully synchronized and connected to a central backend. This synchronization enables highly accurate multi-sensor measurements and analysis.
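
As a concrete illustration of what such a shared clock makes possible, the sketch below aligns two hypothetical sensor streams (a 120 Hz motion-capture channel and a 512 Hz EEG channel) onto a common analysis time base. The stream contents, rates and NumPy-based resampling are illustrative assumptions, not the lab's actual backend code.

    # Illustrative sketch only: aligning two hypothetical sensor streams
    # (120 Hz motion capture, 512 Hz EEG) onto one shared analysis clock,
    # as a synchronized backend makes possible. Not the lab's actual code.
    import numpy as np

    def resample_to_common_clock(timestamps, values, common_clock):
        # Linearly interpolate a timestamped stream onto the shared time base.
        return np.interp(common_clock, timestamps, values)

    # Hypothetical 1-second recordings, both timestamped against the same
    # backend clock (seconds).
    mocap_t = np.arange(0.0, 1.0, 1 / 120)
    mocap_x = np.sin(2 * np.pi * mocap_t)        # marker x-position (arbitrary units)
    eeg_t = np.arange(0.0, 1.0, 1 / 512)
    eeg_v = np.random.randn(eeg_t.size)          # one EEG channel (arbitrary units)

    # Resample both onto a common 1 kHz analysis clock for joint analysis.
    clock = np.arange(0.0, 1.0, 1 / 1000)
    mocap_on_clock = resample_to_common_clock(mocap_t, mocap_x, clock)
    eeg_on_clock = resample_to_common_clock(eeg_t, eeg_v, clock)
    print(mocap_on_clock.shape, eeg_on_clock.shape)   # (1000,) (1000,)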

The lab itself is a 10m x 10m room spanning the height of two floors (~6m). It is equipped with 82 individual speakers connected to a fully IP-based audio distribution system. The system delivers highly realistic spatial audio projection by means of the connected audio wavefield processor (Barco IOSONO), and also allows for accurate recreation and simulation of room acoustics. In terms of visual modalities, the lab is equipped with 2 fully untethered HTC Vive Pro Eye headsets and 2 tethered HTC Vive Pro devices, allowing free roaming across the full 10m x 10m area and multi-person VR. A 7m x 4m acoustically transparent projection screen combined with a 12,000-lumen 4K projector delivers compelling, high-end immersive visualization. Both the audio and visual systems are connected to a powerful state-of-the-art backend.
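
To give a feel for how an object-based spatial-audio renderer of this kind is typically driven, the sketch below streams a moving virtual source position over OSC. The IP address, port and OSC address pattern are purely hypothetical and do not describe the IOSONO processor's actual control interface.

    # Illustrative sketch only: steering one virtual sound source in a circle
    # around the listening area via OSC. The renderer endpoint and OSC
    # address pattern are hypothetical, not the IOSONO's actual interface.
    import math
    import time
    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("192.168.0.50", 9000)    # hypothetical renderer endpoint

    RADIUS_M = 4.0      # circle radius in metres
    UPDATES = 240       # number of position updates (~8 s at 30 Hz)

    for step in range(UPDATES):
        angle = 2 * math.pi * step / UPDATES
        x, y = RADIUS_M * math.cos(angle), RADIUS_M * math.sin(angle)
        client.send_message("/source/1/xy", [x, y])   # hypothetical address pattern
        time.sleep(1 / 30)                            # ~30 updates per second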

In terms of sensors, the Art and Science Interaction Lab is equipped with a Qualisys motion capture system consisting of 18 infrared and 5 RGB cameras, delivering high-framerate (>= 120 fps) multi-person motion capture with sub-millimetre (< 1 mm) accuracy. The system can also track fine-grained movements (e.g. fingers while playing the piano, or facial expressions). Furthermore, two untethered clinical-grade EEG headsets allow dual 64-channel EEG measurements; their shielded caps with active electrodes let the EEG headsets be worn underneath a VR headset with minimal interference. Lastly, EMG, skin conductance and eye-tracking sensors complete the multi-sensor measurement system in the Art and Science Interaction Lab.
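
As a small, hypothetical example of working with such data, the sketch below estimates fingertip speed from a streamed sequence of 3-D marker positions at 120 fps. The array layout and the synthetic trajectory are assumptions for illustration and do not reflect the Qualisys streaming SDK.

    # Illustrative sketch only: estimating fingertip speed from 120 fps
    # 3-D marker positions (in mm). The data layout is a simplifying
    # assumption and does not reflect the Qualisys streaming SDK.
    import numpy as np

    FRAME_RATE = 120.0  # frames per second delivered by the mocap system

    def marker_speed(positions_mm, frame_rate=FRAME_RATE):
        # Speed in mm/s between consecutive frames of an (N, 3) position array.
        deltas = np.diff(positions_mm, axis=0)
        return np.linalg.norm(deltas, axis=1) * frame_rate

    # Hypothetical 1-second recording of a single fingertip marker.
    t = np.arange(0.0, 1.0, 1 / FRAME_RATE)
    fingertip = np.stack([
        50 * np.sin(2 * np.pi * 2 * t),   # x (mm): side-to-side motion
        np.zeros_like(t),                 # y (mm): static
        10 * t,                           # z (mm): slow lift
    ], axis=1)
    print(round(marker_speed(fingertip).max(), 1), "mm/s peak")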

The Art and Science Interaction Lab team supports innovation in several key domains. In general, the team focuses on interaction research in virtualized environments, unravelling complex user interactions and experiences in order to design and create novel applications and interfaces. The application domains range from smart home appliances, health, safety and smart public places to more artistic and creative applications. The infrastructure is also used for more fundamental research on virtual reality technologies (e.g. auralisation, virtual acoustics, six-degrees-of-freedom VR, multi-person VR…).

The team is an interdisciplinary consortium joining the expertise of three Ghent University research groups (IDLab, the Institute for Psychoacoustics and Electronic Music and the research group for Media, Innovation and Communication Technologies) and has been funded under the medium-scale research infrastructure program of the Research Foundation Flanders (FWO). As such, the Art and Science Interaction Lab is a one-of-a-kind research facility targeting both industry and academia, delivering an interdisciplinary approach to measuring, analyzing and creating the next generation of appliances, interfaces and experiences.

People

  • Peter Lambert, Glenn Van Wallendael, Niels Van Kets.
