IPEM-ASIL: Vision

Augmented Humanities

The vision of IPEM is integrated into the larger framework of ‘Augmented Humanities’. The Augmented Humanities framework aims to generate new impulses within the humanities through the integration of technology in research and applications. By now everyone is familiar with the digital humanities, with their emphasis on databases, machine learning, and artificial intelligence. But there are also enormous possibilities in the fields of interaction, virtual and mixed reality, biofeedback, and hyperscanning. These technologies have a major impact on human meaning formation and on cultural and educational development. By integrating them into their methodologies, the humanities can "elevate" themselves and help drive these developments. Hence the name Augmented Humanities.

Rooted in the Augmented Humanities framework, IPEM’s research and development rests on four pillars of creative-artistic innovation.

A Vision: Four pillars of creative-artistic innovation

The cross-fertilization of SCIENCE, TECHNOLOGY, and ART PRACTICE lies at the heart of the creative-artistic innovations targeted in IPEM’s research program. This approach is firmly anchored in the DNA of IPEM, which in its early years (1963-1986) took a similar approach, establishing a pioneering role in electronic music production with composers such as Louis De Meester, Karel Goeyvaerts, and Lucien Goethals. New times, new tools, … same spirit …

NO PRODUCTION but ARTISTIC RESEARCH

Creative-artistic innovation at IPEM is focused on artistic research as a resource for developing new forms of artistic expression, including innovative artistic settings (such as performance halls). Artistic research at IPEM can culminate in artistic productions with one of our privileged production partners, such as Vooruit, Bijloke, Minard, and BOZAR. While IPEM is not a production center, small public demonstrators (with audiences of fewer than 25 people) are a key element of IPEM's valorization policy.

SCIENCE

In developing new artistic experiences, we focus on two aspects. The first pertains to the idea that musical expression, communication, and sense-making are active and dynamic processes, rooted in expressive gestures and bodily interactions. The second, related aspect pertains to the power of music to connect people and to create strong feelings of togetherness, shared understanding, and identity. These two aspects are at the core of IPEM’s pioneering research on embodied music cognition and interaction (Leman, 2007; Lesaffre, Maes, & Leman, 2017), which has led to theoretical and empirical outcomes. These outcomes provide a solid basis for reflecting on how new music technologies can extend the creative-expressive potential of bodily movement and enrich human interaction through music.

TECHNOLOGY

New technologies – from the physical to the digital – have always led to new forms of musical creation, experience, composition practices and aesthetics. In the ASIL, we focus on three realms of emerging technologies to drive creative-artistic innovation:

  1. Augmented reality: The ASIL provides a 62-speaker setup that makes it possible to create (moving) 3D sound sources (e.g., through wavefield synthesis and ambisonics) and to simulate room acoustics (auralisation). This enables a shift away from a focus on musical instruments and compositions, towards the creative-artistic use of musical environments with which people can interact expressively (see the encoding sketch after this list).
  2. Measurement of human movement and physiology: To allow humans to interact with and within immersive musical environments, the ASIL provides various technologies to accurately track movement (16 Qualisys OQUS 7+ cameras, force plates, eye tracking, custom-made sensors, etc.) and (neuro)physiological activity (ECG, EDA, pupillary response, EEG, and fNIRS).
  3. Artificial intelligence: An important aim is to explore how musical environments can be made “smart”, in the sense that they are responsive and adaptive to, and anticipative of, users’ (coordinative) behavior and mental states, or in the sense that they incorporate strategies to induce specific behaviors or mental states.
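
To make the first item concrete: before any speaker-specific rendering, a virtual source position is encoded into speaker-independent ambisonic signals. The sketch below shows first-order encoding of a mono signal (traditional B-format, FuMa weighting); the ASIL's actual pipeline works at higher orders with its own tooling, so the function name and conventions here are illustrative assumptions.

```python
import numpy as np

def encode_foa(signal: np.ndarray, azimuth: float, elevation: float) -> np.ndarray:
    """Encode a mono signal into first-order ambisonics (traditional
    B-format, FuMa weighting). Angles in radians: azimuth counter-
    clockwise from the front, elevation upward from the horizon."""
    w = signal / np.sqrt(2.0)                         # omnidirectional component
    x = signal * np.cos(azimuth) * np.cos(elevation)  # front-back
    y = signal * np.sin(azimuth) * np.cos(elevation)  # left-right
    z = signal * np.sin(elevation)                    # up-down
    return np.stack([w, x, y, z])                     # shape: (4, n_samples)

# Example: a 440 Hz tone placed 45 degrees to the left, slightly raised.
fs = 48000
t = np.arange(fs) / fs
bformat = encode_foa(np.sin(2 * np.pi * 440 * t),
                     azimuth=np.deg2rad(45), elevation=np.deg2rad(10))
```

A decoder then turns these channels into gains for a concrete loudspeaker layout, which is what makes the approach portable across playback setups.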

Apart from audio and music technologies, the ASIL provides technologies for virtual/augmented/mixed reality in the visual domain (4K video projection, HTC Vive Pro, etc.).

ART PRACTICE

The creative input from art practice is a keystone for the development of new musical practices, experiences, and scenarios, rooted in insights from science and the possibilities offered by emerging technologies. Art practice encompasses musicians, composers, sound artists, performance artists, multimedia artists, dancers, etc. The ASIL positions itself specifically as an art and science incubator in which new creative-artistic practices can be explored, developed, and tested (read: not a production studio). However, there is always a close and mutual link with cultural centers.

Topics and themes

EXPRESSIVE 3D SOUND OBJECT TRAJECTORIES:

In essence, the spatialisation of sound objects consists in defining spatial XYZ coordinates through time. Typically, this is done with a computer mouse and keyboard, making it a labor-intensive and tedious task, devoid of expression. We are currently exploring how expressive body movements can be used to design the spatiotemporal trajectories of sound sources moving in 3D space (3D sound-movement mapping strategies). This opens new avenues for creative collaborations between sound artists and, for instance, dancers.
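
A minimal sketch of one such mapping strategy, under simple assumptions: motion-capture marker positions arrive as a stream of XYZ samples, are scaled to room dimensions, and are low-pass smoothed to remove tracking jitter. The scaling and smoothing values are illustrative, not the lab's actual parameters.

```python
import numpy as np

def map_gesture_to_trajectory(markers: np.ndarray,
                              room_scale: float = 4.0,
                              smoothing: float = 0.9) -> np.ndarray:
    """Map motion-capture marker positions (n_frames, 3), in metres
    relative to the performer, to a smoothed 3D trajectory for a
    virtual sound source.

    One-pole low-pass smoothing removes tracking jitter while keeping
    the expressive contour of the gesture."""
    trajectory = np.empty_like(markers, dtype=float)
    state = markers[0] * room_scale
    for i, pos in enumerate(markers):
        target = pos * room_scale  # scale the gesture to room size
        state = smoothing * state + (1.0 - smoothing) * target
        trajectory[i] = state
    return trajectory  # XYZ per frame, ready for a spatial renderer
```

Each row of the result can then be streamed, frame by frame, to the spatialisation engine as the position of a sound source (see the OSC sketch in the next topic).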

AUGMENTED INTERACTION THROUGH 3D SOUND OBJECTS:

Moving 3D virtual sound objects provide new means to connect people and to drive expressive interactions between them. The realization of creative scenarios requires that people can manipulate the 3D virtual sound objects in real time, using motion capture technologies. In addition, we aim to explore how we can “project” the 3D sound objects onto physical objects or visual objects (potentially with tactile feedback) created through virtual/augmented reality techniques.
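
In practice, real-time manipulation usually means streaming tracked positions from the motion-capture system to the spatial renderer over a network protocol such as OSC. The sketch below uses the python-osc library; the port and the address pattern are illustrative assumptions, since every renderer defines its own scheme.

```python
from pythonosc.udp_client import SimpleUDPClient

# Hypothetical endpoint of the spatial rendering engine.
client = SimpleUDPClient("127.0.0.1", 9000)

def send_source_position(source_id: int, x: float, y: float, z: float) -> None:
    """Send one tracked position update to the renderer.

    The address pattern '/source/{id}/xyz' is an assumption; actual
    renderers each define their own addressing scheme."""
    client.send_message(f"/source/{source_id}/xyz", [x, y, z])

# Example: move source 1 along a trajectory computed from a gesture.
# for x, y, z in trajectory:
#     send_source_position(1, x, y, z)
```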

AURALISATION:

Auralisation is a technique for modeling and simulating room acoustics (sound propagation) and the subjective experience of their audible characteristics at any given listener position. For this, we use the EVERTims 3D auralisation software framework (http://evertims.github.io/). The current technological challenge is to extend the technique beyond binaural headphone reproduction to ambisonics playback on the ASIL’s 62-speaker system. This opens up a range of possibilities for immersive experiences and creative context creation.
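
At its core, auralisation amounts to convolving a dry source signal with a (simulated) room impulse response per output channel. The minimal offline sketch below shows only that underlying operation; frameworks such as EVERTims perform it in real time, updating the impulse responses as the source, listener, or room geometry changes.

```python
import numpy as np
from scipy.signal import fftconvolve

def auralise(dry: np.ndarray, rirs: np.ndarray) -> np.ndarray:
    """Convolve a dry mono signal with one simulated room impulse
    response per output channel (rirs shape: (n_channels, ir_length)),
    yielding the signal as it would sound in the modeled room."""
    return np.stack([fftconvolve(dry, rir) for rir in rirs])
```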

BINAURAL – AMBISONICS HYBRID PLAYBACK:

Higher-order ambisonics makes it possible to create virtual 3D sound objects in space. 3D sound objects located within the listening area (that is, inside the loudspeaker setup) are called ‘focused sources’. For various reasons, rendering these focused sources is problematic and often fails, which hampers the creation of realistic musical environments. We therefore aim to create a hybrid of binaural and ambisonics methods to solve this problem. As an outcome, we will be able to position sound sources anywhere from far away from the listener to inside the listener’s head. Such a system offers many new creative-artistic possibilities to sound artists.
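
One way to picture such a hybrid, purely as an assumption about how the two methods could be combined, is a distance-dependent crossfade: distant sources are rendered ambisonically over the speaker array, sources near or inside the head binaurally, with an equal-power blend in between. The distance thresholds below are illustrative, not the lab's actual design.

```python
import numpy as np

def hybrid_gains(distance: float,
                 inner: float = 0.3,
                 outer: float = 1.5) -> tuple[float, float]:
    """Return (binaural_gain, ambisonic_gain) for a source at the
    given distance (metres) from the listener's head.

    Inside `inner` the source is rendered fully binaurally; beyond
    `outer` it is fully ambisonic; in between, an equal-power
    crossfade keeps the perceived loudness constant."""
    blend = np.clip((distance - inner) / (outer - inner), 0.0, 1.0)
    return float(np.cos(blend * np.pi / 2)), float(np.sin(blend * np.pi / 2))
```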

SONIFICATION and MUSIC-BASED BIOFEEDBACK:

Sonification pertains to the translation of movement, or of other dynamic systems, into sound. Sonification can be used to modify ongoing music tracks in real-time biofeedback systems that influence human behavior. Examples are biofeedback systems that work with monitored human movement or with monitored human brain activity. Music-based biofeedback systems capture the quality of ongoing interactions among people and provide immediate real-time feedback through automated (3D) sound manipulations.
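
A minimal sketch of such a mapping, assuming a monitored heart rate as input and a filter cutoff on an ongoing music track as the sound parameter; both the parameter ranges and the choice of sound parameter are illustrative, since actual biofeedback mappings are designed per experiment.

```python
import numpy as np

def sonify_heart_rate(bpm: float,
                      bpm_range: tuple[float, float] = (50.0, 120.0),
                      cutoff_range: tuple[float, float] = (200.0, 4000.0)) -> float:
    """Map a monitored heart rate to a low-pass filter cutoff (Hz)
    that could modulate an ongoing music track: calmer listener,
    darker sound."""
    norm = np.clip((bpm - bpm_range[0]) / (bpm_range[1] - bpm_range[0]), 0.0, 1.0)
    # Exponential interpolation so equal heart-rate steps sound equally large.
    return float(cutoff_range[0] * (cutoff_range[1] / cutoff_range[0]) ** norm)
```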