
Project

Modelling the Human Perception of Immersive Multimedia by means of physiological data

Multimodal, immersive systems aim to emulate the senses by means of omnidirectional visuals, 360° sound, motion tracking and touch simulation to create a feeling of presence in the virtual environment. They have the potential to substitute physical interactions in application domains such as training (Industry 4.0) or e-health (telesurgery). However, the current COVID-19 pandemic has shown that they are not yet ready, as they do not provide the required level of immersiveness. Understanding the effects that immersive applications have on human perception is not straightforward, as it involves the interplay of many factors. Physiological data, such as electroencephalograms (EEGs) or skin temperature (ST), have shown potential for assessing certain aspects of user perception (e.g., cybersickness), but their application to other scenarios and to overall perception has yet to be explored. This project proposes a user-centric approach to the real-time assessment and modelling of immersiveness by means of physiological data.
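To give a concrete idea of what real-time modelling of immersiveness from physiological data could look like, the following is a minimal sketch in Python, not the project's actual method: it assumes windowed single-channel EEG and skin-temperature samples and combines hypothetical features (EEG band power, temperature drift) into a toy score. All names and weightings here (band_power, immersiveness_score, the alpha/theta ratio) are illustrative assumptions.

```python
# Illustrative sketch only: hypothetical features from windowed EEG and
# skin-temperature samples, mapped to a toy "immersiveness" score.
import numpy as np


def band_power(eeg_window: np.ndarray, fs: float, low: float, high: float) -> float:
    """Mean spectral power of a single-channel EEG window in the [low, high] Hz band."""
    spectrum = np.abs(np.fft.rfft(eeg_window)) ** 2
    freqs = np.fft.rfftfreq(eeg_window.size, d=1.0 / fs)
    mask = (freqs >= low) & (freqs <= high)
    return float(spectrum[mask].mean())


def immersiveness_score(eeg_window: np.ndarray, st_window: np.ndarray, fs: float = 256.0) -> float:
    """Toy score combining alpha-band EEG power and skin-temperature drift (assumed features)."""
    alpha = band_power(eeg_window, fs, 8.0, 12.0)   # relaxation-related band
    theta = band_power(eeg_window, fs, 4.0, 7.0)    # workload-related band
    st_drift = st_window[-1] - st_window[0]         # temperature change over the window
    # Arbitrary linear combination purely for illustration.
    return 0.5 * float(np.tanh(alpha / (theta + 1e-9))) + 0.5 * float(np.tanh(st_drift))


# Example: two seconds of synthetic data (256 Hz EEG, 4 Hz skin temperature).
rng = np.random.default_rng(0)
eeg = rng.standard_normal(512)
st = 33.0 + 0.01 * np.arange(8)
print(immersiveness_score(eeg, st))
```

In practice such a score would be learned from user studies rather than hand-crafted; the sketch only illustrates the kind of per-window pipeline that a real-time system would run.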
Date: 1 Oct 2022 → Today
Keywords: Human Perception, Extended Reality, Physiological data
Disciplines: Computer vision, Human-computer interaction, Virtual reality and related simulation