Keynote 1
Title: Towards Volumetric Video Realism in Extended Reality: Challenges and Opportunities
Abstract: Advances in volumetric capture, compression, and rendering techniques have enabled the possibility of telepresence in an extended reality (XR) environment. Live or pre-recorded volumetric video of an avatar can be streamed over the network and rendered in a client's XR environment, creating an illusion of spatial co-presence. In this keynote talk, I will first make a case for the importance of visual realism of volumetric video in such scenarios. I will then present existing approaches toward higher visual realism in volumetric video, dividing them into two categories: (i) approaches to achieve smoother motion through temporal up-sampling and (ii) approaches to obtain better details through spatial up-sampling. The former aims to achieve a rendering frame rate close to what the human brain can perceive, while the latter allows users to move closer to an avatar without losing its realism. The talk will also outline the trade-offs and limitations of the current up-sampling approaches. I will conclude the talk with my personal view on the research challenges and opportunities that the research community should confront to achieve a true-to-life XR experience.
Keynote 2
Title: Multisensory Immersive Experiences: From Monitoring of Human Influential Factors to New Applications in Healthcare
Abstract: While virtual and extended reality applications are on the rise, existing experiences are not fully immersive, as only two senses (audio-visual) are typically stimulated. In this keynote talk, I will describe our ongoing work on developing multisensory immersive experiences, which combine auditory, visual, olfactory, and haptic/somatosensory stimuli. I will show the impact that stimulating more senses can have on user quality of experience, sense of presence and immersion, and engagement levels. Moreover, with multisensory experiences, monitoring human influential factors is crucial, as the perception of sensory stimuli can be very subjective (e.g., while a smell can be pleasant for some, it can be unpleasant for others). To this end, I will also describe our work on instrumenting virtual reality headsets with biosensors to allow not only for automated (remote) monitoring of human behaviour and tracking of human influential factors, but also to develop new markers of user experience, such as a multimodal time perception metric or a cybersickness metric. Lastly, I will describe some new applications of multisensory experiences that we are developing for healthcare and well-being. I will start with the use of immersive multisensory nature walks for mental health and describe two ongoing projects, one with patients with post-traumatic stress disorder and another with nurses suffering from burnout. I will conclude with a description of the use of multisensory priming for motor-imagery-based neurorehabilitation for stroke survivors.