We are happy to announce that we will attend this year’s QoMEX 2012 in Yarra Valley, Melbourne (5-7 July).
At QoMEX we will present the following two papers:
Title: Investigating the Impact of Sensory Effects on the Quality of Experience and Emotional Response in Web Videos
Authors: B. Rainer, M. Waltl, E. Cheng, M. Shujau, C. Timmerer, S. Davis, I. Burnett, C. Ritz, H. Hellwagner
Multimedia is ubiquitously available online, with large amounts of video increasingly consumed through Web sites such as YouTube or Google Video. However, online multimedia typically limits users to visual/auditory stimuli, with on-screen visual media accompanied by audio. The recently introduced MPEG-V standard proposes multi-sensory user experiences in multimedia environments, such as enriching video content with so-called sensory effects like wind, vibration, light, etc. In MPEG-V, these sensory effects are represented as Sensory Effect Metadata (SEM), which is associated with the multimedia content. This paper presents three user studies that utilize the sensory effects framework of MPEG-V, investigating the emotional response of users and the enhancement of Quality of Experience (QoE) for Web video sequences from a range of genres, with and without sensory effects. In particular, the user studies were conducted in Austria and Australia to investigate whether geographic and cultural differences affect users’ elicited emotional responses and QoE.
Title: Sensory Effect Dataset and Test Setups
Authors: M. Waltl, B. Rainer, C. Timmerer, H. Hellwagner
Additional constituents for the representation of multimedia content have gained more and more attention. For example, the number of cinemas equipped with additional devices (e.g., ambient lights, vibrating seats, wind generators, water sprayers, heaters/coolers) that stimulate senses beyond audition and vision is increasing. On the content side, the MPEG-V standard specifies, among others, Sensory Effect Metadata (SEM), which provides the means to describe sensory effects such as wind, vibration, light, etc. to be attached to audio-visual content and thus offers an enhanced and immersive experience for the user. However, there is a lack of a common set of test content allowing for various subjective user studies and verification across different test sites. In this paper we provide our dataset comprising a number of videos from different genres, enriched with MPEG-V-compliant Sensory Effect Metadata descriptions. Furthermore, we describe possible test setups using off-the-shelf hardware for conducting subjective quality assessments.
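To give an impression of what such a description looks like, here is a loose, illustrative sketch of a SEM document attaching a wind effect and an ambient light effect to a video. The namespace URIs, element names, and attributes only approximate the SEDL/SEV schema defined in MPEG-V (ISO/IEC 23005), so they should be checked against the standard rather than taken verbatim:

```xml
<!-- Illustrative only: element and attribute names approximate the
     MPEG-V (ISO/IEC 23005) SEDL/SEV schema and may deviate from it. -->
<sedl:SEM xmlns:sedl="urn:mpeg:mpeg-v:2010:01-SEDL-NS"
          xmlns:sev="urn:mpeg:mpeg-v:2010:01-SEV-NS"
          xmlns:si="urn:mpeg:mpeg-21:2003:01-DIA-XSI-NS"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <!-- Wind effect starting at media time 0, at half intensity -->
  <sedl:Effect xsi:type="sev:WindType" activate="true"
               si:pts="0" intensity-value="0.5"/>
  <!-- Red ambient light effect triggered at a later media timestamp -->
  <sedl:Effect xsi:type="sev:LightType" activate="true"
               si:pts="90000" intensity-value="0.8" color="#FF0000"/>
</sedl:SEM>
```

A player that understands MPEG-V would parse such a description alongside the audio-visual stream and drive the corresponding rendering devices (fans, lamps, vibration seats) at the indicated media timestamps.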
We are looking forward to seeing you in Melbourne.