SDH in immersive environments: can subtitles be immersive at all?
Agulló, Belén (Universitat Autònoma de Barcelona. Transmedia Catalonia Research Group)

Date: 2018
Abstract: Immersive environments have been emerging for the past few years, and their potential for transforming how entertainment is consumed has attracted the interest of both the industry and audiences. Powerful technology companies such as Microsoft, Sony, Facebook and HTC are investing in devices such as Microsoft HoloLens, PlayStation VR, Oculus Rift and HTC Vive. In recent years we have also witnessed an explosion of 360º content (videos, movies, documentaries, news, etc.) that can be easily accessed through virtually any smartphone. However, immersive content, such as 360º videos or virtual reality video games, poses a challenge for Audiovisual Translation, and an even harder one for Media Accessibility. How can we make immersive content accessible to everyone and, specifically, to persons with hearing loss? Some filmmakers argue that audiences will need to learn a new visual grammar or language to understand how stories in immersive environments are built, just as happened when cuts were introduced in film editing. The same will happen with subtitling in immersive content: we will need to relearn how to read subtitles, since immersive media bring new dimensions, such as direction and three-dimensional space. These new dimensions and the freedom of movement inherent to virtual worlds present challenges for users who cannot make use of the audio. If users lack the audio cue, how will they know that someone is speaking behind them, or that a sound used as an action trigger is playing in a different location, so that they know to turn their head? Can subtitles be used to draw attention to the focus of action? In this presentation, we will review the nature of immersive content and explain the challenges of implementing subtitles for the deaf and hard-of-hearing (SDH) in 360º videos. Basic subtitling topics such as the placement of subtitles, as well as new features such as speaker identification systems, are tricky questions.
Some researchers have already raised this issue (Agulló 2018; Brown et al. 2018; Fraile et al. 2018; Montagud et al. 2018; Rothe et al. 2018). Regarding placement, different solutions have been suggested, but they can mainly be summed up in two key concepts: dynamic subtitles and static subtitles (or "evenly spaced" and "following head immediately" subtitles, depending on the terminology used). Basically, dynamic subtitles are those burnt into the video in one or several specific positions. Static subtitles, by contrast, are those linked to the viewer's field of view; they move with the viewers, "following their heads immediately" (Brown et al. 2018) wherever they look. The latter implementation is closer to the integration of traditional subtitles in 2D environments, while the former is closer to innovative subtitling practices such as creative subtitles (Foerster 2010; McClarty 2012, 2014) or integrated subtitles (Fox 2016a, 2016b). As for speaker identification systems, these are a completely new feature for SDH in immersive media. A mechanism needs to be designed to indicate to the audience where speakers are located in the 360º environment, so that they do not miss the action. Very few researchers have tackled this topic (Rothe et al. 2017), and the discussion about which methods work best is still open; it will be presented during the session. To illustrate the discussion, we will analyse examples obtained from a corpus of selected 360º videos. Finally, we will propose a set of features worth researching to achieve immersive and integrated subtitles in this new medium, based on criteria of accessibility, immersion and usability. This presentation is related to the research carried out in the European-funded project ImAC (GA: 761974). The author is also a member of TransMedia Catalonia, an SGR research group funded by the 'Secretaria d'Universitats i Recerca del Departament d'Empresa i Coneixement de la Generalitat de Catalunya' (2017SGR113).
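To make the speaker-identification problem concrete, a minimal sketch of what such a mechanism might compute is given below. This is purely illustrative and not taken from the presentation or any of the cited systems: it assumes a hypothetical renderer that knows the viewer's head yaw and each speaker's angular position on the 360º horizontal plane, and attaches a direction hint to the subtitle when the speaker falls outside the current field of view.

```python
def speaker_indicator(viewer_yaw_deg: float, speaker_yaw_deg: float,
                      fov_deg: float = 90.0) -> str:
    """Return a direction hint for a subtitle cue.

    '' means the speaker is inside the viewer's field of view;
    '>' means the viewer should turn right, '<' left.
    Yaw angles are in degrees on the 360-degree horizontal plane.
    """
    # Signed shortest angular difference, normalised to (-180, 180]
    diff = (speaker_yaw_deg - viewer_yaw_deg + 180.0) % 360.0 - 180.0
    if abs(diff) <= fov_deg / 2.0:
        return ""  # speaker is on screen: no indicator needed
    return ">" if diff > 0 else "<"


# A speaker at 170 degrees while the viewer faces 0 degrees is off
# screen to the right, so the subtitle would carry a '>' indicator.
print(speaker_indicator(0.0, 170.0))
```

A real system would of course also have to handle elevation, distance and multiple simultaneous speakers, and the choice between arrows, radar-style overlays or repositioned subtitles is exactly the open question discussed above.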
Note: Grant agreement number EC/H2020/761974
Note: Grant agreement number AGAUR/2017/SGR113
Rights: This document is subject to a Creative Commons licence. Total or partial reproduction and public communication of the work are permitted, provided it is not for commercial purposes and the authorship of the original work is acknowledged. The creation of derivative works is not permitted.
Language: English.
Document: conferenceObject
Subject: Subtitles ; Subtitling ; Subtitles for the deaf and hard-of-hearing ; SDH ; Captions ; Immersion ; Virtual reality ; 360º videos ; Cinematic virtual reality ; Accessibility ; Access services
Published in: Languages and the Media. International Conference on Language Transfer in Audiovisual Media. Berlin, 12 : 2018

35 p, 2.1 MB


Record created 2018-10-09, last modified 2018-10-20
