DOI: 10.1145/3084289.3089915
Short Paper

Subtitles in 360-degree Video

Published: 14 June 2017

Abstract

Currently, no agreed-upon user experience guidelines exist for subtitling (closed captions) in immersive 360-degree video experiences. It is not clear how subtitles can be displayed acceptably in this context, that is, in a way that balances comprehension, the freedom to look around the scene, and immersion. This work-in-progress describes four subtitle behaviours that we have designed and implemented in order to carry out user testing. We describe our rationale for each behaviour and discuss our initial hypotheses ahead of a full empirical investigation.
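The abstract does not name the four behaviours, but the design space it outlines (comprehension versus freedom to look around versus immersion) largely comes down to how a subtitle is anchored relative to the viewer's gaze. The sketch below is a minimal, hypothetical TypeScript illustration of two generic strategies, head-locked and world-anchored; it is not taken from the paper, and all names and values in it are illustrative assumptions.

```typescript
// Hypothetical sketch (not from the paper): two generic subtitle-placement
// behaviours for 360-degree video, reduced to yaw angles in degrees.

/** Wrap an angle in degrees into the range (-180, 180]. */
function wrapDegrees(angle: number): number {
  const a = (((angle + 180) % 360) + 360) % 360 - 180;
  return a === -180 ? 180 : a;
}

/** Head-locked: the subtitle keeps a fixed offset from wherever the viewer is looking. */
function headLockedYaw(viewerYaw: number, offsetDeg = 0): number {
  return wrapDegrees(viewerYaw + offsetDeg);
}

/** World-anchored: the subtitle stays at a fixed yaw in the scene, e.g. near the speaker. */
function worldAnchoredYaw(anchorYaw: number): number {
  return wrapDegrees(anchorYaw);
}

/** Degrees the viewer must turn to bring the subtitle to the centre of the view. */
function turnToSubtitle(viewerYaw: number, subtitleYaw: number): number {
  return Math.abs(wrapDegrees(subtitleYaw - viewerYaw));
}

// Example: the viewer has turned 90 degrees away from a speaker anchored at yaw 0.
const viewerYaw = 90;
console.log(turnToSubtitle(viewerYaw, headLockedYaw(viewerYaw)));  // 0  (always in view)
console.log(turnToSubtitle(viewerYaw, worldAnchoredYaw(0)));       // 90 (may be off-screen)
```

With a head-locked placement the subtitle is always readable but can occlude content and reduce immersion; with a world-anchored placement it can sit near the speaker but may leave the viewport when the viewer turns away, which is one form of the tension described in the abstract.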

Supplementary Material

suppl.mov (tvxwp0101-file3.mov)
Supplemental video




Published In

TVX '17 Adjunct: Adjunct Publication of the 2017 ACM International Conference on Interactive Experiences for TV and Online Video
June 2017
144 pages
ISBN:9781450350235
DOI:10.1145/3084289
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 14 June 2017


Author Tags

  1. 360-degree video
  2. accessibility
  3. hci
  4. subtitles
  5. user experience
  6. vr

Qualifiers

  • Short-paper

Conference

TVX '17

Acceptance Rates

Overall Acceptance Rate 69 of 245 submissions, 28%

Article Metrics

  • Downloads (Last 12 months)64
  • Downloads (Last 6 weeks)7
Reflects downloads up to 12 Sep 2024

Cited By
  • (2023) Timeline Exploration in 360° Video. Proceedings of the 2023 ACM Symposium on Spatial User Interaction, 1-12. https://doi.org/10.1145/3607822.3614533. Online publication date: 13-Oct-2023.
  • (2023) Head-anchored text placements and cognitive load in information-rich virtual environments. Proceedings of Mensch und Computer 2023, 27-36. https://doi.org/10.1145/3603555.3603575. Online publication date: 3-Sep-2023.
  • (2023) Accessibility Research in Digital Audiovisual Media: What Has Been Achieved and What Should Be Done Next? Proceedings of the 2023 ACM International Conference on Interactive Media Experiences, 94-114. https://doi.org/10.1145/3573381.3596159. Online publication date: 12-Jun-2023.
  • (2023) Subtitles in VR 360° video. Results from an eye-tracking experiment. Perspectives, 1-23. https://doi.org/10.1080/0907676X.2023.2268122. Online publication date: 13-Nov-2023.
  • (2022) Exploring the Perception of Additional Information Content in 360° 3D VR Video for Teaching and Learning. Virtual Worlds 1(1), 1-17. https://doi.org/10.3390/virtualworlds1010001. Online publication date: 13-May-2022.
  • (2022) Investigating Sign Language Interpreter Rendering and Guiding Methods in Virtual Reality 360-Degree Content. Proceedings of the 24th International ACM SIGACCESS Conference on Computers and Accessibility, 1-6. https://doi.org/10.1145/3517428.3563373. Online publication date: 23-Oct-2022.
  • (2022) Beyond Subtitles: Captioning and Visualizing Non-speech Sounds to Improve Accessibility of User-Generated Videos. Proceedings of the 24th International ACM SIGACCESS Conference on Computers and Accessibility, 1-12. https://doi.org/10.1145/3517428.3544808. Online publication date: 23-Oct-2022.
  • (2022) Subtitle-based Viewport Prediction for 360-degree Virtual Tourism Video. 2022 13th International Conference on Information, Intelligence, Systems & Applications (IISA), 1-8. https://doi.org/10.1109/IISA56318.2022.9904420. Online publication date: 18-Jul-2022.
  • (2022) Universal access: user needs for immersive captioning. Universal Access in the Information Society 21(2), 393-403. https://doi.org/10.1007/s10209-021-00828-w. Online publication date: 1-Jun-2022.
  • (2021) Evaluating AI assisted subtitling. Proceedings of the 2021 ACM International Conference on Interactive Media Experiences, 96-107. https://doi.org/10.1145/3452918.3458792. Online publication date: 21-Jun-2021.
