Research article · Open access
DOI: 10.1145/3548814.3551466

Attentional synchrony in films: A window to visuospatial characterization of events

Published: 22 September 2022
Abstract

    Research on event perception emphasizes the importance of visuospatial attributes in everyday human activities and how they influence event segmentation, prediction, and retrieval. Attending to these visuospatial attributes is the first step toward event understanding; correlating attentional measures with such attributes therefore helps further our understanding of event comprehension. In this study, we focus on attentional synchrony, among other attentional measures, and analyze selected film scenes through the lens of a visuospatial event model. We present the first results of an in-depth multimodal (e.g., head turns, hand actions) visuospatial analysis of 10 movie scenes, correlated with visual attention (eye-tracking data from 32 participants per scene). Using these results, we tease apart event segments of high and low attentional synchrony and describe how attention is distributed in relation to the visuospatial features. The analysis yields an indirect measure of attentional saliency for a scene of a given visuospatial complexity, that is, of how such a scene directs observers' attentional selection in a given context.
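    Attentional synchrony is commonly operationalized as the spatial dispersion of gaze points across viewers at each moment of a film: the tighter the clustering, the more synchronized the audience. The sketch below is a minimal illustration of that idea, not the authors' actual pipeline; the (participant, frame, x, y) sample layout, the median-distance-to-centroid dispersion measure, and the median-split segmentation are all assumptions made for the example.

    import numpy as np

    def synchrony_per_frame(gaze, n_frames):
        """Score per-frame attentional synchrony as negated gaze dispersion.

        gaze: float array of shape (n_samples, 4) with columns
              (participant_id, frame_index, x, y), coordinates in pixels.
        Returns an array of length n_frames (higher = more synchronous);
        frames with fewer than two gaze samples are left as NaN.
        """
        scores = np.full(n_frames, np.nan)
        for f in range(n_frames):
            pts = gaze[gaze[:, 1] == f][:, 2:4]  # all viewers' gaze at frame f
            if len(pts) < 2:
                continue
            # Dispersion: median distance of gaze points to their centroid.
            dispersion = np.median(np.linalg.norm(pts - pts.mean(axis=0), axis=1))
            scores[f] = -dispersion  # low dispersion -> high synchrony
        return scores

    # Illustrative segmentation: label frames above the scene's median score
    # as high-synchrony, the rest as low-synchrony (the threshold is an
    # assumption; any cutoff could be substituted).
    # scores = synchrony_per_frame(gaze_samples, n_frames)
    # high_sync = scores > np.nanmedian(scores)

    A median-based dispersion measure is used here only because it is robust to a single stray fixation; a standard deviation or covariance-area measure over the gaze cloud would serve the same illustrative purpose.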


    Cited By

    • (2023) How do drivers mitigate the effects of naturalistic visual complexity? Cognitive Research: Principles and Implications 8(1). https://doi.org/10.1186/s41235-023-00501-1. Online publication date: 9 Aug 2023.



    Published In

    SAP '22: ACM Symposium on Applied Perception 2022
    September 2022
    86 pages
    ISBN: 9781450394550
    DOI: 10.1145/3548814
    This work is licensed under a Creative Commons Attribution 4.0 International License.


    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 22 September 2022


    Author Tags

    1. Attention
    2. Eye-tracking
    3. Human-interaction
    4. Visuoauditory cues

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Conference

    SAP '22

    Acceptance Rates

    Overall Acceptance Rate 43 of 94 submissions, 46%



    Article Metrics

    • Downloads (Last 12 months): 112
    • Downloads (Last 6 weeks): 26

    Reflects downloads up to 10 Aug 2024


