DOI: 10.1145/3657242.3658602
Interacción Conference Proceedings · Short Paper · Open Access

Exploring the Interplay between Facial Expression Recognition and Physical States

Published: 19 June 2024

Abstract

    This paper proposes a new viewpoint on Facial Expression Recognition (FER), moving beyond conventional approaches focused on human emotions to also include expressions of physical states such as pain and effort. These expressions involve facial muscle activity that deviates from straightforward emotional expressions and is often overlooked by existing datasets and classifiers, which predominantly target emotional states. The study addresses inaccuracies in facial expression reporting when the input image corresponds to a physical state. By applying a pre-trained FER classifier to a specialized dataset, the research analyses the implications of lacking classifiers tailored to physical states, highlighting critical issues in FER tasks and revealing how datasets without physical-state labels introduce bias and degrade accuracy. We consider the UIBVFED Physical States dataset, which features facial expressions of physical states, a significant contribution: it mitigates biased estimations in FER tasks and enhances the training of recognition systems, improving their suitability across diverse scenarios.
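    The bias the abstract describes can be illustrated with a minimal sketch, not taken from the paper: an emotion-only FER classifier evaluated on images annotated with physical states. The classifier stub, image names, and labels below are hypothetical stand-ins; the point is that a label space without physical states can never produce a correct prediction for them.

    ```python
    from collections import Counter

    # Label space of a typical emotion-only FER classifier
    # (the six universal emotions plus neutral).
    EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]

    def classify(image_id: str) -> str:
        """Stand-in for a pre-trained FER model: deterministically maps an
        image to one of the emotional categories, never a physical state."""
        return EMOTIONS[sum(map(ord, image_id)) % len(EMOTIONS)]

    # Hypothetical dataset of images annotated with physical-state labels.
    dataset = {f"img_{i:03d}": ("pain" if i % 2 else "effort") for i in range(10)}

    predictions = Counter(classify(name) for name in dataset)

    # Every physical-state image is forced onto an emotion label, so the
    # number of correctly recognised physical states is structurally zero.
    physical_hits = sum(n for label, n in predictions.items()
                        if label in {"pain", "effort"})
    print("physical states recognised:", physical_hits)  # always 0
    ```

    Whatever the classifier's accuracy on emotions, its physical-state recall here is zero by construction, which is the dataset-induced bias the UIBVFED Physical States dataset is meant to address.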



    Published In

    Interacción '24: Proceedings of the XXIV International Conference on Human Computer Interaction
    June 2024
    155 pages

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. HCI
    2. convolutional neural network
    3. facial expression datasets
    4. facial expression recognition
    5. machine learning
    6. synthetic avatars

    Qualifiers

    • Short-paper
    • Research
    • Refereed limited

    Conference

    INTERACCION 2024

    Acceptance Rates

    Overall Acceptance Rate 109 of 163 submissions, 67%

Article Metrics

    • 0 total citations
    • 17 total downloads (last 12 months: 17; last 6 weeks: 17)
    Reflects downloads up to 26 Jul 2024