
Facial expressions as game input with different emotional feedback conditions

Published: 03 December 2008

Abstract

We propose a game design approach that uses facial expressions as an input method under different emotional feedback configurations. A study conducted in a shopping centre assessed our game "EmoFlowers" with respect to user experience and user effectiveness. The study revealed that interacting with a game via facial expressions is perceived as natural, is easy to learn, and provides a positive user experience.
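The core interaction idea, facial expressions driving game state under emotional feedback, can be sketched as a small update loop. This is a hypothetical illustration, not the paper's actual implementation: the `FlowerState` type, the `update_flower` function, and the `smile_score` input (assumed to come from some external facial-expression classifier, scaled to [0, 1]) are all invented here for clarity; the thresholds are arbitrary.

```python
# Hypothetical sketch (not the paper's implementation): a smile-driven
# game-state update in the spirit of "EmoFlowers", where smiling makes
# an on-screen flower bloom and a neutral face lets it slowly wilt.

from dataclasses import dataclass


@dataclass
class FlowerState:
    growth: float = 0.0  # 0.0 = closed bud, 1.0 = full bloom


def update_flower(state: FlowerState, smile_score: float,
                  dt: float = 0.1) -> FlowerState:
    """Advance the flower by one tick.

    `smile_score` is assumed to come from some facial-expression
    classifier and to lie in [0, 1]; the 0.5 threshold and the
    wilt rate are illustrative choices, not values from the paper.
    """
    if smile_score > 0.5:
        # Smiling: grow proportionally to smile intensity.
        state.growth = min(1.0, state.growth + dt * smile_score)
    else:
        # Not smiling: wilt slowly back toward the bud.
        state.growth = max(0.0, state.growth - dt * 0.2)
    return state


state = FlowerState()
for _ in range(20):  # twenty ticks of sustained strong smiling
    state = update_flower(state, smile_score=0.9)
print(round(state.growth, 2))  # growth saturates at full bloom: 1.0
```

Decoupling the expression classifier from the game-state update like this also makes the feedback condition easy to vary, which is the kind of manipulation the study's "different emotional feedback configurations" suggest.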




    Published In

    ACE '08: Proceedings of the 2008 International Conference on Advances in Computer Entertainment Technology
    December 2008, 427 pages
    ISBN: 9781605583938
    DOI: 10.1145/1501750
    General Chairs: Masa Inakage, Adrian David Cheok

    Publisher

    Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. affective interfaces
    2. emotion
    3. facial expression
    4. feedback
    5. games
    6. user effectiveness
    7. user experience

    Qualifiers

    • Research-article

    Conference

    ACE2008

    Acceptance Rates

    Overall Acceptance Rate 36 of 90 submissions, 40%

    Cited By

    • (2024) EyeEcho: Continuous and Low-power Facial Expression Tracking on Glasses. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, pp. 1-24. DOI: 10.1145/3613904.3642613. Online publication date: 11-May-2024.
    • (2024) An Affect-Aware Game Adapting to Human Emotion. HCI in Games, pp. 307-322. DOI: 10.1007/978-3-031-60692-2_21. Online publication date: 29-Jun-2024.
    • (2023) Emotion Arousal Assessment Based on Multimodal Physiological Signals for Game Users. IEEE Transactions on Affective Computing, 14(4), pp. 2582-2594. DOI: 10.1109/TAFFC.2023.3265008. Online publication date: 1-Oct-2023.
    • (2023) THIN: THrowable Information Networks and Application for Facial Expression Recognition in the Wild. IEEE Transactions on Affective Computing, 14(3), pp. 2336-2348. DOI: 10.1109/TAFFC.2022.3144439. Online publication date: 1-Jul-2023.
    • (2023) FERMOUTH: Facial Emotion Recognition from the MOUTH Region. Image Analysis and Processing – ICIAP 2023, pp. 147-158. DOI: 10.1007/978-3-031-43148-7_13. Online publication date: 5-Sep-2023.
    • (2022) Evoker: Narrative-based Facial Expression Game for Emotional Development of Adolescents. Extended Abstracts of the 2022 CHI Conference on Human Factors in Computing Systems, pp. 1-8. DOI: 10.1145/3491101.3516486. Online publication date: 27-Apr-2022.
    • (2022) Play With Your Emotions: Exploring Possibilities of Emotions as Game Input in NERO. Extended Abstracts of the 2022 CHI Conference on Human Factors in Computing Systems, pp. 1-7. DOI: 10.1145/3491101.3516485. Online publication date: 27-Apr-2022.
    • (2022) Design and evaluation of a social and embodied multiplayer reading game to engage primary school learners in Namibia. British Journal of Educational Technology, 53(6), pp. 1571-1590. DOI: 10.1111/bjet.13271. Online publication date: 5-Sep-2022.
    • (2022) Comparative Analysis of Techniques for Recognising Facial Expressions. 2022 8th International Conference on Advanced Computing and Communication Systems (ICACCS), pp. 233-238. DOI: 10.1109/ICACCS54159.2022.9784977. Online publication date: 25-Mar-2022.
    • (2022) Automatic facial emotion recognition at the COVID-19 pandemic time. Multimedia Tools and Applications, 82(9), pp. 12751-12769. DOI: 10.1007/s11042-022-14050-0. Online publication date: 22-Oct-2022.
