DOI: 10.1145/2370216.2370265

Identifying emotions expressed by mobile users through 2D surface and 3D motion gestures

Published: 05 September 2012
Abstract

    So far, only intrusive and expensive ways of precisely expressing emotions have been proposed, and these are unlikely to appear soon in everyday Ubicomp environments. In this paper, we study to what extent we can identify the emotion a user explicitly expresses through 2D and 3D gestures. Indeed, users already routinely manipulate mobile devices equipped with touch screens and accelerometers. We conducted a field study in which we asked participants to explicitly express their emotions through gestures and to report their affective states. We contribute by (1) showing a high number of significant correlations between 3D motion descriptors of gestures and the arousal dimension, and (2) defining a space of affective gestures. We identify (3) groups of descriptors that structure the space and are related to arousal. Finally, we provide (4) a preliminary model of arousal and identify (5) interesting patterns in particular classes of gestures. These results help Ubicomp application designers envision gestures as a cheap and non-intrusive affective modality.
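
    The core analysis pairs per-gesture motion descriptors, computed from accelerometer samples, with participants' self-reported arousal. As a minimal illustrative sketch of that kind of pipeline (not the authors' code: the descriptor set, sampling rate, and synthetic data below are all assumptions), the following Python computes a few simple 3D motion descriptors and correlates one of them with arousal ratings:

```python
# Hypothetical sketch, not the paper's implementation: descriptor names,
# sampling rate, and the synthetic data are assumptions for illustration.
import numpy as np
from scipy.stats import pearsonr

def motion_descriptors(acc, dt):
    """acc: (n, 3) accelerometer samples; dt: sampling period in seconds."""
    mag = np.linalg.norm(acc, axis=1)            # per-sample acceleration magnitude
    jerk = np.diff(mag) / dt                     # rate of change of that magnitude
    return {
        "mean_magnitude": mag.mean(),            # overall movement intensity
        "energy": (mag ** 2).sum() * dt,         # accumulated intensity over the gesture
        "mean_abs_jerk": np.abs(jerk).mean(),    # jerkiness of the gesture
        "duration": len(acc) * dt,               # gesture length in seconds
    }

# Synthetic example: gestures whose movement intensity grows with arousal.
rng = np.random.default_rng(0)
arousal = np.linspace(0.0, 1.0, 10)              # hypothetical self-reports in [0, 1]
gestures = [rng.normal(0.0, 1.0 + 2.0 * a, size=(200, 3)) for a in arousal]
energy = [motion_descriptors(g, dt=0.01)["energy"] for g in gestures]

r, p = pearsonr(energy, arousal)                 # one descriptor vs. arousal
print(f"Pearson r = {r:.2f}, p = {p:.3g}")
```

    In the study the correlations come from gestures collected in the field; the data here is synthetic only so the sketch runs end to end.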




      Published In

      UbiComp '12: Proceedings of the 2012 ACM Conference on Ubiquitous Computing
      September 2012
      1268 pages
      ISBN:9781450312240
      DOI:10.1145/2370216


      Publisher

      Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. affective computing
      2. mobile user interfaces

      Qualifiers

      • Research-article

      Conference

      UbiComp '12: The 2012 ACM Conference on Ubiquitous Computing
      September 5-8, 2012
      Pittsburgh, Pennsylvania, United States

      Acceptance Rates

      UbiComp '12 paper acceptance rate: 58 of 301 submissions, 19%
      Overall acceptance rate: 764 of 2,912 submissions, 26%



      Cited By

      • (2023) Survey on Emotion Sensing Using Mobile Devices. IEEE Transactions on Affective Computing 14(4):2678-2696, Oct 2023. DOI: 10.1109/TAFFC.2022.3220484
      • (2020) Emotion Detection Based on Smartphone Using User Memory Tasks and Videos. Human Interaction, Emerging Technologies and Future Applications III, pages 244-249, Aug 2020. DOI: 10.1007/978-3-030-55307-4_37
      • (2019) Exploiting sensing devices availability in AR/VR deployments to foster engagement. Virtual Reality 23(4):399-410, Dec 2019. DOI: 10.1007/s10055-018-0357-0
      • (2018) Automation of feature engineering for IoT analytics. ACM SIGBED Review 15(2):24-30, Jun 2018. DOI: 10.1145/3231535.3231538
      • (2018) Exploiting IoT Technologies for Personalized Learning. 2018 IEEE Conference on Computational Intelligence and Games (CIG), pages 1-8, Aug 2018. DOI: 10.1109/CIG.2018.8490454
      • (2018) Bibliography. Traceable Human Experiment Design Research, pages 235-246, Feb 2018. DOI: 10.1002/9781119453635.biblio
      • (2017) A Predictive Linear Regression Model for Affective State Detection of Mobile Touch Screen Users. International Journal of Mobile Human Computer Interaction 9(1):30-44, Jan 2017. DOI: 10.4018/IJMHCI.2017010103
      • (2017) How Busy Are You? Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pages 5346-5360, May 2017. DOI: 10.1145/3025453.3025946
      • (2017) Once More, With Feeling. Proceedings of the 22nd International Conference on Intelligent User Interfaces, pages 427-437, Mar 2017. DOI: 10.1145/3025171.3025182
      • (2015) Towards affective touch interaction: predicting mobile user emotion from finger strokes. Journal of Interaction Science 3(1), Nov 2015. DOI: 10.1186/s40166-015-0013-z
