DOI: 10.1145/3123024.3125608

Deep affect recognition from R-R intervals

Published: 11 September 2017
  • Abstract

    Affect recognition is an important task in ubiquitous computing, in particular in health and in human-computer interaction. In the former, it contributes to the timely detection and treatment of emotional and mental disorders; in the latter, it enables more natural interaction and an enhanced user experience. We present an inter-domain study of affect recognition on seven different datasets, recorded with six different sensors and three different sensor placements, covering 211 subjects and nearly 1000 hours of labelled data. The datasets are processed and translated into a common spectro-temporal space, which is then used to train a deep neural network (DNN) for arousal recognition that benefits from the large amount of data even when the data are heterogeneous (i.e., come from different sensors and different datasets). The DNN approach outperforms classical machine-learning approaches on six out of seven datasets.
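
    The abstract names a "common spectro-temporal space" without spelling out how it is computed. The sketch below is a minimal, hypothetical illustration of one way such a representation could be derived from R-R intervals: a sliding-window Lomb-Scargle periodogram (a standard spectral estimator for unevenly sampled beat-to-beat data) fed into a small fully connected network. The window length, frequency band, and network architecture are illustrative assumptions, not the paper's actual configuration.

```python
# Minimal sketch (not the authors' code): build a spectro-temporal
# representation from R-R intervals and train a small neural network on it.
import numpy as np
from scipy.signal import lombscargle
from sklearn.neural_network import MLPClassifier

def rr_to_spectrotemporal(rr_s, win_s=120.0, step_s=30.0,
                          freqs_hz=np.linspace(0.04, 0.4, 32)):
    """Slide a window over an R-R series (in seconds) and compute a
    Lomb-Scargle periodogram per window; the periodogram handles the
    uneven sampling of beat-to-beat data directly."""
    rr_s = np.asarray(rr_s, dtype=float)
    beat_times = np.cumsum(rr_s)            # time stamp of each beat
    omega = 2 * np.pi * freqs_hz            # angular frequencies for lombscargle
    rows, t0 = [], beat_times[0]
    while t0 + win_s <= beat_times[-1]:
        mask = (beat_times >= t0) & (beat_times < t0 + win_s)
        t, y = beat_times[mask], rr_s[mask]
        if t.size > 3:                       # need a few beats per window
            rows.append(lombscargle(t, y - y.mean(), omega, normalize=True))
        t0 += step_s
    return np.asarray(rows)                  # shape: (n_windows, n_frequencies)

# Hypothetical usage: rr_series is a list of per-recording R-R sequences
# (in seconds) and labels holds a low/high arousal label per window.
# X = np.vstack([rr_to_spectrotemporal(rr) for rr in rr_series])
# clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500).fit(X, labels)
```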




        Published In

        UbiComp '17: Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2017 ACM International Symposium on Wearable Computers
        September 2017
        1089 pages
        ISBN: 9781450351904
        DOI: 10.1145/3123024
        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

        Sponsors

        Publisher

        Association for Computing Machinery

        New York, NY, United States


        Author Tags

        1. affect
        2. arousal recognition
        3. deep neural networks
        4. emotions
        5. machine learning
        6. stress
        7. transfer learning

        Qualifiers

        • Research-article

        Conference

        UbiComp '17

        Acceptance Rates

        Overall Acceptance Rate 764 of 2,912 submissions, 26%



        Cited By

        • (2023) Mitigating Emotional Harm on Social Media: A Filtering Approach Using Synesketch and Euclidean Distance. Artificial Intelligence Doctoral Symposium, pp. 263-277. DOI: 10.1007/978-981-99-4484-2_20. Online publication date: 15-Jul-2023.
        • (2022) Emotion Detection and Classification Using Machine Learning Techniques. Multidisciplinary Applications of Deep Learning-Based Artificial Emotional Intelligence, pp. 11-31. DOI: 10.4018/978-1-6684-5673-6.ch002. Online publication date: 21-Oct-2022.
        • (2022) Emotion Classification using Physiological Signals: A Recent Survey. 2022 IEEE International Conference on Signal Processing, Informatics, Communication and Energy Systems (SPICES), pp. 333-338. DOI: 10.1109/SPICES52834.2022.9774240. Online publication date: 10-Mar-2022.
        • (2022) Biosignal-based user-independent recognition of emotion and personality with importance weighting. Multimedia Tools and Applications, 81(21), pp. 30219-30241. DOI: 10.1007/s11042-022-12711-8. Online publication date: 5-Apr-2022.
        • (2021) Unsupervised multi-modal representation learning for affective computing with multi-corpus wearable data. Journal of Ambient Intelligence and Humanized Computing, 14(4), pp. 3199-3224. DOI: 10.1007/s12652-021-03462-9. Online publication date: 9-Oct-2021.
        • (2020) Machine Learning and End-to-End Deep Learning for the Detection of Chronic Heart Failure From Heart Sounds. IEEE Access, 8, pp. 20313-20324. DOI: 10.1109/ACCESS.2020.2968900. Online publication date: 2020.
        • (2019) Using Deep Convolutional Neural Network for Emotion Detection on a Physiological Signals Dataset (AMIGOS). IEEE Access, 7, pp. 57-67. DOI: 10.1109/ACCESS.2018.2883213. Online publication date: 2019.
        • (2019) Calibrating the Classifier: Siamese Neural Network Architecture for End-to-End Arousal Recognition from ECG. Machine Learning, Optimization, and Data Science, pp. 1-13. DOI: 10.1007/978-3-030-13709-0_1. Online publication date: 14-Feb-2019.
