Editorial: Special Issue on Unobtrusive Physiological Measurement Methods for Affective Applications
In the formative years of affective computing [1], from the late 1990s into the early 2000s, a significant fraction of research attention was focused on the development of methods for unobtrusive physiological measurement. It quickly ...
WiFE: WiFi and Vision Based Unobtrusive Emotion Recognition via Gesture and Facial Expression
Emotion plays a critical role in making computers more human-like. As the first and most essential step, emotion recognition has recently emerged as a popular but relatively nascent topic; that is, current research mainly focuses on a single modality (e.g., facial ...
Emotion Arousal Assessment Based on Multimodal Physiological Signals for Game Users
Emotional arousal, an essential dimension of game users’ experience, plays a crucial role in determining whether a game is successful. Game users’ emotion arousal assessment (GUEA) is of great importance. However, GUEA often faces challenges,...
Two Birds With One Stone: Knowledge-Embedded Temporal Convolutional Transformer for Depression Detection and Emotion Recognition
Depression is a critical problem in modern society that affects an estimated 350 million people worldwide, causing feelings of sadness and a lack of interest and pleasure. Emotional disorders are gaining interest and are closely entwined with depression, ...
Group Synchrony for Emotion Recognition Using Physiological Signals
During group interactions, we react and modulate our emotions and behaviour to the group through phenomena including emotion contagion and physiological synchrony. Previous work on emotion recognition through video/image has shown that group context ...
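As a rough illustration of the kind of group-synchrony measure this line of work builds on, the sketch below computes an index as the average pairwise Pearson correlation of members' physiological traces over sliding windows. The function name, window length, and the choice of windowed correlation are assumptions for illustration only, not the measure used in the article.

```python
# Minimal sketch: group synchrony as the mean pairwise Pearson correlation
# over sliding windows. Illustrative only; not the article's pipeline.
import numpy as np
from itertools import combinations

def group_synchrony(signals: np.ndarray, window: int = 256, step: int = 64) -> np.ndarray:
    """signals: (n_members, n_samples), e.g., resampled heart-rate traces.
    Returns one synchrony score per sliding window."""
    n_members, n_samples = signals.shape
    scores = []
    for start in range(0, n_samples - window + 1, step):
        seg = signals[:, start:start + window]
        corrs = [np.corrcoef(seg[i], seg[j])[0, 1]
                 for i, j in combinations(range(n_members), 2)]
        scores.append(np.mean(corrs))
    return np.array(scores)

# Example: three group members, 10 s of a signal sampled at 64 Hz.
rng = np.random.default_rng(0)
print(group_synchrony(rng.standard_normal((3, 640))).shape)  # (7,)
```

Such per-window scores can then be supplied, alongside individual features, to an emotion classifier so that group context is taken into account.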
Virtual Reality for Emotion Elicitation – A Review
Emotions are multifaceted phenomena that affect our behaviour, perception, and cognition. Increasing evidence indicates that induction mechanisms play a crucial role in triggering emotions by simulating the sensations required for an experimental design. ...
A Neural Predictive Model of Negative Emotions for COVID-19
- Yu Mao,
- Dongtao Wei,
- Wenjing Yang,
- Qunlin Chen,
- Jiangzhou Sun,
- Yaxu Yu,
- Yu Li,
- Kaixiang Zhuang,
- Xiaoqin Wang,
- Li He,
- Tingyong Feng,
- Xu Lei,
- Qinghua He,
- Hong Chen,
- Shaozheng Qin,
- Yunzhe Liu,
- Jiang Qiu
The long-lasting global pandemic of Coronavirus disease 2019 (COVID-19) has changed our daily life in many ways and placed a heavy burden on our mental health. Having a predictive model of negative emotions during COVID-19 is of great importance for ...
Graph-Based Facial Affect Analysis: A Review
As one of the most important affective signals, facial affect analysis (FAA) is essential for developing human-computer interaction systems. Early methods focus on extracting appearance and geometry features associated with human affects while ignoring ...
Survey on Emotion Sensing Using Mobile Devices
- Kangning Yang,
- Benjamin Tag,
- Chaofan Wang,
- Yue Gu,
- Zhanna Sarsenbayeva,
- Tilman Dingler,
- Greg Wadley,
- Jorge Goncalves
The rapid development and ubiquity of mobile and wearable devices promise to enable researchers to monitor users’ granular emotional data in a less intrusive manner. Researchers have used a wide variety of mobile and wearable devices for this ...
Emotion Expression in Human Body Posture and Movement: A Survey on Intelligible Motion Factors, Quantification and Validation
Many areas in computer science are facing the need to analyze, quantify and reproduce movements expressing emotions. This paper presents a systematic review of the intelligible factors involved in the expression of emotions in human movement and posture. ...
Empathetic Conversational Systems: A Review of Current Advances, Gaps, and Opportunities
Empathy is a vital factor that contributes to mutual understanding and joint problem-solving. In recent years, a growing number of studies have recognized the benefits of empathy and started to incorporate empathy in conversational systems. We refer to ...
EEG-Based Subject-Independent Emotion Recognition Using Gated Recurrent Unit and Minimum Class Confusion
Automatic emotion recognition based on electroencephalogram (EEG) signals has attracted rapidly increasing interest. Due to large inter-subject variability, subject-independent emotion recognition faces great challenges. Recently, domain adaptation methods ...
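The title points to a gated recurrent unit classifier trained with a minimum-class-confusion style objective on unlabeled target-subject data. The sketch below shows one plausible, simplified arrangement of those two pieces; the feature dimension, the temperature, and the exact form of the confusion penalty are assumptions for illustration, not the article's configuration.

```python
# Sketch: GRU over EEG feature sequences plus a simplified
# "minimum class confusion" regularizer on unlabeled target-subject data.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GRUEmotionNet(nn.Module):
    def __init__(self, in_dim=310, hidden=128, n_classes=3):
        super().__init__()
        self.gru = nn.GRU(in_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):            # x: (batch, time, in_dim)
        _, h = self.gru(x)           # h: (1, batch, hidden)
        return self.head(h[-1])      # logits: (batch, n_classes)

def confusion_penalty(target_logits, temperature=2.0):
    """Build a class-correlation matrix from temperature-scaled predictions
    and penalize its off-diagonal mass (simplified MCC-style loss)."""
    p = F.softmax(target_logits / temperature, dim=1)   # (B, K)
    c = p.t() @ p                                       # (K, K)
    c = c / c.sum(dim=1, keepdim=True)                  # row-normalize
    return (c.sum() - c.trace()) / c.size(0)

model = GRUEmotionNet()
src_x, src_y = torch.randn(8, 20, 310), torch.randint(0, 3, (8,))
tgt_x = torch.randn(8, 20, 310)      # unlabeled data from the unseen subject
loss = F.cross_entropy(model(src_x), src_y) + confusion_penalty(model(tgt_x))
loss.backward()
```

The regularizer needs no target labels, which is what makes it usable in the subject-independent setting.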
Spatial-Temporal Graphs Plus Transformers for Geometry-Guided Facial Expression Recognition
Facial expression recognition (FER) is of great interest to the current studies of human-computer interaction. In this paper, we propose a novel geometry-guided facial expression recognition framework, based on graph convolutional networks and ...
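To make the title's components concrete, the sketch below strings together the generic pattern of a graph operation over facial landmarks per frame followed by a transformer over frames. The landmark count, the identity-placeholder adjacency, and the layer sizes are assumptions for illustration, not the architecture proposed in the article.

```python
# Sketch: per-frame graph propagation over facial landmarks, then a
# temporal transformer over frames. Illustrative only.
import torch
import torch.nn as nn

class LandmarkGraphTransformer(nn.Module):
    def __init__(self, n_landmarks=68, coord_dim=2, hidden=64, n_classes=7):
        super().__init__()
        # Identity placeholder; a real adjacency would encode landmark connectivity.
        self.register_buffer('adj', torch.eye(n_landmarks))
        self.proj = nn.Linear(coord_dim, hidden)
        layer = nn.TransformerEncoderLayer(d_model=hidden, nhead=4, batch_first=True)
        self.temporal = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, landmarks):                 # (batch, time, n_landmarks, coord_dim)
        x = self.proj(landmarks)                  # per-landmark features
        x = torch.einsum('ij,btjh->btih', self.adj, x)   # one graph-propagation step
        x = x.mean(dim=2)                         # pool landmarks -> per-frame token
        x = self.temporal(x)                      # model dynamics across frames
        return self.head(x.mean(dim=1))           # expression logits

logits = LandmarkGraphTransformer()(torch.randn(4, 16, 68, 2))
print(logits.shape)   # torch.Size([4, 7])
```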
Interaction of Cognitive and Affective Load Within a Virtual City
Spatial navigation is an important aspect of everyday life but may be negatively impacted by both cognitive and affective load. Cognitive and affective load may be measured via autonomic arousal and increased load may lead to reduced navigational ...
TensorFormer: A Tensor-Based Multimodal Transformer for Multimodal Sentiment Analysis and Depression Detection
Sentiment analysis is an important research field aiming to extract and fuse sentimental information from human utterances. Due to the diversity of human sentiment, analyzing from multiple modalities is usually more accurate than from a single modality. ...
Pars-OFF: A Benchmark for Offensive Language Detection on Farsi Social Media
- Taha Shangipour Ataei,
- Kamyar Darvishi,
- Soroush Javdan,
- Amin Pourdabiri,
- Behrouz Minaei-Bidgoli,
- Mohammad Taher Pilehvar
With the increasing use of social media and its ability for users to share comments immediately, systems that identify offensive content have become a necessity in all languages. Due to the lack of publicly available resources on offensive ...
MIA-Net: Multi-Modal Interactive Attention Network for Multi-Modal Affective Analysis
When a multi-modal affective analysis model generalizes from a bimodal task to a trimodal or multi-modal task, it is usually transformed into a hierarchical fusion model based on every two pairwise modalities, similar to a binary tree structure. This ...
Estimating the Uncertainty in Emotion Class Labels With Utterance-Specific Dirichlet Priors
Emotion recognition is a key attribute for artificial intelligence systems that need to naturally interact with humans. However, the task definition is still an open problem due to the inherent ambiguity of emotions. In this paper, a novel Bayesian ...
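One common way to make such label ambiguity explicit is to have the network predict a Dirichlet distribution over class proportions for each utterance: the mean of the Dirichlet gives the class probabilities, and its total concentration reflects how certain the prediction is. The sketch below follows that general evidential-style formulation; the dimensions, the softplus parameterization, and the training loss mentioned in the comment are assumptions, not necessarily the article's exact model.

```python
# Sketch: an utterance-level head that predicts Dirichlet concentration
# parameters, yielding class probabilities plus an uncertainty score.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DirichletEmotionHead(nn.Module):
    def __init__(self, feat_dim=256, n_classes=4):
        super().__init__()
        self.fc = nn.Linear(feat_dim, n_classes)

    def forward(self, features):                       # (batch, feat_dim)
        alpha = F.softplus(self.fc(features)) + 1.0    # concentrations > 1
        strength = alpha.sum(dim=1, keepdim=True)      # total evidence
        probs = alpha / strength                       # expected class distribution
        uncertainty = alpha.size(1) / strength         # high when evidence is low
        return alpha, probs, uncertainty

head = DirichletEmotionHead()
alpha, probs, unc = head(torch.randn(8, 256))
print(probs.sum(dim=1))   # each row sums to 1
# Training could, e.g., match probs to (possibly soft) annotator labels while a
# KL term keeps alpha near uniform for ambiguous utterances.
```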
Semi-Structural Interview-Based Chinese Multimodal Depression Corpus Towards Automatic Preliminary Screening of Depressive Disorders
Depression is a common psychiatric disorder worldwide. However, in China, a considerable number of patients with depression are not diagnosed, and most of them are not aware of their depression. Despite increasing efforts, the goal of automatic depression ...
Fine-Grained Domain Adaptation for Aspect Category Level Sentiment Analysis
Aspect category level sentiment analysis aims to identify the sentiment polarities towards the aspect categories discussed in a sentence. It usually suffers from a lack of labeled data. A popular solution is to transfer knowledge from a labeled source ...
Smart Affect Monitoring With Wearables in the Wild: An Unobtrusive Mood-Aware Emotion Recognition System
Affective computing strives to recognize a person's affective state (e.g., emotion, mood) based on what can be observed. However, electroencephalogram (EEG) and video technologies have not been widely adopted for daily life affect monitoring due to ...
Frustration Recognition Using Spatio-Temporal Data: A Novel Dataset and GCN Model to Recognize In-Vehicle Frustration
Frustration is an unpleasant emotion prevalent in several target applications of affective computing, such as human-machine interaction, learning, (online) customer interaction, and gaming. One idea to address this issue is to recognize frustration to ...
Multi-Order Networks for Action Unit Detection
Action Units (AU) are muscular activations used to describe facial expressions. Therefore, accurate AU recognition unlocks unbiased face representations which can improve face-based affective computing applications. From a learning standpoint, AU detection ...
Shared-Private Memory Networks For Multimodal Sentiment Analysis
Text, visual, and acoustic are usually complementary in the Multimodal Sentiment Analysis (MSA) task. However, current methods primarily concern shared representations while neglecting the critical private aspects of data within individual modalities. In ...
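A minimal way to see the shared-versus-private idea is to give each modality both a projection into a common space and a modality-specific private projection, with an orthogonality penalty keeping the two apart before fusion. The sketch below does exactly that and nothing more; the feature dimensions and the simple concatenation fusion (in place of the article's memory networks) are assumptions for illustration.

```python
# Sketch: shared and private projections per modality with an orthogonality
# penalty, fused by concatenation for sentiment regression. Illustrative only.
import torch
import torch.nn as nn

class SharedPrivateEncoders(nn.Module):
    def __init__(self, dims=None, hidden=64):
        super().__init__()
        dims = dims or {'text': 300, 'audio': 74, 'vision': 35}   # assumed feature sizes
        self.shared = nn.ModuleDict({m: nn.Linear(d, hidden) for m, d in dims.items()})
        self.private = nn.ModuleDict({m: nn.Linear(d, hidden) for m, d in dims.items()})
        self.head = nn.Linear(2 * hidden * len(dims), 1)          # sentiment score

    def forward(self, inputs):                 # dict: modality -> (batch, dim)
        shared = {m: self.shared[m](x) for m, x in inputs.items()}
        private = {m: self.private[m](x) for m, x in inputs.items()}
        # Penalize overlap between each modality's shared and private features.
        ortho = sum((s * p).sum(dim=1).pow(2).mean()
                    for s, p in zip(shared.values(), private.values()))
        fused = torch.cat(list(shared.values()) + list(private.values()), dim=1)
        return self.head(fused), ortho

model = SharedPrivateEncoders()
batch = {'text': torch.randn(8, 300), 'audio': torch.randn(8, 74), 'vision': torch.randn(8, 35)}
score, ortho_loss = model(batch)   # add ortho_loss (weighted) to the task loss
```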
Perceived Conversation Quality in Spontaneous Interactions
The quality of daily spontaneous conversations matters both for our well-being and for the development of interactive social agents. Prior research directly studying the quality of social conversations has operationalized it in narrow ...
Emotional Expressivity is a Reliable Signal of Surprise
We consider the problem of inferring what happened to a person in a social task from momentary facial reactions. To approach this, we introduce several innovations. First, rather than predicting what (observers think) someone feels, we predict objective ...
MMPosE: Movie-Induced Multi-Label Positive Emotion Classification Through EEG Signals
- Xiaobing Du,
- Xiaoming Deng,
- Hangyu Qin,
- Yezhi Shu,
- Fang Liu,
- Guozhen Zhao,
- Yu-Kun Lai,
- Cuixia Ma,
- Yong-Jin Liu,
- Hongan Wang
Emotional information plays an important role in various multimedia applications. Movies, as a widely available form of multimedia content, can induce multiple positive emotions and stimulate people's pursuit of a better life. Different from ...
Emotional Contagion-Aware Deep Reinforcement Learning for Antagonistic Crowd Simulation
Antagonistic behavior in a crowd usually exacerbates the seriousness of the situation in sudden riots, where antagonistic emotional contagion and behavioral decision making play very important roles. However, the complex mechanism of ...
Audio-Visual Emotion Recognition With Preference Learning Based on Intended and Multi-Modal Perceived Labels
This article introduces a novel preference learning framework that simultaneously considers both the intended and the perceived labels while addressing the mismatches between them. Based on analyzing the discrepancies and agreements between the intended ...
Driver Emotion Recognition With a Hybrid Attentional Multimodal Fusion Framework
- Luntian Mou,
- Yiyuan Zhao,
- Chao Zhou,
- Bahareh Nakisa,
- Mohammad Naim Rastgoo,
- Lei Ma,
- Tiejun Huang,
- Baocai Yin,
- Ramesh Jain,
- Wen Gao
Negative emotions may induce dangerous driving behaviors leading to extremely serious traffic accidents. Therefore, it is necessary to establish a system that can automatically recognize driver emotions so that some actions can be taken to avoid traffic ...