
AffectNet: A Database for Facial Expression, Valence, and Arousal Computing in the Wild

Published: 01 January 2019

Abstract

Automated affective computing in the wild is a challenging problem in computer vision. Existing annotated databases of facial expressions in the wild are small and mostly cover discrete emotions (the categorical model). Annotated facial databases for affective computing in the continuous dimensional model (e.g., valence and arousal) are very limited. To meet this need, we collected, annotated, and prepared for public distribution a new database of facial emotions in the wild, called AffectNet. AffectNet contains more than 1,000,000 facial images collected from the Internet by querying three major search engines with 1,250 emotion-related keywords in six different languages. About half of the retrieved images were manually annotated for the presence of seven discrete facial expressions and the intensity of valence and arousal. AffectNet is by far the largest database of facial expression, valence, and arousal in the wild, enabling research on automated facial expression recognition in two different emotion models. Two baseline deep neural networks are used to classify images in the categorical model and predict the intensity of valence and arousal. Various evaluation metrics show that our deep neural network baselines outperform conventional machine learning methods and off-the-shelf facial expression recognition systems.
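Dimensional predictions like those of the valence/arousal baseline are typically scored with agreement metrics. As an illustration, here is a minimal sketch of one commonly reported metric, the concordance correlation coefficient (CCC); the label arrays are hypothetical, assuming valence annotations in [-1, 1]:

```python
import numpy as np

def ccc(pred, true):
    """Concordance correlation coefficient: agreement between predicted
    and annotated continuous values (e.g., valence or arousal)."""
    pred = np.asarray(pred, dtype=float)
    true = np.asarray(true, dtype=float)
    mp, mt = pred.mean(), true.mean()
    vp, vt = pred.var(), true.var()
    cov = ((pred - mp) * (true - mt)).mean()
    # CCC penalizes both low correlation and mean/scale mismatch.
    return 2 * cov / (vp + vt + (mp - mt) ** 2)

# Perfect agreement gives 1.0; any shift or scale error lowers the score.
print(ccc([0.1, -0.5, 0.8], [0.1, -0.5, 0.8]))  # → 1.0
```

Unlike plain Pearson correlation, CCC also penalizes systematic bias, which matters when a model predicts the right ordering of valence values but at a shifted offset.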




    Published In

    IEEE Transactions on Affective Computing  Volume 10, Issue 1
    January 2019
    141 pages

    Publisher

    IEEE Computer Society Press

    Washington, DC, United States

    Publication History

    Published: 01 January 2019

    Qualifiers

    • Research-article


    Cited By

    • (2024) Analysis of Learner’s Emotional Engagement in Online Learning Using Machine Learning Adam Robust Optimization Algorithm. Scientific Programming 2024:1. DOI: 10.1155/2024/8886197. Online publication date: 5-Jun-2024.
    • (2024) A Feature-Based Approach for Subtle Emotion Recognition in Realistic Scenarios. Companion of the 2024 ACM International Joint Conference on Pervasive and Ubiquitous Computing, 398-404. DOI: 10.1145/3675094.3678492. Online publication date: 5-Oct-2024.
    • (2024) Research Progress of EEG-Based Emotion Recognition: A Survey. ACM Computing Surveys 56:11, 1-49. DOI: 10.1145/3666002. Online publication date: 8-Jul-2024.
    • (2024) Refining Facial Expression Recognition with Bilinear ResSpikeNet (BRS-Net): A Deep Learning Approach. Proceedings of the 2024 2nd Asia Conference on Computer Vision, Image Processing and Pattern Recognition, 1-6. DOI: 10.1145/3663976.3664027. Online publication date: 26-Apr-2024.
    • (2024) Interactions for Socially Shared Regulation in Collaborative Learning: An Interdisciplinary Multimodal Dataset. ACM Transactions on Interactive Intelligent Systems 14:3, 1-34. DOI: 10.1145/3658376. Online publication date: 22-Apr-2024.
    • (2024) Does Hard-Negative Contrastive Learning Improve Facial Emotion Recognition? Proceedings of the 2024 7th International Conference on Machine Vision and Applications, 162-168. DOI: 10.1145/3653946.3653971. Online publication date: 12-Mar-2024.
    • (2024) A Survey of Cutting-edge Multimodal Sentiment Analysis. ACM Computing Surveys 56:9, 1-38. DOI: 10.1145/3652149. Online publication date: 25-Apr-2024.
    • (2024) An Investigation into the Impact of Occlusion on Facial Emotion Recognition in the Wild. Proceedings of the 17th International Conference on PErvasive Technologies Related to Assistive Environments, 365-368. DOI: 10.1145/3652037.3663932. Online publication date: 26-Jun-2024.
    • (2024) Revisiting Annotations in Online Student Engagement. Proceedings of the 2024 10th International Conference on Computing and Data Engineering, 111-117. DOI: 10.1145/3641181.3641186. Online publication date: 15-Jan-2024.
    • (2024) Exploring the Role of Empathy in Designing Social Robots for Elderly People. Adjunct Proceedings of the 32nd ACM Conference on User Modeling, Adaptation and Personalization, 120-125. DOI: 10.1145/3631700.3664887. Online publication date: 27-Jun-2024.
