DOI: 10.1145/3242969.3243023
ICMI '18 Conference Proceedings
Short paper · Open access

Automated Affect Detection in Deep Brain Stimulation for Obsessive-Compulsive Disorder: A Pilot Study

Published: 02 October 2018

Abstract

Automated measurement of affective behavior in psychopathology has been limited primarily to screening and diagnosis. While useful, clinicians more often are concerned with whether patients are improving in response to treatment: Are symptoms abating? Is affect becoming more positive? Are unanticipated side effects emerging? When treatment includes neural implants, the need for objective, repeatable biometrics tied to neurophysiology becomes especially pressing. We used automated face analysis to assess treatment response to deep brain stimulation (DBS) in two patients with intractable obsessive-compulsive disorder (OCD). One was assessed intraoperatively, following implantation and activation of the DBS device; the other was assessed three months post-implantation. Both were assessed during DBS-on and DBS-off conditions. Positive and negative valence were quantified using a CNN trained on normative data from 160 non-OCD participants; a secondary goal was thus domain transfer of the classifiers. In both contexts, DBS-on resulted in marked positive affect. In response to DBS-off, affect flattened in both contexts and alternated with increased negative affect in the outpatient setting. Mean AUC for domain transfer was 0.87. These findings suggest that parametric variation of DBS is strongly related to affective behavior and may introduce vulnerability to negative affect if DBS is discontinued.
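To make the evaluation described above concrete, the short Python sketch below shows, in outline, how a domain-transferred valence classifier's frame-level scores might be summarized with AUC and compared across DBS-on and DBS-off segments. This is a minimal illustration under stated assumptions, not the authors' code: the labels, scores, and condition mask are hypothetical stand-ins for the outputs of the paper's CNN (trained on 160 non-OCD participants) and for the patients' stimulation schedule.

# Minimal sketch (not the authors' code) of the abstract's evaluation:
# score a domain-transferred valence classifier with AUC and compare
# affect between DBS-on and DBS-off segments. All data are hypothetical.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_frames = 1_000

# Hypothetical ground-truth valence labels for OCD-domain video frames
# (1 = positive affect) and noisy per-frame scores standing in for the
# output of a CNN trained on non-OCD participants.
labels = rng.integers(0, 2, size=n_frames)
scores = labels * 0.8 + rng.normal(0.0, 0.5, size=n_frames)

# AUC measures how well the transferred classifier ranks positive-valence
# frames above negative ones; the paper reports a mean AUC of 0.87.
print(f"domain-transfer AUC: {roc_auc_score(labels, scores):.2f}")

# Compare mean valence score by stimulation condition, given a
# hypothetical per-frame DBS-on/off mask.
dbs_on = rng.integers(0, 2, size=n_frames).astype(bool)
print(f"mean valence score, DBS-on:  {scores[dbs_on].mean():+.2f}")
print(f"mean valence score, DBS-off: {scores[~dbs_on].mean():+.2f}")

Only the AUC summary and the on/off comparison are mirrored here; in the paper, per-frame positive and negative valence come from the CNN applied without retraining to the two DBS patients.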





    Published In

    ICMI '18: Proceedings of the 20th ACM International Conference on Multimodal Interaction
    October 2018
    687 pages
    ISBN: 9781450356923
    DOI: 10.1145/3242969
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org

    Sponsors

    • SIGCHI: Special Interest Group on Computer-Human Interaction of the ACM

    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Author Tags

    1. action units
    2. behavioral dynamics
    3. body expression
    4. deep brain stimulation
    5. facial expression
    6. obsessive compulsive disorder
    7. social signal processing

    Qualifiers

    • Short-paper


    Conference

    ICMI '18
    Sponsor: SIGCHI

    Acceptance Rates

    ICMI '18 Paper Acceptance Rate: 63 of 149 submissions, 42%
    Overall Acceptance Rate: 453 of 1,080 submissions, 42%


    Cited By

    • (2023) "Automatic Emotion Recognition in Clinical Scenario: A Systematic Review of Methods." IEEE Transactions on Affective Computing 14(2), 1675-1695. DOI: 10.1109/TAFFC.2021.3128787. Online publication date: 1-Apr-2023.
    • (2023) "Automated Emotional Valence Estimation in Infants with Stochastic and Strided Temporal Sampling." 2023 11th International Conference on Affective Computing and Intelligent Interaction (ACII), 1-8. DOI: 10.1109/ACII59096.2023.10388096. Online publication date: 10-Sep-2023.
    • (2022) "Resting-state EEG-based convolutional neural network for the diagnosis of depression and its severity." Frontiers in Physiology 13. DOI: 10.3389/fphys.2022.956254. Online publication date: 10-Oct-2022.
    • (2022) "Assessing Multimodal Dynamics in Multi-Party Collaborative Interactions with Multi-Level Vector Autoregression." Proceedings of the 2022 International Conference on Multimodal Interaction, 615-625. DOI: 10.1145/3536221.3556595. Online publication date: 7-Nov-2022.
    • (2021) "Bias and Fairness in Multimodal Machine Learning: A Case Study of Automated Video Interviews." Proceedings of the 2021 International Conference on Multimodal Interaction, 268-277. DOI: 10.1145/3462244.3479897. Online publication date: 18-Oct-2021.
    • (2021) "Smile Action Unit detection from distal wearable Electromyography and Computer Vision." 2021 16th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2021), 1-8. DOI: 10.1109/FG52635.2021.9667047. Online publication date: 15-Dec-2021.
    • (2021) "Facial Action Units and Head Dynamics in Longitudinal Interviews Reveal OCD and Depression Severity and DBS Energy." 2021 16th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2021), 1-6. DOI: 10.1109/FG52635.2021.9667028. Online publication date: 15-Dec-2021.
    • (2021) "Long-term ecological assessment of intracranial electrophysiology synchronized to behavioral markers in obsessive-compulsive disorder." Nature Medicine 27(12), 2154-2164. DOI: 10.1038/s41591-021-01550-z. Online publication date: 9-Dec-2021.
    • (2020) "Deep Brain Stimulation for Intractable Obsessive-Compulsive Disorder: Progress and Opportunities." American Journal of Psychiatry 177(3), 200-203. DOI: 10.1176/appi.ajp.2020.20010037. Online publication date: 1-Mar-2020.
    • (2020) "Multimodal, Multiparty Modeling of Collaborative Problem Solving Performance." Proceedings of the 2020 International Conference on Multimodal Interaction, 423-432. DOI: 10.1145/3382507.3418877. Online publication date: 22-Oct-2020.
