Constructing an Emotion Estimation Model Based on EEG/HRV Indexes Using Feature Extraction and Feature Selection Algorithms
Abstract
1. Introduction
2. Feature Extraction from EEG/HRV Data
2.1. EEG Indexes
2.2. HRV Indexes
3. Data Collection
3.1. Emotional Stimulus
3.2. Emotion Estimation toward Stimulus
- Emotions with Arousal > 5 and Valence ≥ 5 or Arousal = Valence = 5 belong to HAHV (the first quadrant);
- Emotions with Arousal ≥ 5 and Valence < 5 belong to HALV (the second quadrant);
- Emotions with Arousal < 5 and Valence ≤ 5 belong to LALV (the third quadrant);
- Emotions with Arousal ≤ 5 and Valence > 5 belong to LAHV (the fourth quadrant).
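As a minimal sketch (not taken from the authors' code), a pair of SAM arousal/valence ratings on the 9-point scales can be mapped to the four quadrant labels defined above as follows; the function name is illustrative.

```python
def quadrant(arousal: float, valence: float) -> str:
    """Map SAM arousal/valence ratings (1-9) to a quadrant label."""
    if (arousal > 5 and valence >= 5) or (arousal == 5 and valence == 5):
        return "HAHV"  # first quadrant
    if arousal >= 5 and valence < 5:
        return "HALV"  # second quadrant
    if arousal < 5 and valence <= 5:
        return "LALV"  # third quadrant
    return "LAHV"      # fourth quadrant: arousal <= 5 and valence > 5

print(quadrant(7, 3))  # -> "HALV"
```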
3.3. Experimental Procedure and Environment
- (1) The participant sits on a chair and wears the EEG sensor, pulse sensor, and earphones. Then, the recording of EEG and pulse wave data is started.
- (2) The participant practices the experiment using a simplified version of steps (3) and (4).
- (3) The participant waits for 10 min in a resting state (the first rest).
- (4) The participant listens to the music for 1 min (the same 15-s song is repeated 4 times) and then uses the SAM to self-assess the emotion evoked by the music with no time limit. He/she then rests for 2 min.
- (5) Steps (3) and (4) are repeated until eight trials are finished (the music is changed for each trial). Then, the recording of EEG and pulse wave data is stopped.
3.4. Dataset Construction
4. Feature Selection
4.1. Correlation Ratio (CR)
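As a rough illustration (not the authors' implementation), the correlation ratio of a single physiological index against the categorical emotion labels can be computed as the ratio of between-class variance to total variance; whether the squared or unsquared form is used does not change the feature ranking.

```python
import numpy as np

def correlation_ratio(feature: np.ndarray, labels: np.ndarray) -> float:
    """Correlation ratio (here eta squared): between-class variance of
    `feature` divided by its total variance, given categorical `labels`."""
    grand_mean = feature.mean()
    total = np.sum((feature - grand_mean) ** 2)
    if total == 0.0:
        return 0.0
    between = 0.0
    for c in np.unique(labels):
        group = feature[labels == c]
        between += group.size * (group.mean() - grand_mean) ** 2
    return float(between / total)
```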
4.2. Mutual Information (MI)
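A minimal sketch, assuming scikit-learn's mutual_info_classif is the library routine used; the arrays below are placeholders for the 36-index feature matrix and the emotion labels.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

# Placeholder data: 100 samples x 36 physiological indexes, 4 emotion classes.
X = np.random.rand(100, 36)
y = np.random.randint(0, 4, size=100)

# Estimated mutual information between each index and the emotion labels,
# used as a per-feature importance score.
mi_scores = mutual_info_classif(X, y, random_state=0)
```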
4.3. Importance of Random Forest (RF)
- The number of trees in the forest: 1000
- Criterion: Gini impurity (default)
- The maximum depth of the tree: None (default)
- The minimum number of samples required to split an internal node: 2 (default)
- Bootstrap: True (default)
- All other required parameters are set as default by the library.
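A sketch of the settings listed above, assuming the library referred to is scikit-learn; X and y are placeholders for the feature matrix and emotion labels.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

X = np.random.rand(100, 36)            # placeholder feature matrix
y = np.random.randint(0, 4, size=100)  # placeholder emotion labels

rf = RandomForestClassifier(
    n_estimators=1000,      # number of trees in the forest
    criterion="gini",       # Gini impurity (default)
    max_depth=None,         # default
    min_samples_split=2,    # default
    bootstrap=True,         # default
)
rf.fit(X, y)
importances = rf.feature_importances_  # impurity-based feature importance
```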
4.4. SVM L1 Regularization Weight (SVM L1)
- Kernel: Linear
- The norm used in the penalization: L1 (assigning coefficients/weights to the features)
- Regularization parameter (C): 1.0 (default)
- All other required parameters are set as default by the library.
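A sketch of the settings listed above, assuming scikit-learn's LinearSVC (a linear-kernel SVM that supports the L1 penalty); the absolute coefficient values are treated as feature importances, and X and y are placeholders.

```python
import numpy as np
from sklearn.svm import LinearSVC

X = np.random.rand(100, 36)            # placeholder feature matrix
y = np.random.randint(0, 2, size=100)  # placeholder binary emotion labels

svm = LinearSVC(penalty="l1", dual=False, C=1.0)  # L1 penalty, linear kernel
svm.fit(X, y)

# Mean absolute coefficient per feature, used as its importance weight.
weights = np.abs(svm.coef_).mean(axis=0)
```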
4.5. Feature Selection Ensemble
- The feature importance of each feature was calculated for each feature selection algorithm. Note that the features are the physiological indexes consisting of 22 EEG indexes and 14 HRV indexes.
- The feature importance values were normalized so that the maximum value was 1 and the minimum value was 0.
- For each feature, the average of the normalized importance values from the four feature selection algorithms was calculated.
- All features were sorted in descending order by the average normalized feature importance values.
- The indexes in the top 10 were selected as important features.
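A sketch of the ensemble procedure above, with placeholder scores; in the study, the rows would hold the importance values produced by CR, MI, RF, and SVM L1 for the 36 indexes.

```python
import numpy as np

scores = np.random.rand(4, 36)  # placeholder: 4 selection methods x 36 indexes

# Normalize each method's scores so that its maximum is 1 and minimum is 0.
mins = scores.min(axis=1, keepdims=True)
maxs = scores.max(axis=1, keepdims=True)
normalized = (scores - mins) / (maxs - mins)

# Average the normalized importance values over the four methods,
# then take the 10 features with the largest averages.
ensemble_importance = normalized.mean(axis=0)
top10 = np.argsort(ensemble_importance)[::-1][:10]
```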
5. Accuracy Verification and Discussion
5.1. Combination of Features
- The features were selected based on types of physiological indexes and calculation methods (Groups #1 to #5).
- All features employed in this research were selected (Group #6).
- The features were selected by each of the five feature selection approaches (i.e., the ensemble of the four feature selection methods plus the individual methods: correlation ratio, mutual information, importance of random forest, and SVM L1 regularization weight) in combination with the three emotion classification models (i.e., “HAHV, HALV, LALV, and LAHV”, “Low/High Arousal”, and “Low/High Valence”) (Groups #7 to #21).
- EEG group (#1) consists of all 11 EEG indexes employed in this study.
- MA15 EEG group (#2) consists of 15-window-sized moving averages of all 11 EEG indexes.
- TD HRV group (#3) consists of all 11 HRV indexes calculated by time-domain analysis.
- FD HRV group (#4) consists of all 3 HRV indexes calculated by frequency-domain analysis.
- TD HRV + FD HRV group (#5) consists of the combination of indexes from TD HRV (#3) and FD HRV (#4) groups.
- ALL group (#6) combines all indexes from the EEG (#1), MA15 EEG (#2), TD HRV (#3), and FD HRV (#4) groups.
- ENSEMBLE (HAHV, HALV, LALV, and LAHV) group (#7) consists of the top 10 indexes from the feature selection ensemble that contribute most to emotion estimation in the four-class emotion classification of HAHV, HALV, LALV, and LAHV.
- ENSEMBLE (Low/High Arousal) group (#8) consists of the top 10 indexes from the feature selection ensemble that contribute most to emotion estimation in the binary emotion classification into Low Arousal and High Arousal.
- ENSEMBLE (Low/High Valence) group (#9) consists of the top 10 indexes from the feature selection ensemble that contribute most to emotion estimation in the binary emotion classification into Low Valence and High Valence.
- CR (HAHV, HALV, LALV, and LAHV) group (#10) consists of the top 10 indexes from the correlation ratio as the feature selection method that contribute most to emotion estimation in the four-class emotion classification of HAHV, HALV, LALV, and LAHV.
- CR (Low/High Arousal) group (#11) consists of the top 10 indexes from the correlation ratio as the feature selection method that contribute most to emotion estimation in the binary emotion classification into Low Arousal and High Arousal.
- CR (Low/High Valence) group (#12) consists of the top 10 indexes from the correlation ratio as the feature selection method that contribute most to emotion estimation in the binary emotion classification into Low Valence and High Valence.
- MI (HAHV, HALV, LALV, and LAHV) group (#13) consists of the top 10 indexes from the mutual information as the feature selection method that contribute most to emotion estimation in the four-class emotion classification of HAHV, HALV, LALV, and LAHV.
- MI (Low/High Arousal) group (#14) consists of the top 10 indexes from the mutual information as the feature selection method that contribute most to emotion estimation in the binary emotion classification into Low Arousal and High Arousal.
- MI (Low/High Valence) group (#15) consists of the top 10 indexes from the mutual information as the feature selection method that contribute most to emotion estimation in the binary emotion classification into Low Valence and High Valence.
- RF (HAHV, HALV, LALV, and LAHV) group (#16) consists of the top 10 indexes from the importance of random forest as the feature selection method that contribute most to emotion estimation in the four-class emotion classification of HAHV, HALV, LALV, and LAHV.
- RF (Low/High Arousal) group (#17) consists of the top 10 indexes from the importance of random forest as the feature selection method that contribute most to emotion estimation in the binary emotion classification into Low Arousal and High Arousal.
- RF (Low/High Valence) group (#18) consists of the top 10 indexes from the importance of random forest as the feature selection method that contribute most to emotion estimation in the binary emotion classification into Low Valence and High Valence.
- SVM L1 (HAHV, HALV, LALV, and LAHV) group (#19) consists of the top 10 indexes from the SVM L1 regularization weight as the feature selection method that contribute most to emotion estimation in the four-class emotion classification of HAHV, HALV, LALV, and LAHV.
- SVM L1 (Low/High Arousal) group (#20) consists of the top 10 indexes from the SVM L1 regularization weight as the feature selection method that contribute most to emotion estimation in the binary emotion classification into Low Arousal and High Arousal.
- SVM L1 (Low/High Valence) group (#21) consists of the top 10 indexes from the SVM L1 regularization weight as the feature selection method that contribute most to emotion estimation in the binary emotion classification into Low Valence and High Valence.
5.2. Cross Validation
5.3. Accuracy Verification Indexes
5.4. Accuracy Verification Results
6. Discussion
- A model with high accuracy can be achieved even without using all features from the physiological signals, suggesting that accuracy is not always improved by combining a large number of multimodal physiological indexes.
- A model using features from only one type of physiological index, such as EEG or HRV, may achieve high accuracy; however, its variability tends to be large.
- The accuracy can be improved by applying a moving average to the original values of the physiological indexes.
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
EEG Index/Band | Frequency Band (Hz) | Interpretation |
---|---|---|
δ | 1–3 | Deepest sleep without dreams, unconscious, non-REM sleep, cognitive task by frontal lobe |
θ | 4–7 | Intuitive, creative, dream, recall, fantasy, imaginary, REM sleep |
α 1 | 8–12 | Relaxed but not sleepy, tranquil, conscious |
β 2 | 13–30 | Stress, wide awake, excited, conscious |
γ 3 | 31–50 | Cognition, motor function, higher mental activity |
Low α | 8–9 | Relaxed, peaceful, conscious |
High α | 10–12 | Relaxed but focused |
Low β | 13–17 | Thinking, accidents and environmental awareness, relaxed yet focused, integrated |
High β | 18–30 | Alert, upset, agitation |
Low γ | 31–40 | Memory, higher mental activity |
Mid γ | 41–50 | Visual information processing |
MA15 x 4 where x = {θ, δ, α, β, γ, Low α, High α, Low β, High β, Low γ, Mid γ} | Note 5 | Note 5
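The MA15 rows above denote a 15-window-sized moving average of each EEG index (cf. Group #2 later). A minimal sketch, assuming pandas and illustrative per-sample data:

```python
import numpy as np
import pandas as pd

# Illustrative EEG index time series (values are random placeholders).
eeg = pd.DataFrame(np.random.rand(60, 3), columns=["alpha", "beta", "gamma"])

# 15-window moving average of each index; early rows use the samples
# available so far (min_periods=1 is an assumption, not from the paper).
ma15 = eeg.rolling(window=15, min_periods=1).mean().add_prefix("MA15 ")
```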
HRV Index | Definition | Interpretation |
---|---|---|
Inter-beat Interval (IBI) | Time interval between adjacent heartbeats | Sympathetic and parasympathetic nerves |
Heart Rate (HR) | Number of beats per minute | Tension, Calm |
pNNx 1 where x = {10, 20, 30, 40, 50} | Percentage of adjacent IBI pairs whose absolute difference is greater than x ms | Parasympathetic nerve
SDNN 1 | Standard deviation of IBI | Sympathetic and parasympathetic nerves |
RMSSD 1 | Root mean square of successive IBI differences | Parasympathetic nerve
SDNN/RMSSD 1 | Ratio of SDNN to RMSSD | Sympathetic nerve
CVNN 1 | Coefficient of variation of IBI | Sympathetic and parasympathetic nerves |
LF 2 | Power of the IBI spectrum in the 0.04–0.15 Hz band (frequency-domain analysis) | Sympathetic and parasympathetic nerves
HF 2 | Power of the IBI spectrum in the 0.15–0.40 Hz band (frequency-domain analysis) | Parasympathetic nerve
LF/HF 2 | Ratio of LF to HF | Sympathetic nerve
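As a rough illustration of the time-domain definitions above (not the authors' code), a few of the indexes can be computed from an inter-beat-interval series in milliseconds:

```python
import numpy as np

ibi = np.array([812.0, 798.0, 845.0, 830.0, 790.0, 805.0])  # illustrative IBIs (ms)
diffs = np.diff(ibi)

hr = 60000.0 / ibi.mean()                       # heart rate (beats per minute)
sdnn = ibi.std(ddof=1)                          # standard deviation of IBI
rmssd = np.sqrt(np.mean(diffs ** 2))            # root mean square of successive differences
cvnn = sdnn / ibi.mean()                        # coefficient of variation of IBI
pnn50 = np.mean(np.abs(diffs) > 50.0) * 100.0   # % of successive differences > 50 ms
```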
Group No. | Group Name | Feature Combination |
---|---|---|
#1 | EEG | θ, δ, Low α, High α, Low β, High β, Low γ, Mid γ, α, β, γ |
#2 | MA15 EEG | MA15 θ, MA15 δ, MA15 Low α, MA15 High α, MA15 Low β, MA15 High β, MA15 Low γ, MA15 Mid γ, MA15 α, MA15 β, MA15 γ |
#3 | TD HRV | IBI, HR, CVNN, SDNN, RMSSD, SDNN/RMSSD, pNN10, pNN20, pNN30, pNN40, pNN50 |
#4 | FD HRV | LF, HF, LF/HF |
#5 | TD HRV + FD HRV | IBI, HR, CVNN, SDNN, RMSSD, SDNN/RMSSD, pNN10, pNN20, pNN30, pNN40, pNN50, LF, HF, LF/HF |
#6 | ALL | θ, δ, Low α, High α, Low β, High β, Low γ, Mid γ, α, β, γ, MA15 θ, MA15 δ, MA15 Low α, MA15 High α, MA15 Low β, MA15 High β, MA15 Low γ, MA15 Mid γ, MA15 α, MA15 β, MA15 γ, IBI, HR, CVNN, SDNN, RMSSD, SDNN/RMSSD, pNN10, pNN20, pNN30, pNN40, pNN50, LF, HF, LF/HF |
#7 | ENSEMBLE (HAHV, HALV, LALV, LAHV) | LF, HF, LF/HF, RMSSD, SDNN, MA15 Mid γ, CVNN, pNN30, MA15 δ, pNN40 |
#8 | ENSEMBLE (Low/High Arousal) | RMSSD, SDNN/RMSSD, LF, LF/HF, HF, pNN30, pNN40, CVNN, SDNN, MA15 δ |
#9 | ENSEMBLE (Low/High Valence) | LF, MA15 Mid γ, HF, RMSSD, MA15 δ, LF/HF, SDNN, MA15 Low γ, MA15 γ, pNN40 |
#10 | CR (HAHV, HALV, LALV, LAHV) | MA15 Mid γ, LF/HF, MA15 γ, MA15 δ, MA15 Low γ, MA15 High β, SDNN/RMSSD, LF, MA15 High α, MA15 α |
#11 | CR (Low/High Arousal) | SDNN/RMSSD, LF/HF, MA15 δ, RMSSD, pNN10, MA15 Low α, MA15 Mid γ, MA15 Low β, Low α, pNN30 |
#12 | CR (Low/High Valence) | MA15 Mid γ, MA15 γ, MA15 δ, MA15 Low γ, MA15 α, MA15 Low α, MA15 θ, pNN50, MA15 High α, γ |
#13 | MI (HAHV, HALV, LALV, LAHV) | RMSSD, LF, HF, SDNN, CVNN, LF/HF, β, High β, Mid γ, γ |
#14 | MI (Low/High Arousal) | RMSSD, LF, HF, SDNN, High α, δ, Low β, θ, β, CVNN |
#15 | MI (Low/High Valence) | RMSSD, LF, β, SDNN, HF, γ, High β, δ, Low β, CVNN |
#16 | RF (HAHV, HALV, LALV, LAHV) | LF, HF, LF/HF, RMSSD, CVNN, SDNN/RMSSD, SDNN, MA15 Low γ, MA15 Mid γ, MA15 High β |
#17 | RF (Low/High Arousal) | LF, LF/HF, HF, RMSSD, SDNN/RMSSD, MA15 High β, CVNN, MA15 δ, MA15 θ, SDNN |
#18 | RF (Low/High Valence) | LF, HF, LF/HF, RMSSD, MA15 Low γ, CVNN, MA15 High β, MA15 Mid γ, MA15 High α, SDNN/RMSSD |
#19 | SVM L1 (HAHV, HALV, LALV, LAHV) | SDNN, pNN30, LF, HF, pNN40, pNN20, CVNN, pNN10, pNN50, MA15 Mid γ |
#20 | SVM L1 (Low/High Arousal) | pNN30, pNN40, MA15 Low β, pNN20, pNN10, RMSSD, MA15 δ, HR, MA15 High β, MA15 Low γ |
#21 | SVM L1 (Low/High Valence) | LF, HF, MA15 Mid γ, pNN40, MA15 δ, RMSSD, pNN30, pNN20, LF/HF, HR
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).