Emotion recognition from multi-channel EEG via deep forest

J. Cheng, M. Chen, C. Li, Y. Liu, R. Song, et al. — IEEE Journal of Biomedical and Health Informatics, 2020 (ieeexplore.ieee.org)
Recently, deep neural networks (DNNs) have been applied to emotion recognition tasks based on electroencephalography (EEG) and have achieved better performance than traditional algorithms. However, DNNs still suffer from the disadvantages of having many hyperparameters and requiring large amounts of training data. To overcome these shortcomings, in this article we propose a method for multi-channel EEG-based emotion recognition using deep forest. First, we account for the effect of the baseline signal by preprocessing the raw artifact-eliminated EEG signal with baseline removal. Second, we construct 2-D frame sequences by taking the spatial position relationships across channels into account. Finally, the 2-D frame sequences are input into a classification model built on deep forest, which can mine the spatial and temporal information of EEG signals to classify EEG emotions. The proposed method eliminates the need for the feature extraction of traditional methods, and the classification model is insensitive to hyperparameter settings, which greatly reduces the complexity of emotion recognition. To verify the feasibility of the proposed model, experiments were conducted on two public databases, DEAP and DREAMER. On the DEAP database, the average accuracies reach 97.69% and 97.53% for valence and arousal, respectively; on the DREAMER database, the average accuracies reach 89.03%, 90.41%, and 89.89% for valence, arousal, and dominance, respectively. These results show that the proposed method achieves higher accuracy than state-of-the-art methods.
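The preprocessing pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: the baseline-removal strategy (averaging the pre-stimulus segment per channel and subtracting it from the trial), the window length, the 9×9 grid size, and the channel-to-grid coordinates are all assumptions made for the example.

```python
import numpy as np

# Hypothetical channel layout: maps each EEG channel index to a (row, col)
# position on a sparse 2-D grid reflecting electrode placement on the scalp.
# A real layout would cover all channels of the headset (e.g. 32 for DEAP).
CHANNEL_POS = {0: (0, 3), 1: (0, 5), 2: (1, 1), 3: (1, 7)}

def remove_baseline(trial, baseline, win=128):
    """Subtract the mean pre-stimulus baseline from the trial, per channel.

    trial:    (channels, samples) stimulus-period EEG
    baseline: (channels, baseline_samples) pre-stimulus EEG
    win:      assumed window length in samples (e.g. 1 s at 128 Hz)
    """
    n_ch = baseline.shape[0]
    n_win = baseline.shape[1] // win
    # Average the baseline over its `win`-sample windows to get one
    # representative segment per channel, then tile it over the trial.
    mean_seg = baseline[:, :n_win * win].reshape(n_ch, n_win, win).mean(axis=1)
    reps = trial.shape[1] // win
    return trial[:, :reps * win] - np.tile(mean_seg, (1, reps))

def to_frames(signal, grid=(9, 9)):
    """Scatter each time sample's channel values into a sparse 2-D frame,
    yielding a frame sequence of shape (samples, rows, cols)."""
    frames = np.zeros((signal.shape[1],) + grid)
    for ch, (row, col) in CHANNEL_POS.items():
        frames[:, row, col] = signal[ch]
    return frames
```

The resulting frame sequence preserves the spatial arrangement of electrodes, so a downstream model (here, deep forest) can exploit spatial as well as temporal structure rather than treating channels as an unordered feature vector.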