Abstract
The goal of this study is to enhance the emotional experience of a viewer by using enriched multimedia content that elicits tactile sensation in addition to the visual and auditory senses. A user-independent method of emotion recognition using electroencephalography (EEG) in response to tactile-enhanced multimedia (TEM) is presented with the aim of enriching the human experience of viewing digital content. Selected traditional multimedia clips are converted into TEM clips by synchronizing them with an electric fan and a heater to add cold and hot air effects. This gives the viewer a more realistic feel by engaging three human senses: vision, hearing, and touch. EEG data are recorded from 21 participants in response to the traditional multimedia clips and their TEM versions. The self-assessment manikin (SAM) scale is used to collect valence and arousal scores for each clip to validate the evoked emotions. A t-test is applied to the valence and arousal scores to test for a significant difference between the multimedia and TEM clips. The resulting p-values show that traditional multimedia and TEM content differ significantly in terms of valence and arousal scores, indicating that TEM clips evoke enhanced emotions. For emotion recognition, twelve time-domain features are extracted from the preprocessed EEG signal and a support vector machine is applied to classify four human emotions, i.e., happy, angry, sad, and relaxed. Accuracies of 43.90% and 63.41% are achieved for traditional multimedia and TEM clips, respectively, which shows that EEG-based emotion recognition performs better when the tactile sense is engaged.
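The analysis described above combines a paired t-test on SAM ratings with time-domain EEG features and an SVM classifier. The following is a minimal sketch of how such a pipeline could look in Python; the placeholder data, channel count, epoch length, feature set, and classifier parameters are illustrative assumptions and not the authors' exact implementation.

```python
# Sketch of the pipeline: SAM t-test, time-domain EEG features, SVM classification.
# All data below are random placeholders; shapes and parameters are assumptions.
import numpy as np
from scipy import stats
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# --- Paired t-test on SAM ratings (traditional vs. TEM version of each clip) ---
valence_mm = np.random.rand(21) * 9    # placeholder valence scores, traditional clips
valence_tem = np.random.rand(21) * 9   # placeholder valence scores, TEM clips
t_val, p_val = stats.ttest_rel(valence_mm, valence_tem)
print(f"valence: t = {t_val:.3f}, p = {p_val:.4f}")

# --- Time-domain features from a preprocessed EEG epoch (channels x samples) ---
def time_domain_features(epoch):
    """Return a vector of simple time-domain statistics computed per channel."""
    diff1 = np.diff(epoch, axis=1)            # first difference of the signal
    diff2 = np.diff(epoch, n=2, axis=1)       # second difference of the signal
    feats = [
        epoch.mean(axis=1), epoch.std(axis=1),
        np.abs(diff1).mean(axis=1), np.abs(diff2).mean(axis=1),
        stats.skew(epoch, axis=1), stats.kurtosis(epoch, axis=1),
    ]
    return np.concatenate(feats)

# --- SVM classification of the four emotion classes ---
rng = np.random.default_rng(0)
epochs = rng.standard_normal((84, 14, 1280))  # placeholder: trials x channels x samples
labels = rng.integers(0, 4, size=84)          # 0 = happy, 1 = angry, 2 = sad, 3 = relaxed
X = np.array([time_domain_features(e) for e in epochs])
clf = SVC(kernel="rbf", C=1.0)
scores = cross_val_score(clf, X, labels, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2%}")
```

In practice the feature matrix would be built from the recorded 21-participant EEG data rather than random arrays, and classifier accuracy would be compared between the traditional multimedia and TEM conditions as reported in the abstract.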