
A closed-loop Brain-Computer Music Interface for continuous affective interaction

Stefan Ehrlich
Chair for Cognitive Systems
Technische Universität München
München, Germany
Email: stefan.ehrlich@tum.de

Cuntai Guan
School of Computer Science and Engineering
Nanyang Technological University
Singapore
Email: ctguan@ntu.edu.sg

Gordon Cheng
Chair for Cognitive Systems
Technische Universität München
München, Germany
Email: gordon@tum.de

Abstract—Research on human emotions and underlying brain processes is mostly performed open-loop, e.g. by presenting emotional stimuli and measuring subjects' brain responses. Investigating human emotions in interaction with emotional stimuli (closed-loop) significantly complicates experimental setups and has so far rarely been proposed. We present the concept and technical realization of an electroencephalography (EEG)-based affective Brain-Computer Interface (BCI) to study emotional brain processes in continuous closed-loop interaction. Our BCI consists of an algorithm generating continuous patterns of synthesized affective music, embedded in an online BCI architecture. An initial calibration is employed to obtain user-specific models associating EEG patterns with the affective content of musical patterns. These models are then used in online application to translate the user's affect into a continuous musical representation; playing this representation back to the user results in closed-loop affective brain interactions. The proposed BCI provides a platform to stimulate the brain in a closed-loop fashion, offering novel approaches to study human sensorimotor integration and emotions.

Keywords—Brain-Computer Interface (BCI), affective BCI, neurofeedback, emotions

I. INTRODUCTION

In emotion research, Damasio's work showed how crucial emotions are to cognitive brain processes [1]. His research suggests that emotions serve as an important regulator for higher-order brain mechanisms involved in reasoning and decision making. Moreover, several challenging and hardly manageable mental diseases, such as depression, have their origin in affective dysfunctions [2]. The rise of Affective Computing [3] has led the technical community to take an increasing interest in the neuroscientific underpinnings of emotions. Hence, a better understanding of human emotions has interdisciplinary impact on many fields, such as technology, healthcare, and the human sciences.

Figure 1. Conceptual illustration of the affective music BCI: During calibration (top) the user is exposed to algorithmically synthesized patterns of affective music; brain activity is measured simultaneously via EEG; EEG patterns are extracted and used to build a user-specific emotion model. During online application (bottom) the obtained model is used to translate the user's brain activity continuously into a musical representation; closing the loop by playing back the musical representation to the user results in continuous affective brain interactions.

This work presents a novel non-invasive Brain-Computer Interface (BCI) that feeds back a subject's emotional state in such a way that closed-loop affective brain interaction is established. This affective interaction forms an alternative way of stimulating the brain and offers novel approaches to gain fundamental knowledge about emotions, related brain processes, and dysfunctions in these structures.

The remainder of this paper is structured as follows: In the next section the affective BCI concept and related work are introduced, followed by the technical realization and evaluation in section III. In section IV we discuss the work, and section V concludes.

II. CONCEPT AND RELATED WORK

A field relevant to our concept is commonly referred to as EEG emotion recognition and affective BCI. It aims at modeling links between neural correlates and emotions to enable a computer device to detect a user's affective state by means of recorded EEG signals. A common procedure is to expose subjects to affective stimuli, such as pictures, sounds, or music, and let them subjectively rate their corresponding emotional responses while brain activity is measured simultaneously. The subjective ratings are later used to identify neural correlates, which are subsequently used to build emotion models by means of statistical modeling [5]. Decoding human affective states is interesting for a multitude of applications, ranging from the augmentation of human-computer and human-robot interaction [6], gaming and entertainment, and e-learning, to the neuro-rehabilitation of psychiatric disorders [7].

A second field relevant to our concept is EEG sonification and Brain-Computer Music Interfaces. The basic idea is to translate human brain activity into sound or music. The first Brain-Computer Music Interface to premiere was Alvin Lucier's "Music for Solo Performer", presented in 1965 [8]. More recent works have been presented by Makeig et al. in 2011 [9], De Smedt and Menschaert in 2012 [10], and most recently by Deuel et al. in 2017 [11]. The majority of these works focused on artistic purposes, with the latter in particular raising the potential use of their system as a biofeedback device.

Our concept: The core component of our concept, and the major contrast to previous works, is the utilization of a single parametrizable music synthesis algorithm (see Figure 1). This algorithm allows for generating musical sequences with seamless and continuous transitions between patterns of different emotional expressiveness. As such it is universally applicable in both the calibration phase (EEG-based affect modeling) and the online application phase (continuous EEG-based affect translation). The major benefit is improved calibration-to-application immediacy and transferability, but also high flexibility for developing innovative stimulation/calibration protocols.

III. TECHNICAL REALIZATION AND EVALUATION

A. Algorithm to generate synthesized affective music (composing unit)

Music can be considered a combination of multiple harmonic, rhythmic, and timbre components which change over time and thus form a musical piece. Based on that principle, we designed and implemented an algorithm to generate continuous patterns of affective music whose basic architecture was mainly inspired by Wallis et al. in 2011 [12] (for the remainder of this paper, this algorithm is called the composing unit)¹. The composing unit was implemented as a state machine (see Figure 2 (b)) generating streams of MIDI events. Transitions and states are modulated by several continuously controllable music structural parameters, namely: harmonic mode, tempo, rhythmic shape, pitch, and relative loudness of subsequent notes. According to their settings, different musical patterns are generated. Emotional expressiveness of these patterns is realized by introducing mappings of emotion-related parameters (valence and arousal) onto the music structural parameters (see Figure 2 (a)). For these mappings, we employed a subset of the functional relationships proposed by Gomez and Danuser in 2007 [13]. Most importantly, these parameters and mappings are implemented such that continuous and seamless transitions between musical patterns of different emotional expressiveness can be generated (see Figure 2 (right)).

Figure 2. Algorithm to generate continuous patterns of synthesized affective music (composing unit): (a) mapping of emotion-specific parameters onto music structural control parameters according to results of psycho-physiological studies [6]; (b) state machine generating streams of MIDI events, modulated by music structural control parameters; (c) translation of MIDI patterns into sound. Right: Exemplary musical trajectory through valence-arousal space according to Russell, 1980 [4].

¹For sound examples the reader is referred to the following webpage: http://web.ics.ei.tum.de/~ehrlich/affectiveBCI/index.htm
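For illustration, the following minimal Python sketch shows the kind of continuous valence/arousal-to-parameter mapping and note-generating state machine described above. The concrete mapping functions, scales, and names are assumptions made for this sketch; the actual composing unit follows the rules of Wallis et al. [12] and Gomez and Danuser [13].

```python
# Minimal sketch of a valence/arousal-controlled pattern generator in the
# spirit of the composing unit. Mapping functions and scale choices are
# illustrative assumptions, not the exact rules of the implementation.
import random
from dataclasses import dataclass

@dataclass
class MusicParams:
    tempo_bpm: float       # arousal: faster tempo for higher arousal
    major: bool            # valence: major mode for positive valence
    pitch_base: int        # valence: higher register for positive valence
    velocity: int          # arousal: louder notes for higher arousal
    note_len_beats: float  # arousal: shorter notes -> denser rhythm

def map_affect(valence: float, arousal: float) -> MusicParams:
    """Map (valence, arousal) in [-1, 1]^2 onto music structural parameters.
    Continuous mappings permit seamless transitions between patterns."""
    return MusicParams(
        tempo_bpm=90 + 50 * arousal,
        major=valence >= 0,
        pitch_base=60 + int(7 * valence),
        velocity=int(70 + 40 * arousal),
        note_len_beats=1.0 - 0.5 * arousal,
    )

MAJOR = [0, 2, 4, 5, 7, 9, 11]
MINOR = [0, 2, 3, 5, 7, 8, 10]

def step(params: MusicParams, degree: int):
    """One state-machine step: pick the next scale degree by a small random
    walk and emit a MIDI-like (note, velocity, duration_sec) event."""
    degree = max(0, min(6, degree + random.choice([-1, 0, 1])))
    scale = MAJOR if params.major else MINOR
    note = params.pitch_base + scale[degree]
    duration = 60.0 / params.tempo_bpm * params.note_len_beats
    return degree, (note, params.velocity, duration)

# Example: drift from a sad/calm to a happy/excited pattern, i.e. a
# trajectory through valence-arousal space as in Figure 2 (right).
degree = 3
for t in range(8):
    v, a = -1 + t / 4, -1 + t / 4
    degree, event = step(map_affect(v, a), degree)
    print(f"v={v:+.2f} a={a:+.2f} -> MIDI event {event}")
```

Because every parameter varies smoothly with valence and arousal, small changes in the control inputs yield small changes in the generated pattern, which is what enables the seamless transitions described above.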
Evaluation: We conducted a study to evaluate whether subjects rate their emotional responses according to the valence and arousal settings of the composing unit when being exposed to the correspondingly generated musical pattern. 11 healthy subjects (age: 26.9±3.4, 7 males) participated in this study. All subjects were exposed to 13 emotionally different musical patterns (uniformly distributed) generated by the composing unit and asked to rate their emotional responses with the Self-Assessment Manikin (SAM)² scheme. The results showed that the subjectively rated emotional responses highly correlate (Spearman correlation coefficient: r = 0.52 ± 0.23 for valence and r = 0.68 ± 0.19 for arousal) with the intended emotion to be expressed by the generated musical patterns. This suggests that the synthesized musical patterns generated by the composing unit successfully express the intended emotions. The composing unit was utilized and further evaluated in another work by Hagerer et al. in 2015 on the augmentation of affective speech with synthesized music [14].

²A questionnaire based on symbols to measure emotional affect; by M. Bradley and P. Lang, 1994.
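Such a rating analysis reduces, per subject, to a rank correlation between the intended affect settings and the SAM ratings. A minimal sketch, assuming hypothetical data (the rating array below is invented for illustration):

```python
# Sketch of the per-subject rating analysis, with made-up data: the intended
# valence settings of the 13 patterns and the corresponding SAM ratings.
import numpy as np
from scipy.stats import spearmanr

intended_valence = np.linspace(-1.0, 1.0, 13)  # composing unit settings
sam_ratings = np.array([2, 1, 3, 2, 4, 3, 5, 4, 6, 6, 7, 8, 8])  # 9-point SAM

rho, pval = spearmanr(intended_valence, sam_ratings)
print(f"Spearman r = {rho:.2f} (p = {pval:.3f})")
# Averaging such per-subject coefficients yields summary values like the
# reported r = 0.52 +/- 0.23 for valence and r = 0.68 +/- 0.19 for arousal.
```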



Figure 3. Signal processing and information flow in the affective BCI architecture. Top row (calibration phase): Selected patterns of synthesized affective music are presented to a subject; brain activity is measured simultaneously via EEG and further processed to build a Linear Discriminant Analysis (LDA) model. Bottom row (application phase): Brain activity is measured and processed according to the training phase; the model output is used to set control parameters (valence, arousal) for the composing unit, resulting in a continuous musical representation of the subject's affective state.

B. Affective music BCI: Introducing the composing unit into an online BCI architecture

We embedded the composing unit into an online BCI architecture consisting of a calibration phase and an online application phase. During the calibration phase (see Figures 1 and 3, top) the subject is exposed to several musical patterns generated by the composing unit, expressing different emotions. Brain activity is measured simultaneously via EEG (14-channel Emotiv EPOC system) and afterwards used to build an emotion model: Selected brain activity is mapped onto control parameters for the composing unit by means of statistical modeling (for details, see Figure 3). This model is then used during online application to translate the subject's brain activity in real-time into parameter settings for the composing unit (see Figures 1 and 3, bottom). The resulting outcome is a real-time representation of the subject's affective state by means of a music-based emotion display. By playing back this musical representation, the subject's affective state is influenced, which again manifests in changes in the musical representation, and so on: An affective closed-loop interaction is established.
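As a schematic illustration of this two-phase pipeline, the following Python sketch fits an LDA model on calibration epochs and then maps each new epoch to a continuous control value for the composing unit. The features, the binary calibration labels, and the tanh scaling of the decision value are assumptions of this sketch, not the exact processing of Figure 3 (the random arrays stand in for real EEG features).

```python
# Schematic sketch of the calibration and online phases (cf. Figure 3),
# assuming hypothetical EEG features; feature extraction, labels, and
# decision-value scaling are illustrative assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Calibration phase: EEG features recorded while the subject listens to
# composing-unit patterns with known affective settings (here simplified
# to binary high/low arousal labels).
X_calib = rng.normal(size=(200, 14))    # 200 epochs x 14 channel features
y_calib = rng.integers(0, 2, size=200)  # arousal label of presented pattern

lda = LinearDiscriminantAnalysis().fit(X_calib, y_calib)

# Online application phase: each new EEG epoch is translated into a
# continuous control value for the composing unit, closing the loop.
def translate(epoch_features: np.ndarray) -> float:
    """Map the LDA decision value to an arousal control value in [-1, 1]."""
    d = lda.decision_function(epoch_features.reshape(1, -1))[0]
    return float(np.tanh(d))

new_epoch = rng.normal(size=14)
print(f"arousal control parameter: {translate(new_epoch):+.2f}")
# In the real system this value (together with valence) parametrizes the
# composing unit, whose output is played back to the subject.
```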
Evaluation: In a second study, we investigated and quantified the feedback modulations when subjects were interacting with the music feedback. 5 healthy subjects (age: 27.8±5.0, all males) participated in this study twice (2 sessions) on different days. The subjects were asked to intentionally modulate the music feedback in the closed-loop application of the system according to specific tasks. The subjects were not given explicit information about how to achieve the modulations, but rather asked to develop individual mental strategies. The results showed that 3 out of 5 subjects achieved statistically significant modulations of the music feedback. In order to intentionally modulate the feedback, all subjects stated that they retrieved emotional memories or imagined events they were looking forward to, so as to self-induce emotions corresponding to the given modulation tasks.
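One plausible way to assess the significance of such modulations is a permutation test on the feedback values produced under opposite modulation tasks; the sketch below, with invented data, illustrates this. The choice of test and the synthetic values are assumptions made for illustration, not necessarily the analysis used in the study.

```python
# Sketch of a permutation test for modulation significance: compare the
# mean feedback value between "increase" and "decrease" task trials.
# The test choice and the synthetic data are assumptions of this sketch.
import numpy as np

rng = np.random.default_rng(1)
increase = rng.normal(0.3, 1.0, size=40)   # feedback values, "increase" trials
decrease = rng.normal(-0.2, 1.0, size=40)  # feedback values, "decrease" trials

observed = increase.mean() - decrease.mean()
pooled = np.concatenate([increase, decrease])

n_perm, count = 10000, 0
for _ in range(n_perm):
    rng.shuffle(pooled)  # reassign trials to tasks at random
    if pooled[:40].mean() - pooled[40:].mean() >= observed:
        count += 1

p_value = (count + 1) / (n_perm + 1)
print(f"observed difference = {observed:.2f}, permutation p = {p_value:.4f}")
```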
IV. DISCUSSION AND FUTURE WORK

Closing the loop allows BCI users to monitor their own brain processes, which can trigger awareness, learning, recovery, or even enhancement of brain functionality (neuroplasticity). Effective treatment protocols have been developed for different neurological disorders, among others for stroke [15] and depression [16]. The concept of systematically investigating the human brain while it is embedded in a BCI loop has been highlighted by Brunner et al. in 2015 as a promising key future field of BCI application [17]: utilizing BCI as a research and experimentation tool. Wander and Rao noted that the most promising entry point to this avenue is the study of sensorimotor processing loops in motor imagery-based BCIs [18]. A few studies have raised and discussed the question of how the concept of closing the loop via sound could be utilized for systematic investigation of the brain in the loop, or as a potential direction towards innovative neurorehabilitation protocols [19]. To date, this approach is rather hypothetical and has not yet been systematically explored.

With the work presented in this paper we aim to set an entry point to this avenue. Musical stimuli permit the study of perception-action (sensorimotor) coupling as well as joint action and entrainment via reciprocal prediction and adaptation [20]. In future work, we are particularly interested in deepening the understanding of human sensorimotor integration, in which musical sequences enable action upon the environment via entrainment. This could benefit several populations for whom facilitation of sensorimotor integration (the closed loop) is desirable: for example, those affected by Parkinson's disease [21] or congenital music disorders [22].



V. CONCLUSION

We have presented the concept, technical realization, and evaluation of a non-invasive BCI to establish affective closed-loop brain interaction by means of a music-based emotion display. The BCI is based on an algorithm (composing unit) to generate continuous patterns of synthesized affective music employable as a controllable emotion stimulus. We validated our BCI concept in two studies. In our first study we successfully evaluated the affective quality of synthesized musical patterns generated with the composing unit. In our second study we successfully evaluated the establishment of affective closed-loop interactions. The proposed BCI provides a platform to affectively stimulate the brain in a closed-loop fashion, offering novel approaches to study human sensorimotor integration and emotions.

ACKNOWLEDGMENT

The authors would like to thank the staff and students from the Brain Computer Interface Lab, Institute for Infocomm Research (I2R), Agency for Science, Technology and Research (A*STAR), Singapore. This research is supported by the Deutsche Forschungsgemeinschaft (DFG) through the International Graduate School of Science and Engineering (IGSSE), Technische Universität München, in collaboration with Georgetown University, USA, and the University of Jyväskylä, Finland³.

³"INTERACT: Brain-To-Sound Computer Interfaces: Neurofeedback of Music for Entrainment, Interaction and Neurorehabilitation", http://www.igsse.gs.tum.de/index.php?id=85

REFERENCES

[1] A. Damasio, Descartes' Error: Emotion, Reason and the Human Brain. Vintage Digital, 2008.

[2] R. J. Davidson, "Affective style and affective disorders: Perspectives from affective neuroscience," Cognition & Emotion, vol. 12, no. 3, pp. 307–330, 1998.

[3] R. W. Picard, Affective Computing. MIT Press, 2000.

[4] J. Russell, "A circumplex model of affect," Journal of Personality and Social Psychology, vol. 39, no. 6, p. 1161, 1980.

[5] S. M. Alarcao and M. J. Fonseca, "Emotions recognition using EEG signals: A survey," IEEE Transactions on Affective Computing, 2017.

[6] S. Ehrlich, A. Wykowska, K. Ramirez-Amaro, and G. Cheng, "When to engage in interaction and how? EEG-based enhancement of robot's ability to sense social signals in HRI," in Humanoid Robots (Humanoids), 2014 14th IEEE-RAS International Conference on. IEEE, 2014, pp. 1104–1109.

[7] A. S. Widge, D. D. Dougherty, and C. T. Moritz, "Affective brain-computer interfaces as enabling technology for responsive psychiatric stimulation," Brain-Computer Interfaces, vol. 1, no. 2, pp. 126–136, 2014.

[8] A. Lucier, "Statement on: Music for Solo Performer," in Biofeedback and the Arts, Results of Early Experiments. Vancouver: Aesthetic Research Center of Canada Publications, 1976, pp. 60–61.

[9] S. Makeig, G. Leslie, T. Mullen, D. Sarma, N. Bigdely-Shamlo, and C. Kothe, "First demonstration of a musical emotion BCI," in Affective Computing and Intelligent Interaction, 2011, pp. 487–496.

[10] T. De Smedt and L. Menschaert, "VALENCE: Affective visualisation using EEG," Digital Creativity, vol. 23, no. 3-4, pp. 272–277, 2012.

[11] T. A. Deuel, J. Pampin, J. Sundstrom, and F. Darvas, "The Encephalophone: A novel musical biofeedback device using conscious control of electroencephalogram (EEG)," Frontiers in Human Neuroscience, vol. 11, 2017.

[12] I. Wallis, T. Ingalls, E. Campana, and J. Goodman, "A rule-based generative music system controlled by desired valence and arousal," in Proceedings of the 8th International Sound and Music Computing Conference (SMC), 2011.

[13] P. Gomez and B. Danuser, "Relationships between musical structure and psychophysiological measures of emotion," Emotion, vol. 7, no. 2, p. 377, 2007.

[14] G. J. Hagerer, M. Lux, S. Ehrlich, and G. Cheng, "Augmenting affect from speech with generative music," in Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM, 2015, pp. 977–982.

[15] K. K. Ang, C. Guan, K. S. G. Chua, B. T. Ang, C. Kuah, C. Wang, K. S. Phua, Z. Y. Chin, and H. Zhang, "Clinical study of neurorehabilitation in stroke using EEG-based motor imagery brain-computer interface with robotic feedback," in Engineering in Medicine and Biology Society (EMBC), 2010 Annual International Conference of the IEEE. IEEE, 2010, pp. 5549–5552.

[16] R. Ramirez, M. Palencia-Lefler, S. Giraldo, and Z. Vamvakousis, "Musical neurofeedback for treating depression in elderly people," Frontiers in Neuroscience, vol. 9, 2015.

[17] C. Brunner, N. Birbaumer, B. Blankertz, C. Guger, A. Kübler, D. Mattia, J. d. R. Millán, F. Miralles, A. Nijholt, E. Opisso et al., "BNCI Horizon 2020: Towards a roadmap for the BCI community," Brain-Computer Interfaces, vol. 2, no. 1, pp. 1–10, 2015.

[18] J. D. Wander and R. P. Rao, "Brain-computer interfaces: A powerful tool for scientific inquiry," Current Opinion in Neurobiology, vol. 25, pp. 70–75, 2014.

[19] A. Väljamäe, T. Steffert, S. Holland, X. Marimon, R. Benitez, S. Mealla, A. Oliveira, and S. Jordà, "A review of real-time EEG sonification research," Georgia Institute of Technology, 2013.

[20] R. J. Zatorre, J. L. Chen, and V. B. Penhune, "When the brain plays music: Auditory-motor interactions in music perception and production," Nature Reviews Neuroscience, vol. 8, no. 7, p. 547, 2007.

[21] C. Nombela, L. E. Hughes, A. M. Owen, and J. A. Grahn, "Into the groove: Can rhythm influence Parkinson's disease?" Neuroscience & Biobehavioral Reviews, vol. 37, no. 10, pp. 2564–2570, 2013.

[22] J. Phillips-Silver, P. Toiviainen, N. Gosselin, and I. Peretz, "Amusic does not mean unmusical: Beat perception and synchronization ability despite pitch deafness," Cognitive Neuropsychology, vol. 30, no. 5, pp. 311–331, 2013.
