A Closed-Loop Brain-Computer Music Interface For Continuous Affective Interaction
I. INTRODUCTION

In emotion research, Damasio's work showed how crucial emotions are to cognitive brain processes [1]. His research suggests that emotions serve as an important regulator for higher-order brain mechanisms involved in reasoning and decision making. Moreover, several challenging and hardly manageable mental diseases, such as depression, have their origin in affective dysfunctions [2]. The rise of Affective Computing [3] has made the technical disciplines increasingly interested in the neuroscientific underpinnings of emotions. Hence, a better understanding of human emotions has interdisciplinary impact on many fields, such as technology, healthcare, and the human sciences.

Figure 1. Conceptual illustration of the affective music BCI: During calibration (top) the user is exposed to algorithmically synthesized patterns of affective music; brain activity is measured simultaneously via EEG; EEG patterns are extracted and used to build a user-specific emotion model. During online application (bottom) the obtained model is used to translate the user's brain activity continuously into a musical representation; closing the loop by playing back the musical representation to the user results in continuous affective brain interactions.

This work presents a novel non-invasive Brain-Computer Interface (BCI) that feeds back a subject's emotional state in such a way that a closed-loop affective brain interaction is established. This affective interaction forms an alternative way of stimulating the brain and offers novel approaches to gain fundamental knowledge about emotions, related brain processes, and dysfunctions in these structures.

The remainder of this paper is structured as follows: In the next section the affective BCI concept and related work are introduced, followed by the technical realization and evaluation in Section III. In Section IV we discuss the work, and we conclude it in Section V.

II. CONCEPT AND RELATED WORK

A field relevant to our concept is commonly referred to as EEG emotion recognition and affective BCI. It aims at modeling links between neural correlates and emotions to enable a computer device to detect a user's affective states by means of recorded EEG signals. A common procedure is to expose subjects to affective stimuli, such as pictures, sound, or music, and let them subjectively
978-1-5386-3276-5/$31.00 ©2017 IEEE
Figure 2. Algorithm to generate continuous patterns of synthesized affective music (composing unit): (a) mapping of emotion-specific parameters onto music structural control parameters according to results of psycho-physiological studies [6]; (b) state machine generating streams of MIDI events, modulated by the music structural control parameters; (c) translation of MIDI patterns into sound. Right: exemplary musical trajectory through valence-arousal space according to Russell in 1980 [4].
rate their corresponding emotional responses, while brain activity is measured simultaneously. The subjective ratings are later used to identify neural correlates, which are subsequently used to build emotion models by means of statistical modeling [5]. Decoding human affective states is interesting for a multitude of applications, ranging from the augmentation of human-computer and human-robot interaction [6], gaming and entertainment, and e-learning, to the neuro-rehabilitation of psychiatric disorders [7].

A second field relevant to our concept is EEG sonification and Brain-Computer Music Interfaces. The basic idea is to translate human brain activity into sound or music. The first Brain-Computer Music Interface to be premiered was Alvin Lucier's "Music for Solo Performer", presented in 1965 [8]. More recent works have been presented by Makeig et al. in 2011 [9], De Smedt and Menschaert in 2012 [10], and most recently by Deuel et al. in 2017 [11]. The majority of these works focused on artistic purposes, whereby the latter particularly raised the potential use of their system as a biofeedback device.

Our concept: The core component of our concept - and the major contrast to previous works - is the utilization of a single parametrizable music synthesis algorithm (see Figure 1). This algorithm allows for generating musical sequences with seamless and continuous transitions between patterns of different emotional expressiveness. As such, it is universally applicable in both the calibration phase (EEG-based affect modeling) and the online application phase (continuous EEG-based affect translation). The major benefit is improved calibration-to-application immediacy and transferability, but also high flexibility for developing innovative stimulation/calibration protocols.

III. TECHNICAL REALIZATION AND EVALUATION

A. Algorithm to generate synthesized affective music (composing unit)

Music can be considered a combination of multiple harmonic, rhythmic and timbre components which change over time and thus form a musical piece. Based on that principle, we designed and implemented an algorithm to generate continuous patterns of affective music whose basic architecture was mainly inspired by Wallis et al. in 2011 [12] (for the remainder of this paper this algorithm is called the composing unit)1. The composing unit was implemented as a state machine (see Figure 2 (b)) generating streams of MIDI events. Transitions and states are modulated by several continuously controllable music structural parameters, namely: harmonic mode, tempo, rhythmic shape, pitch, and relative loudness of subsequent notes. According to their settings, different musical patterns are generated. Emotional expressiveness of these patterns is realized by introducing mappings of emotion-related parameters (valence and arousal) onto the music structural parameters (see Figure 2 (a)). As for these mappings, we employed a subset of the functional relationships proposed by Gomez and Danuser in 2007 [13]. Most importantly, these parameters and mappings are implemented such that continuous and seamless transitions between musical patterns of different emotional expressiveness can be generated (see Figure 2 (right)).

Evaluation: We conducted a study to evaluate whether subjects rate their emotional responses according to the valence and arousal settings of the composing unit when being exposed to the corresponding generated musical pattern. 11 healthy subjects (age: 26.9±3.4, 7 males) participated in this study. All subjects were exposed to 13 emotionally different musical patterns (uniformly distributed) generated by the composing unit and asked to rate their emotional responses on the Self-Assessment Manikin (SAM)2 scheme. The results showed that the subjectively rated emotional responses correlate highly (Spearman correlation coeff.: r = 0.52 ± 0.23 for valence and r = 0.68 ± 0.19 for arousal) with the emotion intended to be expressed by the generated musical patterns. This suggests that the synthesized musical patterns generated by the composing unit successfully express the intended emotions. The composing unit was utilized and further evaluated in another work by Hagerer et al. in 2015 on the augmentation of affective speech with synthesized music [14].

1 For sound examples the reader is referred to the following webpage: http://web.ics.ei.tum.de/∼ehrlich/affectiveBCI/index.htm
2 A pictorial questionnaire to measure emotional affect; introduced by M. Bradley and P. Lang in 1994
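To make the composing-unit design concrete, the following is a minimal sketch of the two stages described above: a valence/arousal-to-parameter mapping and a small state machine emitting MIDI-like note events. All function names, value ranges, and the linear mappings are illustrative assumptions for this sketch, not the authors' exact implementation, which follows the relationships of Gomez and Danuser [13] and also modulates rhythmic shape.

```python
import random

# Illustrative mapping of valence/arousal (both in [-1, 1]) onto music
# structural parameters. The linear ranges here are assumptions; the paper
# uses functional relationships from Gomez and Danuser [13].
def map_emotion(valence: float, arousal: float) -> dict:
    return {
        "mode": "major" if valence >= 0 else "minor",     # harmonic mode
        "tempo_bpm": 60 + 60 * (arousal + 1) / 2,         # 60..120 bpm
        "pitch_offset": round(12 * valence),              # +/- one octave
        "loudness": 0.5 + 0.4 * (arousal + 1) / 2,        # relative loudness
        "note_density": 1 + round(3 * (arousal + 1) / 2), # notes per beat
    }

SCALES = {"major": [0, 2, 4, 5, 7, 9, 11], "minor": [0, 2, 3, 5, 7, 8, 10]}

# Minimal state machine: the current scale degree is the state; each step
# takes a random transition and emits one MIDI-like note event whose pitch,
# timing, and velocity are modulated by the mapped parameters.
def compose(valence: float, arousal: float, n_beats: int = 4, root: int = 60):
    params = map_emotion(valence, arousal)
    scale = SCALES[params["mode"]]
    beat = 60.0 / params["tempo_bpm"]
    events, t, degree = [], 0.0, 0
    for _ in range(n_beats * params["note_density"]):
        degree = (degree + random.choice([-1, 1, 2])) % len(scale)
        events.append({
            "time": t,
            "pitch": root + params["pitch_offset"] + scale[degree],
            "velocity": int(127 * params["loudness"]),
        })
        t += beat / params["note_density"]
    return events

# Two contrasting patterns: happy/excited vs. sad/calm.
happy = compose(valence=0.8, arousal=0.7)
sad = compose(valence=-0.8, arousal=-0.6)
```

Because every parameter is a continuous function of valence and arousal, slowly interpolating the two inputs over time yields seamless trajectories through valence-arousal space of the kind shown in Figure 2 (right).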
B. Affective music BCI: Introducing the composing unit into an online BCI architecture

We embedded the composing unit into an online BCI architecture consisting of a calibration phase and an online application phase. During the calibration phase (see Figures 1 and 3, top) the subject is exposed to several musical patterns generated by the composing unit, expressing different emotions. Brain activity is measured simultaneously via EEG (14-channel Emotiv EPOC system) and afterwards used to build an emotion model: Selected brain activity is mapped onto control parameters for the composing unit by means of statistical modeling (for details, see Figure 3). This model is then used during online application to translate the subject's brain activity in real-time into parameter settings for the composing unit (see Figures 1 and 3, bottom). The resulting outcome is a real-time representation of the subject's affective state by means of a music-based emotion display. By playing back this musical representation, the subject's affective state is influenced, which again manifests in changes in the musical representation, and so on: An affective closed-loop interaction is established.

Evaluation: In a second study, we investigated and quantified the feedback modulations when subjects were interacting with the music feedback. 5 healthy subjects (age: 27.8±5.0, all males) participated in this study twice (2 sessions) on different days. The subjects were asked to intentionally modulate the music feedback in the closed-loop application of the system according to specific tasks. The subjects were not given explicit information about how to achieve the modulations, but rather asked to develop individual mental strategies. The results showed that 3 out of 5 subjects achieved statistically significant modulations of the music feedback. In order to intentionally modulate the feedback, all subjects stated that they had retrieved emotional memories or imagined events they were looking forward to, in order to self-induce emotions corresponding to the given modulation tasks.

IV. DISCUSSION AND FUTURE WORK

Closing the loop allows BCI users to monitor their own brain processes, which can trigger awareness, learning, recovery, or even enhancement of brain functionality (neuroplasticity). Effective treatment protocols have been developed for different neurological disorders, among others for stroke [15] and depression [16]. The concept of systematically investigating the human brain while it is embedded in a BCI loop has been highlighted by Brunner et al. in 2015 as a promising key future field of BCI application [17] - utilizing the BCI as a research/experimentation tool. Wander and Rao noted that the most promising entry point to this avenue is the study of sensorimotor processing loops in motor imagery-based BCIs [18]. A few studies have raised and discussed the question of how the concept of closing the loop via sound could be utilized for systematic investigation of the brain in the loop, or potential directions towards innovative neurorehabilitation protocols [19]. To date, this approach is rather hypothetical and has not yet been systematically explored.

With the work presented in this paper we aim to set an entry point to this avenue. Musical stimuli permit the study of perception-action (sensorimotor) coupling as well as joint action and entrainment via reciprocal prediction and adaptation [20]. In future work, we are particularly interested in deepening the understanding of human sensorimotor integration, in which musical sequences enable action upon the environment via entrainment. This could benefit several populations who would profit from the facilitation of sensorimotor integration (the closed loop): for example, those affected by Parkinson's disease [21] or congenital music disorders [22].

V. CONCLUSION

We have presented the concept, technical realization