Psychiatry Research 128 (2004) 235 – 244
www.elsevier.com/locate/psychres
Differences in facial expressions of four universal emotions
Christian G. Kohler a,*, Travis Turner b, Neal M. Stolar a, Warren B. Bilker c, Colleen M. Brensinger c, Raquel E. Gur a, Ruben C. Gur a

a Neuropsychiatry, Department of Psychiatry, 10th Floor Gates Bldg., University of Pennsylvania, 3400 Spruce St., Philadelphia, PA 19104-4283, USA
b Department of Psychology, University of California, San Diego, CA, USA
c Department of Biostatistics and Epidemiology, University of Pennsylvania, Philadelphia, PA, USA
Received 27 June 2003; received in revised form 25 June 2004; accepted 10 July 2004
Abstract
The facial action coding system (FACS) was used to examine recognition rates in 105 healthy young men and women who
viewed 128 facial expressions of posed and evoked happy, sad, angry and fearful emotions in color photographs balanced for
gender and ethnicity of poser. Categorical analyses determined the specificity of individual action units for each emotion.
Relationships between recognition rates for different emotions and action units were evaluated using a logistic regression model.
Each emotion could be identified by a group of action units, characteristic of the emotion and distinct from other emotions.
Characteristic happy expressions comprised raised inner eyebrows, tightened lower eyelid, raised cheeks, upper lip raised and lip
corners turned upward. Recognition of happy faces was associated with raised cheek, lid tightening and raised outer brow.
Characteristic sad expressions comprised furrowed eyebrow, opened mouth with upper lip being raised, lip corners stretched and
turned down, and chin pulled up. Only lowered brow and raised chin were associated with sad recognition. Characteristic anger
expressions comprised lowered eyebrows, eyes wide open with tightened lower lid, lips exposing teeth and stretched lip corners.
Recognition of angry faces was associated with lowered eyebrows, raised upper lids and lower lip depression. Characteristic fear
expressions comprised eyes wide open, furrowed and raised eyebrows and stretched mouth. Recognition of fearful faces was
most highly associated with raised upper lip and nostril dilation, although both occurred infrequently, and with raised inner brow
and widened eyes. Comparisons are made with previous studies that used different facial stimuli.
© 2004 Elsevier Ireland Ltd. All rights reserved.
Keywords: Action unit; Facial emotion expression; Emotion recognition
* Corresponding author. Tel.: +1 215 614 0161; fax: +1 215 662 7903. E-mail address: kohler@bbl.med.upenn.edu (C.G. Kohler).

1. Introduction
Facial expressions are used in humans and animals
for communication, in particular to convey one’s
emotional state (Darwin, 1965). This communication
can be reflexive, as situations may evoke emotions
0165-1781/$ - see front matter © 2004 Elsevier Ireland Ltd. All rights reserved.
doi:10.1016/j.psychres.2004.07.003
that are spontaneously expressed on the face. In other
instances, particularly in humans, facial expressions
may be volitional signals intended for communication
and may not reflect the true emotional state of the
person (Ekman and Friesen, 1975). Impairment in
emotional processing, specifically emotion recognition, has been described in psychiatric disorders
including schizophrenia, depression and bipolar disorder, and in neurological disorders (review: Kohler
et al., 2004). Since the earliest descriptions of
schizophrenia, decreased and muted facial expressions of emotions have been reported as a hallmark
of the illness; however, there have been few attempts
to investigate this impairment further in schizophrenia and other psychiatric disorders.
Six basic emotions—happiness, sadness, anger,
fear, disgust and surprise—and their corresponding
facial expressions are recognized across different
cultures (Huber, 1931; Eibl-Eibesfeldt, 1970; Izard,
1971; Ekman and Friesen, 1975). Descriptions have
been made about which facial muscles are involved in
the formation of each of the basic emotions (Huber,
1931; Plutchik, 1962; Ekman and Friesen, 1975;
Gosselin et al., 1997). For happy expressions, Ekman
and Friesen (1975) described facial expressions of
tense lower eyelids, raised cheeks and lip corners
pulled up; for sad expressions, inner eyebrows raised
and drawn together, and lip corners pulled down; for
anger expressions, lowered eyebrows drawn together,
tense lower eyelids, pressed lips or lips parted in a
square shape; for fear expressions, eyebrows raised
and drawn together, wide open eyes with tense lower
eyelids and stretched lips. Based on facial muscle
movement, Ekman and Friesen (1978) developed the
Facial Action Coding System (FACS) by identifying
the presence of specific actions of facial muscles
called Action Units (AUs). Gosselin et al. (1997)
tested Ekman and Friesen’s predictions about facial
expressions of six emotions in two conditions—posed
or unfelt and evoked or felt. In that study, actors used
two different methods of displaying facial expressions
of six emotions—trying to experience the target
emotion according to the Stanislawski technique,
while expressing the emotion (evoked emotion) or
merely displaying the emotion without the emotional
experience (posed emotion). FACS analyses of facial
expressions by a single rater revealed that AUs for
each emotion were concordant with Ekman and
Friesen’s descriptions. Occurrence rates of AUs for
evoked and posed facial expressions showed considerable overlap, in particular for happy and surprise
expressions. Other studies that investigated facial
landmark changes associated with emotional expression focused on measurement of muscle activity with
electromyography (EMG). Limitations of this methodology include that only select muscle groups, such as the corrugator supercilii, orbicularis oculi and zygomaticus major, have been measured, showing the
corrugator to be associated with sad and the zygomaticus with happy emotions (Schwartz et al., 1976).
Tassinary and Cacioppo (1989) demonstrated that
expressions of action units involving brow and cheek
regions are associated with discrete facial muscle
activity as measured by surface EMG. More recently,
considerable overlap has been shown between surface
and intramuscular recordings of facial EMG during
happy, sad and angry expressions (Aniss and Sachdev,
1996).
In our study, three certified FACS raters examined
128 images of extremely happy, sad, angry and fearful
faces that were selected for use in a functional
imaging study and piloted for recognition in a group
of healthy subjects. Disgust was not included because
of our assumption that it may not represent a pure
emotion, but rather a mixture of other universal
emotions (Kohler et al., 2003). Surprise was not
included because its valence depends entirely on the
triggering event and it can therefore resemble any of the
other emotions, with a rapid onset. The purpose of our
study was to investigate which facial changes are most
frequent in happy, sad, angry and fearful expressions,
and which facial changes are essential for accurate
recognition of the particular emotion. The study
included the following specific aims: (1) Which
action units characterize the different emotions? We
hypothesized that each emotion can be defined by the
presence of action units common to faces with the
particular emotion. (2) Which action units distinguish
different emotions from each other? We hypothesized
that facial expressions of each emotion consist of
unique action units that are distinct from other
emotions. (3) How do posed and evoked emotions
differ with respect to action units? We hypothesized
that different action units are used for the expression
of posed and evoked emotions. (4) Do men and
women utilize different action units for the expression
of emotions? We hypothesized that certain emotions,
in particular anger, are expressed differently by men
and women. (5) Which action units are associated
with recognition of each emotion? We hypothesized
that the presence of characteristic action units is
associated with proper detection of the particular
emotion.
We propose that findings based on accurate
descriptions of facial muscle groups in people without
psychiatric disorders will facilitate investigations into
the effects of psychiatric illness on facial emotion
expression in persons with psychiatric disorders. This
knowledge will lead to better understanding of how
interpersonal nonverbal communication is affected in
psychiatric disorders. In particular, this will give us
information about whether disorders, such as schizophrenia or affective disorders, are associated with
muted, but appropriate facial muscle movement or
recruitment of different muscle groups.
2. Methods
2.1. Task construction
Color slide photographs, which were acquired
during a study of facial displays of emotion (Gur et
al., 2002a) and which totaled over 5000 pictures, were
digitally scanned. They were then processed in
PhotoShop® to remove background features and
reduce distinguishing hair and clothing cues. Evoked
and posed images from eight male and eight female
actors were selected for each emotion (Happy, Sad,
Anger, and Fear). Based on our finding that disgust
expressions, in particular of extreme intensity, were
poorly recognized (Kohler et al., 2003), we decided to
limit the current task to the universal emotions of
happiness, sadness, anger, and fear. Only extreme
intensity images and only faces with opened eyes that
were unaffected by spontaneous eye blinks were used
in this study. Faces used in our study were screened
by a group of raters (n=11), and only faces that were
correctly identified by >55% (6/11) of raters were
included in this task. Each actor (n=64) was assigned
to one emotion and photographed in both posed and
evoked expressions. Assignments were balanced for
gender, age, and race (Caucasian versus non-Caucasian). A total of 128 images of emotional expressions
(32 per emotion) were used in the emotion identification tasks. There was no planned overlap between the
stimuli used in the present study and recent studies
(Gur et al., 2002b,c; Kohler et al., 2003, 2004) that
employed stimuli from the archival set of over 5000
pictures. The facial stimuli in this project were piloted
for a functional magnetic resonance imaging (fMRI)
study of emotion recognition and face memory.
2.2. Emotion recognition testing
Healthy control subjects were recruited from
undergraduate introductory psychology courses at
Drexel University in Philadelphia. Testing sessions
were conducted in a closed on-campus computer
laboratory over a 1-week period. Participants completed a brief screening form that included demographic and health-related questions. The sample
consisted of 63 men (mean±S.D. age=19.7±1.63
years) and 42 women (mean±S.D. age=19.9±2.88
years) who met inclusion criteria (Erwin et al., 1992).
There were 71 Caucasian, 9 African-American, 4
Hispanic/Latino, and 21 Asian-American participants.
Subjects were asked to press one button if the face
displayed the target emotion, another button if the face
shown was neutral or displayed a different emotion.
Percentages of subjects making correct identifications
were used to generate recognition rates for each
image.
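As an illustrative sketch of this computation (the matrix shape and all names are assumptions for illustration, not the study's code), the per-image recognition rate is the column mean of a subjects-by-images accuracy matrix:

```python
import numpy as np

# Hypothetical accuracy matrix: 105 subjects x 128 images, where 1 means the
# subject correctly judged whether the face displayed the target emotion.
rng = np.random.default_rng(0)
responses = rng.integers(0, 2, size=(105, 128))  # placeholder data

# Recognition rate for each image: percentage of subjects responding correctly.
recognition_rate = 100 * responses.mean(axis=0)
print(recognition_rate[:4])
```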
2.3. Facial action coding system ratings (FACS)
Images used in the emotion recognition test
(Section 2.2) were presented via digital video
projection to three certified FACS raters in pseudorandom order. To serve as a baseline comparison,
neutral images were presented next to emotional
images of the same person. FACS scoring was
performed independently by each rater. For purposes
of this study, we were interested in the presence rather
than the intensity of AUs, and intensity ratings were
not included in the analysis. Laterality was collapsed so that if an AU was scored for one side, it was scored as present on both. Ratings were transformed to binomial data, with the presence of an AU recorded as 1 and absence as 0. An AU was rated as present or absent according to the agreement of at least two raters. The number of occurrences for each
AU was calculated for each image, grouped by emotion, and only AUs that occurred in at least 10%
of the images in a particular emotion were included
for further analysis. AUs occurring in at least 25% of
the images in an emotion were considered to be
characteristic of the expression.
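As a sketch of these scoring rules (not the authors' code), the fragment below applies the two-of-three agreement criterion and the 10% and 25% frequency thresholds to hypothetical rating data; the array shapes, emotion ordering and variable names are assumptions made for illustration:

```python
import numpy as np

# Hypothetical ratings array: 3 raters x 128 images x 19 AUs, binary scores.
rng = np.random.default_rng(0)
ratings = rng.integers(0, 2, size=(3, 128, 19))    # placeholder data

# An AU counts as present in an image when at least 2 of the 3 raters agree.
present = ratings.sum(axis=0) >= 2                 # shape: (128 images, 19 AUs)

# 32 images per emotion; the ordering here is an assumption of the sketch.
emotions = np.repeat(["happy", "sad", "anger", "fear"], 32)

for emo in ("happy", "sad", "anger", "fear"):
    freq = present[emotions == emo].mean(axis=0)   # per-AU occurrence rate
    analyzed = np.flatnonzero(freq >= 0.10)        # retained for analysis
    characteristic = np.flatnonzero(freq >= 0.25)  # "characteristic" AUs
    print(emo, analyzed.tolist(), characteristic.tolist())
```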
2.4. Data analysis
Fisher’s Exact tests were performed to determine
specificity of AUs, defined as uniquely present or
absent, for each emotion and to explore differences in
gender and condition (posed and evoked). To understand interactions between emotions and cluster
effects of AUs within emotions, co-occurrence rates
were calculated for all AU pairs with the following
formula: CoR of AU x with y=(# mutual occurrences
of AU x and y)/(# occurrences AU x)+(# occurrences
AU y). On the basis of the emotion recognition testing
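Read this way, the rate is a Dice-style index: the factor of 2 makes it 100% when two AUs always appear together, which reproduces the clustering percentages reported in Section 3.2 (e.g., 2×20/(32+20) = 77% for AU 12 with AU 6 in Happy). A minimal sketch of the computation:

```python
import numpy as np

def cooccurrence_rate(x, y):
    """Co-occurrence rate of two AUs over the images of one emotion.

    x, y: binary presence vectors (one entry per image).
    Returns 2 * (# mutual occurrences) / (# occurrences of x + # occurrences of y).
    """
    x = np.asarray(x, dtype=bool)
    y = np.asarray(y, dtype=bool)
    return 2 * np.sum(x & y) / (np.sum(x) + np.sum(y))

# Example with the Happy counts from Table 1: AU 12 (32 faces) and AU 6
# (20 faces) co-occurring in 20 faces gives 2*20/(32+20) = 0.77, i.e. the
# 77% clustering of AU 12 with AU 6 reported in Section 3.2.
```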
On the basis of the emotion recognition testing from the 105 Drexel students, multivariable logistic
regression analyses with backward elimination were
performed to determine the impact of action units on
the odds of accurate identification, broken down by
evoked and posed conditions. The logistic regression
was fit using generalized estimating equations (GEE)
with an exchangeable correlation structure, in order to
account for the non-independence or clustering of the
multiple faces assessed by each student. The results
are expressed as odds ratios for comparing odds of
correct recognition when the AU was present vs.
absent.
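A minimal sketch of such a model using Python's statsmodels (the backward-elimination step is omitted, only three AUs are shown, and the data-frame layout and column names are assumptions for illustration, not the authors' code):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per (subject, face) trial.
# 'correct' = 1 if the subject identified the target emotion;
# au4, au5, au16 are AU presence indicators for the face shown.
rng = np.random.default_rng(0)
n_subjects, n_faces = 105, 32
faces = pd.DataFrame({f"au{k}": rng.integers(0, 2, n_faces) for k in (4, 5, 16)})
df = faces.loc[np.tile(np.arange(n_faces), n_subjects)].reset_index(drop=True)
df["subject"] = np.repeat(np.arange(n_subjects), n_faces)
df["correct"] = rng.integers(0, 2, n_subjects * n_faces)  # placeholder responses

# Logistic regression fit by GEE with an exchangeable working correlation,
# clustering the repeated trials within each subject.
model = smf.gee("correct ~ au4 + au5 + au16", groups="subject", data=df,
                family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable())
result = model.fit()

# Exponentiated coefficients are the odds ratios for correct recognition
# when an AU is present versus absent.
print(np.exp(result.params))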
To further examine the effect of characteristic
AUs on accurate recognition of Sad, Angry and
Fearful faces, odds ratios (ORs) were calculated for recognition of faces lacking one, two or three characteristic AUs for the respective emotion. For Happy
faces, this analysis could only be performed for
recognition of faces lacking one or two characteristic
action units.
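The arithmetic behind such an odds ratio is a comparison of two odds; a toy sketch using the recognition percentages later reported in Table 3 (with the caveat that the published ORs were estimated from the trial-level data rather than from pooled rates):

```python
def odds(p):
    """Odds of correct recognition from a proportion correct."""
    return p / (1 - p)

# Fear faces (Table 3): 54.3% recognized with one characteristic AU present
# versus 32.0% with none. The resulting OR of about 2.5 is close to the 2.45
# reported, which was estimated from the trial-level data.
print(odds(0.543) / odds(0.320))
```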
3. Results
3.1. Characterization of emotional expressions
FACS ratings revealed separate profiles for Happy,
Sad, Anger and Fear expressions (Fig. 1, Table 1).
Characteristic, uniquely absent and present AUs were
found for each emotion. Expressions of Anger and
Sad shared the most characteristic AUs (5), while Fear
and Happy shared the fewest (2). Characteristic AUs
for Happy included in descending order of frequency
12, 7, 26, 6, 10, 1 and 25. AUs 6 and 12 were
uniquely present, while AUs 4 and 20 were uniquely
absent in Happy expressions. Characteristic AUs for
Sad included 4, 7, 20, 10, 17, 25, 1 and 15. AU 17
was unique for Sad expressions while AU 26 was
specifically absent. Characteristic AUs for Anger included AUs 4, 7, 10, 16, 25, 5, 20, 9 and 26. AUs 9 and 16 were uniquely present in Anger, while AU 1 was uniquely absent. Characteristic AUs for Fear included AUs 5, 26, 1, 4, 2 and 20. AUs 5 and 2 were uniquely present, while AUs 7 and 10 were uniquely absent in Fear.

Fig. 1. Facial expressions of emotion.

Table 1
FACS of emotions (number present of 32 faces per category)

AU   Name                   Fisher's Exact(a)   Happy   Sad    Anger   Fear
1    Inner Brow Raiser      P<0.001             9       10     1**     18
2    Outer Brow Raiser      P<0.001             7       0      1       13*
4    Brow Lower             P<0.001             2**     24     26      14
5    Upper Lid Raiser       P<0.001             2       1      12      28*
6    Cheek Raiser           P<0.001             20*     3      2       0
7    Lid Tightener          P<0.001             23      21     20      6**
9    Nose Wrinkler          P=0.001             2       0      9*      0
10   Upper Lip Raiser       P=0.038             17      14     17      4**
12   Lip Corner Puller      P<0.001             32*     3      1       2
15   Lip Corner Depressor   P=0.050             0       8      6       1
16   Lower Lip Depressor    P<0.001             1       0      16*     0
17   Chin Raiser            P<0.001             0       12*    4       1
20   Lip Stretcher          n.s.                4**     15     11      9
23   Lip Tightener          n.s.                0       1      5       0
24   Lip Pressor            n.s.                0       3      3       0
25   Lips Part              n.s.                8       12     13      3
26   Jaw Drop               P<0.001             23      4**    9       23
27   Mouth Stretch          n.s.                0       0      1       5
38   Nostril Dilator        P=0.006             0       0      0       6*

n.s.=not significant.
(a) df=3; all P values after Bonferroni correction.
* Unique qualifying AU. ** Unique disqualifying AU.
3.2. Co-occurrence of action units
AUs occurring together more than 75% of the time
were considered to be clustered. In Happy, we found
a clustering of AU 1 with 2 (88%), and AU 12 with 6
(77%), AU 7 (84%) and 26 (84%); in Sad, AU 4 with
7 (76%) and 20 (77%); in Anger, AU 4 with 7 (78%);
in Fear, AU 1 with 2 (77%), and AU 26 with 5
(78%).
3.3. Condition of expression
Effects of condition—evoked or posed—were only
significant in expressions of Anger. AU 16 (Fisher’s
Exact=8.00, df=1, P=0.012) and AU 20 (Fisher’s
Exact=11.22, df=1, P=0.002) were more frequent in
evoked than posed expressions. After correction for
multiple comparisons, the finding for AU 20 remained
significant.
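As an illustration of this comparison, a 2×2 Fisher's exact test contrasts an AU's frequency between the 16 evoked and 16 posed Anger faces; the counts below are hypothetical, chosen only to show the layout (Table 1 fixes the overall AU 20 count in Anger at 11, but its evoked/posed split is not given in the text):

```python
from scipy.stats import fisher_exact

# 2x2 table: rows = evoked / posed Anger faces (16 each); columns = AU 20
# present / absent. The 9-versus-2 split below is hypothetical, for
# illustration only.
table = [[9, 7],    # evoked: present, absent
         [2, 14]]   # posed: present, absent
oddsratio, p_value = fisher_exact(table)
print(oddsratio, p_value)
```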
3.4. Gender of face
No significant differences between male and
female posers were found for expressions of Happy,
Sad, Anger and Fear.
3.5. Recognition of emotions
Recognition rates for Happy faces were 91.2%, for Sad faces 84.0%±10.6 (S.D.), for Anger faces 68.9%±24.3 (S.D.) and for Fear faces 67.9%±22.6 (S.D.). These rates are similar to recognition rates in
our previous publications, which employed different
faces from the same archival set of pictures and
different testing paradigms (Kohler et al., 2003, 2004).
3.6. Effect of condition on recognition of emotions
Recognition rates were similar for posed and
evoked expressions of Happy faces. Recognition rates
for Sad faces were 82.4% in the posed condition and
85.6% in the evoked condition (OR=0.79, P=0.01).
For example, this means that posed sad faces were 0.79 times as likely to be correctly identified as evoked sad
faces. Recognition rates for Angry faces were 65.6%
in the posed and 72.2% in the evoked condition
(OR=0.74, P<0.001). Recognition rates for Fearful
faces were 61.7% in the posed and 74.2% in the
evoked condition (OR=0.56, P<0.001).
3.7. Relationship between recognition of emotion and
presence of AUs
The relationship between emotion recognition and
presence of AUs was assessed using multivariable
GEE logistic regression models, adjusting for the
multiple faces assessed by each student. Happy: Since
AU 12 was always present, no correlation could be
calculated. Of the remaining characteristic AUs, the
presence of 6, 7, and 26 was positively associated
with happy recognition. The OR of 4.27 for AU 6
being present means that correct identification of
happy faces is more than four times greater when AU
6 is present than when AU 6 is absent. Of the non-characteristic AUs, the presence of AU 2 was
positively and AU 20 was negatively associated with
happy recognition.
Sad: Of the characteristic AUs, 4, 17 and 25 were
positively associated; AUs 1, 7, 10 and 20 were not
correlated and AU 15 was negatively associated with
sad recognition.
Anger: Of the characteristic AUs, 4, 5 and 16 were
positively associated; AUs 7, 9, 10, 20, 25, and 26
were not associated with anger recognition. Of the
non-characteristic AUs, AUs 15 and 24 were positively, while the presence of AU 23 was negatively
associated with anger recognition. Anger was better
recognized in the evoked than in the posed condition.
Fear: Of the characteristic AUs, 5, 1 and 26 were
positively associated, AUs 2 and 20 were not
associated, and AU 4 was negatively associated with
fear recognition. Of the non-characteristic AUs, AU
10 and 38 were positively associated with fear
recognition. Fear was less often recognized in the
posed (OR=0.56, P<0.001) than in the evoked
condition. The presence of AUs and significant ORs
are described in Table 2.
Table 2
Presence of action units and correct emotion recognition

AU   Happy        Sad          Anger        Fear
1    cAU, n.s.    cAU, n.s.                 cAU, +
2    +                                      cAU, n.s.
4                 cAU, +       cAU, +       cAU, −
5                              cAU, +       cAU, +
6    cAU, +
7    cAU, +       cAU, n.s.    cAU, n.s.    n.s.
9                              cAU, n.s.
10   cAU, n.s.    cAU, n.s.    cAU, n.s.    +
12   cAU, n.a.
15                cAU, −       +
16                             cAU, +
17                cAU, +       n.s.
20   −            cAU, n.s.    cAU, n.s.    cAU, n.s.
23                             −
24                             +
25   cAU, n.s.    cAU, +       cAU, n.s.
26   cAU, +       n.s.         cAU, n.s.    cAU, +
27                                          n.s.
38                                          +

cAU=characteristic action unit. +/−=odds of correct recognition significantly higher/lower when the AU was present (all P values <0.001, except for AU 25 in Sad: P=0.007). n.a.=not applicable, could not be calculated (AU 12 was present in all Happy faces). n.s.=not significant. Empty cells: AU not included in the model for that emotion.
Table 3
Recognition rates and odds ratios for emotional faces without characteristic action units (cAU)*

Emotion   cAU       No cAUs present (%/OR)   1 cAU present (%/OR)   2 cAUs present (%/OR)   3 cAUs present (%/OR)
Happy     12,6,7    n.a./–                   71.3/–                 90.5/3.76               95.2/7.79
Sad       4,17,25   68.7/–                   84.0/2.38              86.6/2.89               90.2/4.16
Anger     4,5,16    24.8/–                   60.7/4.72              77.5/10.47              85.9/18.42
Fear      1,5,26    32.0/–                   54.3/2.45              65.0/3.83               80.4/8.37

n.a. since AU 12 is always present. OR=odds ratio for recognition of faces with one, two or three characteristic AUs present for the specific emotion, compared with faces without these characteristic AUs (Sad, Anger and Fear) or, for Happy, with one characteristic AU present.
* Limited to cAUs with positive correlations with recognition.
3.8. Relationship between recognition of emotion and
absence of characteristic AUs
In an effort to examine the recognizability of
emotional faces lacking key components—or characteristic AUs—for the specific emotion, we present
odds ratios for recognition of Sad, Angry and Fear
faces with none, one, two or three of the characteristic
AUs present that are associated with recognition (Section 3.7) (Table 3). In Happy faces, AU 12 was
always present and the effect of its absence on
recognition of Happy could not be examined.
4. Discussion
This study examined the presence of action units
(AUs) in four universal emotions, how these AUs
differ according to gender of poser and condition, and
the relationship between presence or absence of AUs
and recognition of the expressed emotion.
In Happy faces, we found characteristic expressions to consist of raised inner eyebrows, tightened
lower eyelid, raised cheeks, upper lip raised and lip
corners turned upward. Recognition of Happy faces
was associated with characteristic AUs, such as cheek
raised and lid tightening, and, although infrequently
present, with outer brow raised. Since lip corner pull
was present in all faces and lip stretch was negatively associated with recognition, we conclude that the
former is essential for the facial expression of
happiness.
In Sad faces, we found characteristic expressions to
consist of furrowed eyebrows, opened mouth with
upper lip being raised, lip corners stretched and turned
down, and pulled up chin. Of eight characteristic AUs for Sad, only brow lower and chin raised were associated with recognition, but these correlations
were much lower than what was found for other
emotions. It appears that most of the frequent AUs in
Sad were common to other emotions as well and did
not contribute to its recognition. In addition, lip corner depression, which, based on findings by Ekman and Gosselin, was thought to be essential to Sad, was present in only 25% of Sad faces and was negatively associated with recognition. In our set of faces, it
appears that recognition of Sad depends less on the presence of single AUs than on the combination of AUs or the "Gestalt" of facial expression.
In Anger faces, we found characteristic expressions
to consist of furrowed—or lowered—eyebrows, eyes
wide open with tightened lower lid, raised upper and depressed lower lips exposing teeth, and stretched lip
corners. Recognition of Anger faces was associated
with only three of nine characteristic or frequent
AUs—brow lower, lower lip depression and upper lid
raised—but with three non-characteristic AUs.
Although infrequent in Anger faces, lips being
pressed together (9%) and lip corners turned down
(19%) were associated positively with recognition,
whereas tightened lips (16%) impeded recognition.
In Fear faces, we found characteristic expressions
to consist of eyes wide open with furrowed and raised
eyebrows and stretched mouth. Recognition of Fear
faces was associated most highly with upper lip raised
(13%) and to a lesser extent with nostril dilation
(19%), both of which occurred infrequently in Fear
faces. Of the characteristic AUs, fear recognition was
associated with inner—but not outer—brow being
raised and widened eyes and was impeded by brow
lower, a frequent AU (43%) in Fear.
In addition, we found in every emotion, except
Anger, opening of the mouth, as represented by lips
parted or jaw dropped, to correlate with improved
recognition.
Examining recognition of faces lacking one, two or
three characteristic AUs that are associated with
recognition of the specific emotion, we found Happy
and Sad faces to be well recognized with none or only
one of the characteristic AUs being present. This was
different for Anger and Fear faces, which were poorly
recognized when no characteristic AUs were present;
however, recognition improved markedly with the
presence of just one or two characteristic AUs.
In comparison with previous studies (Table 4),
there was considerable agreement about the presence
of AUs in each emotion as predicted by Ekman and
Friesen (1978, 1982) and as determined experimentally by Gosselin et al. (1997) and the present study.
Among the three groups, there is unanimous agreement for the presence of cheek raise and lip corner
pull in Happy; eyebrow lower and lip corner
depressed in Sad; eyes wide open, eyebrow lower
and mouth open in both Anger and Fear expressions.
Differences among the studies regarding which
AUs were found to be characteristic may be due to
different methodologies in how facial expressions
were obtained. In Gosselin’s study, actors were
presented with scenarios meant to evoke felt emotions, while in our study actors chose personal events.
The unfelt condition in Gosselin’s study used scenarios as well; in our posed condition, actors were simply
instructed to convey a target emotion with their face. Lastly, whereas Gosselin's study used videotaped acquisition, our study used still photos.

Table 4
Comparison of characteristic action units from three studies (AUs 1-27, for Happy, Sad, Anger and Fear expressions)

E=predictions by Ekman and Friesen (1978, 1982). G=findings by Gosselin et al. (1997) (>10% occurrence). K=findings by Kohler et al. (present study). *=frequently present AU. **=unique qualifying AU. X=unique disqualifying AU. F=felt (evoked) condition. U=unfelt (posed) condition.
With respect to evoked versus posed conditions,
we were surprised to find differences only for anger, where lip stretch was more common in the evoked state. This finding is consistent with the few differences between felt and unfelt encoding conditions reported in the Gosselin study and suggests that better recognition of evoked Sad, Anger and Fear expressions results from differences in intensities of expression. Unexpectedly, differences for gender were
limited to mouth open being more common in male sad faces. Overall, these findings offer support for the view that possible condition- and gender-related differences in the expression of emotions are not based on qualitatively different facial expressions. Such differences,
if present, may be related to quantitative differences,
i.e. differences in intensities in facial expressions,
which were not investigated in our study.
An argument can be made that visual inspection of
action units has limited sensitivity to detect small and
fleeting facial changes and certainly microexpressions, which are defined as emotion-specific muscle patterns in the absence of overt expression (Tassinary and Cacioppo, 1992). In our study, all expressions were of extreme intensity to minimize below-threshold activation of action units and ambiguous expressions. In addition, all expressions were judged to
represent the intended target expression during the
acquisition phase (Gur et al., 2002a,b,c) and selected
as a representative expression upon construction of
the fMRI task.
Another potential limitation is that photos capture emotional expressions at a fixed point in time, raising the possibility that the most valent expression or the sequence of facial changes essential for the expression of a particular emotion was not captured. The
photographic stimuli used were carefully assembled
from an archive of over 150 people and over 5000
pictures that underwent standardized procedures for
expression of emotion, and images with closed eyes or
marked head tilt were not included.
In the present study, we did not compare intensity
ratings and laterality ratings. Our investigation examined whether facial changes were present, rather than
the degree of presence or the presence on one side of
the face. Previous research has shown posed and
deliberate emotions to include more asymmetric
muscle activity than evoked emotions (Ekman et al.,
1981, 1982) and the left side of the face to be more
expressive (Sackeim et al., 1978; Indersmitten and
Gur, 2003). Since we examined facial expressions of
extreme emotional intensity, we did not explore the
effect of intensity of emotional experience on facial
expression.
Future directions to evaluate facial expression of
emotions based on FACS in healthy persons and
persons with psychiatric disorders may explore issues
of data acquisition, selection of stimuli and methods
of facial ratings. Data acquisition with facial EMG
using multiple electrodes may yield more comprehensive and accurate findings regarding facial movements, in particular with respect to subtle movements,
chronological sequence and facial laterality of action
units.
Laterality of facial expressions of emotions of different intensity, and its effect on recognition, can be investigated by assembling right-right and left-left
composite faces. A potential limitation of assembling
composite faces lies in the natural asymmetry of
faces and the need for accurate three-planar alignment
of the images, which, if not performed correctly, can
produce odd and unnatural looking composites.
Since FACS represents a laborious and inexact
method to measure facial changes, attempts have been
made to create automated programs that measure facial
muscle movements. Cohn et al. (1999) and Bartlett et
al. (1999) reported on isolated and simple combinations of posed facial action units, but this analysis has
not been applied to emotional faces. These methodologies provide qualitative AU-based analyses during
the course of an expression; however, they are unable
to further quantify a change in facial expression.
Quantification of fine-grained structural—or pointwise—changes in the face, employing advanced
morphometric tools, is needed to capture the subtlety
of human expression. To pursue this aim, we have
developed a novel methodology for quantitative
analysis of facial expressions, which is capable of
measuring expansions and contractions of select facial
regions and their boundaries (Verma et al., in press).
In psychiatric disorders, qualitative and quantitative impairment of facial expression of emotions has
received limited attention, while a large body of
literature exists on emotion recognition deficits,
particularly in schizophrenia. In schizophrenia but
not affective illness, impaired facial expressions of
emotions constitute characteristic symptoms of the
illness. It remains to be investigated whether persons
with psychiatric illness, such as schizophrenia or
affective disorders, display muted facial expressions
of emotions or recruit facial muscle movements which
in people without mental illness are not particular to
the expression of the target emotion. While the former would be associated with lack of recognition of the expressed emotion, the latter would lead to misidentification or misattribution of another emotion with
pronounced effects on non-verbal communication.
Acknowledgments
This work was supported by NIMH MH01839 and
MH43880.
References
Aniss, A.M., Sachdev, P.S., 1996. Concordance between surface
and intra-muscular recordings of facial EMG during emotional
expression. Electromyography and Clinical Neurophysiology
36, 73 – 79.
Bartlett, M.S., Hager, J.C., Ekman, P., Sejnowski, T.J., 1999. Measuring facial expressions by computer image analysis. Psychophysiology 36, 253 – 263.
Cohn, J.F., Zlochower, A.J., Lien, J., Kanade, T., 1999.
Automated face analysis by feature point tracking has high
concurrent validity with manual FACS coding. Psychophysiology 36, 35 – 43.
Darwin, C., 1965. The Expression of Emotions in Man and
Animals. University of Chicago Press, Chicago.
Eibl-Eibesfeldt, I., 1970. Ethology, The Biology of Behavior. Holt,
Rinehart & Winston, New York.
Ekman, P., Friesen, W.V., 1975. Unmasking the Face. Prentice-Hall,
Englewood Cliffs, NJ.
Ekman, P., Friesen, W.V., 1978. Manual of the Facial Action Coding
System (FACS). Consulting Psychologists Press, Palo Alto, CA.
Ekman, P., Friesen, W.V., 1982. Felt, false, and miserable smiles.
Journal of Nonverbal Behavior 6, 238 – 252.
Erwin, R.J., Gur, R.C., Gur, R.E., Skolnick, B., Mawhinney-Hee,
M., Smailis, J., 1992. Facial emotion discrimination: I. Task
construction and behavioral findings in normal subjects.
Psychiatry Research 42, 231 – 240.
Gosselin, P., Kirouac, G., Doré, F.Y., 1997. Components and
recognition of facial expression in the communication of
emotion by actors. In: Ekman, P., Rosenberg, E. (Eds.), What
the Face Reveals: Basic and Applied Studies of Spontaneous
Expression Using the Facial Action Coding System (FACS).
Oxford University Press, New York, pp. 239 – 267.
Gur, R.C., Sara, R., Hagendoorn, M., Marom, O., Hughett, P.,
Macy, L., Turner, T., Bajcsy, R., Posner, A., Gur, R.E., 2002a. A
method for obtaining 3-dimensional facial expressions and its
standardization for use in neurocognitive studies. Journal of
Neuroscience Methods 115, 137 – 143.
Gur, R.C., Schroeder, L., Turner, T., McGrath, C., Chan, R.M.,
Turetsky, B.I., Alsop, D., Maldjian, J., Gur, R.E., 2002b. Brain
activation during facial emotion processing. NeuroImage 16,
651 – 662.
Gur, R.E., McGrath, C., Chan, R.M., Schroeder, L., Turner, T.,
Turetsky, B.I., Kohler, C., Alsop, D., Maldjian, J., Ragland,
J.D., Gur, R.C., 2002c. An fMRI study of facial emotion
processing in schizophrenia. American Journal of Psychiatry
159, 1992 – 1999.
Huber, E., 1931. Evolution of Facial Musculature and Facial Expression. The Johns Hopkins University Press, Baltimore, MD.
Indersmitten, T., Gur, R.C., 2003. Emotion processing in chimeric
faces: hemispheric asymmetries and expression and recognition
of emotions. Journal of Neuroscience 23, 3820 – 3825.
Izard, C.E., 1971. The Face of Emotion. Appleton-Century-Crofts,
New York.
Kohler, C.G., Turner, T.T., Bilker, W.B., Brensinger, C., Siegel, S.J.,
Kanes, S.J., Gur, R.E., Gur, R.C., 2003. Facial emotion
recognition in schizophrenia: intensity effects and error pattern.
American Journal of Psychiatry 160, 1168 – 1174.
Kohler, C.G., Turner, T.T., Gur, R.E., Gur, R.C., 2004. Recognition
of facial emotions in neuropsychiatric disorders. CNS Spectrums 9, 267 – 274.
Plutchik, R., 1962. The Emotions. Random House, New York.
Sackeim, H.A., Gur, R.E., Saucy, M.C., 1978. Emotions are
expressed more intensely on the left side of the face. Science
202, 433 – 435.
Schwartz, G.E., Fair, P.L., Salt, P., Mandel, M.R., Klerman,
G.L., 1976. Facial expressions and imagery in depression:
an electromyographic study. Psychosomatic Medicine 38,
337 – 347.
Tassinary, L.G., Cacioppo, J.T., 1989. A psychometric study of
surface electrode placements for facial electromyographic
recording: the brow and cheek muscle regions. Psychophysiology 26, 1 – 16.
Tassinary, L.G., Cacioppo, J.T., 1992. Unobservable facial actions
and emotion. Psychological Science 3, 28 – 33.
Verma, R., Davatzikos, C., Loughead, J., Indersmitten, T., Hu, R.,
Kohler, C., Gur, R.E., Gur, R.C., in press. Quantification of
facial expressions using high dimensional shape transformations. Journal of Neuroscience Methods.