Novack & Goldin-Meadow (2015)
DOI 10.1007/s10648-015-9325-3
Educ Psychol Rev (2015) 27:405–412
Review Article
Abstract When people talk, they gesture, and those gestures often reveal information that
cannot be found in speech. Learners are no exception. A learner’s gestures can index moments
of conceptual instability, and teachers can make use of those gestures to gain access to a
student’s thinking. Learners can also discover novel ideas from the gestures they produce
during a lesson or from the gestures they see their teachers produce. Gesture thus has the
power not only to reflect a learner’s understanding of a problem but also to change that
understanding. This review explores how gesture supports learning across development and
ends by offering suggestions for ways in which gesture can be recruited in educational settings.
Introduction
When people talk, they gesture—in fact, they cannot help it. Gesture is so pervasive that
people routinely do it on the phone when no one can see them. Even congenitally blind
individuals, who have never seen anyone gesture, move their hands when they speak (Iverson
and Goldin-Meadow 1998). Gestures add a spatial or imagistic component to spoken language, and since gestures are not confined to the linear, rule-based system of spoken language, they have the potential to express ideas that may be difficult to convey in words.
Since gestures occur most often in communicative contexts, we might guess that gesture's main purpose is to aid communication. While this is true to a certain extent, gesture also has
important consequences for thinking and learning beyond comprehension of language. More
than just clarifying or enhancing the message of a lesson, gesture can lead learners to insight
and promote conceptual development. The act of gesturing can also feed back and have effects
on gesturers themselves. This paper explores how these functions of gesture can be harnessed
to support learning and teaching across development.
The gestures that learners produce while explaining their reasoning can provide unique insight
into their thought processes. Imagine, for example, a child who does not yet understand the
concept of conservation of liquid quantity and believes that the amount of water changes when
it is poured from a tall, thin container into a short, fat container. When asked to justify this
belief, the child might say, "this one is taller than this one," while at the same time producing a
C-shaped gesture indicating the narrow width of the tall container, followed by a wider C-
shaped gesture indicating the larger width of the short container. The child is highlighting one
dimension of the containers in speech (height), but his hands make it clear that he is beginning
to think about a second dimension (width). His gesture is conveying different information than
his words.
Learners who produce different information in gesture than in speech are revealing, for all
the world to see, that they know more than they say. The information a learner conveys
uniquely in gesture is often encapsulated or implicit knowledge, not yet accessible to explicit
understanding (Alibali and Goldin-Meadow 1993). Yet, since gesture is not bound by the
conventions of spoken language, it can provide an alternative avenue through which the
learner can explore new and alternative hypotheses. Producing these new ideas in gesture is
an excellent signal that the learner is in a transitional state and ready to make use of relevant
input (Goldin-Meadow 2003). More specifically, children whose gestures convey different
ideas from their speech when they explain a task are more likely to profit from instruction in
that task than children whose gestures are redundant with their speech (Church and Goldin-
Meadow 1986; Perry et al. 1988). Thus, gesture is a red flag—a marker that a learner is in a
prime state for learning.
But are these subtle cues accessible only to trained researchers, painstakingly coding and re-coding learners' gestures? The answer is, thankfully, no. Teachers, as well as college undergraduates, are sensitive to the content conveyed uniquely in students' gestures. Untrained
adults are able to see beyond children’s speech in assessing their knowledge (Alibali et al.
1997). If asked to instruct children who do, and do not, convey information uniquely in
gesture, adults are more likely to provide rich instruction to children whose gestures differ
from their speech than to children whose gestures match their speech (Goldin-Meadow and
Singer 2003), suggesting that adults are aware (although not necessarily consciously) that
gesture is a tool they can use to predict who will profit from instruction.
Since gesture provides an avenue through which learners can consider new ideas, what
happens to children who do not gesture? Might they be missing out on important learning
opportunities? It turns out that simply encouraging learners to gesture can allow these implicit
ideas to surface. Broaders et al. (2007) asked children to explain their solutions to incorrectly
solved mathematical problems; they then asked them to solve a new set of comparable
problems and encouraged half the children to gesture as they explained their solutions.
Broaders et al. found that children told to gesture added novel strategies to their repertoires, but
those strategies were found only in gesture; children who were not encouraged to gesture did
not add strategies to their repertoires in either gesture or speech. Importantly, children who
added novel strategies uniquely in gesture were also more likely to profit from instruction in
math—after the lesson (when they were no longer gesturing), the children were able to solve
math problems on a paper-and-pencil test that they could not solve before the lesson.
Encouraging children to move their hands can activate implicit, and correct, ideas that then
prime children for learning.
The same phenomenon has recently been replicated in the domain of moral reasoning.
Children who were encouraged to gesture while explaining their reasoning about a moral
dilemma (debating contractual obligations against obedience to a higher authority) were more
likely to express multiple perspectives in gesture (reflecting a greater understanding of the
multiple elements involved in the dilemma) than children who were not encouraged to gesture.
Moreover, after a lesson in moral understanding (when children were no longer gesturing at
high rates), the children who had gestured during the lesson increased the number of
perspectives they mentioned in speech; in fact, the more perspectives they produced in gesture prior to the lesson, the more they produced in speech after the lesson (Beaudoin-Ryan and Goldin-Meadow 2014). Encouraging children to move their hands
activates implicit ideas that prime children for learning not only in spatial domains, like math,
but also in inherently nonspatial domains, like morality.
If spontaneously producing correct strategies uniquely in gesture leads to learning,
what happens if children are explicitly taught to produce correct strategies in gesture?
Goldin-Meadow et al. (2009) gave third-graders instruction in how to solve a missing
addend mathematical problem, such as 3+4+9=__+9, by teaching them to produce
gestures that represented a correct strategy for solving the problem. All of the children
were taught to say the words, "I want to make one side equal to the other side" (equivalence, a correct strategy for solving the problem). Children in one group (the fully correct gesture condition) were taught to produce another correct strategy (grouping) in gesture—they produced a V-point gesture with their index and middle fingers to
the first two numbers in a math problem (i.e., the 3 and 4 in the 3+4+9=__+9
problem) and then pointed at the blank on the other side of the equation; the V-point
gesture, which represents the idea that the problem can be solved by grouping and then
adding the two addends, is one that children spontaneously produce when explaining
their correct solutions to these problems (Perry et al. 1988). Children in another group
(the partially correct gesture condition) were taught to produce a partially correct
grouping strategy in gesture—they produced their V-point to the second two numbers
in the problem (i.e., the 4 and 9 in the example) and then pointed at the blank; this
gesture highlighted grouping but focused children’s attention on the wrong numbers. A
final group of children was taught no gestures at all and just learned the equivalence
strategy in speech (the speech alone condition). Encouraging children to produce the
grouping gesture during the lesson led them to produce the strategy explicitly in speech
after the lesson and to solve more problems correctly on a written posttest (when they
were no longer gesturing) than they had solved prior to the lesson. Interestingly, even
children in the partially correct gesture condition, whose attention was directed to the
wrong numbers, improved relative to children in the speech alone condition (although
not as often as children in the fully correct gesture condition), suggesting that gesture
was doing more than just directing the child’s attention. If all gesture did was direct
attention, then children in the partially correct gesture condition should have performed
worse than children in the speech alone condition. Teaching children to produce correct
strategies uniquely in gesture can lead to learning.
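As a concrete illustration of the grouping strategy described above (the worked numbers follow directly from the example problem):

```latex
3 + 4 + 9 = \underline{\;\;\;\;} + 9
\;\Rightarrow\; \underbrace{3 + 4}_{\text{grouped addends}} = 7
\;\Rightarrow\; 3 + 4 + 9 = 7 + 9
```

The partially correct V-point, by contrast, grouped the 4 and the 9, which sum to 13 and do not correctly fill the blank.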
How does gesture promote learning? There are likely many mechanisms through which
gesture has its effects. For example, gesture can link abstract concepts to the immediate environment (Alibali et al. 2014), gesture can reduce cognitive load (Goldin-Meadow et al.
2001; Hu et al. 2015; Ping and Goldin-Meadow 2010), and gesture can enhance spoken
communication (Hostetter 2011). In addition, since gesture is an act of the body, its effects on
learning may stem, at least in part, from its capacity to engage the motor system (see Ping et al.
2014). Motor experience has been shown to shape learning in a variety of domains (e.g.,
Glenberg et al. 2007; Smith 2005; Sommerville et al. 2005; Wiedenbauer and Jansen-Osmann
2008), and gesture, which is a motoric act, may rely on similar mechanisms. New fMRI
evidence finds that children who produce speech and gesture when learning about mathematics
problems are more likely to later activate motor regions when just passively solving the math
problems in a scanner than children who produce only speech when learning about the same
problems (Wakefield et al. 2015). This pattern of activation largely overlaps with neural
networks involved in learning through action on objects (cf. James and Swain 2011;
Johnson-Frey 2004). Gesture may thus support learning because it is a type of action.
However, gesture is representational action, making it different from action on objects,
which is meant to carry out functions, not represent ideas. This difference may be precisely the
feature that is responsible for gesture’s unique effects on learning. To explore this idea, Novack
et al. (2014) taught children how to solve math problems while either producing actions on
objects or producing gestures over objects. All children learned how to solve the types of
problems on which they were trained, but only children who gestured were able to transfer
their understanding to novel problem types, that is, to generalize their knowledge, an essential
component of learning. Action, which involves manipulating objects, might lead children to
think that their learned actions are relevant only to those objects themselves, resulting in
shallow learning (see McNeil and Uttal 2009). In contrast, gesture, which occurs off objects,
provides a physical distance, which may be critical for abstracting away from a particular
context and generalizing to new contexts. In other words, action may hinder generalization by
focusing learners on details that get in the way of transfer, whereas gesture may facilitate
generalization by focusing learners on dimensions that lead to transfer. Thus, although gesture
may work the way it does in part because it is itself an action, the fact that it is representational
action may be key to the role it can play in promoting deep learning.
Gesture can also support learning when children see a teacher gesture during instruction and do
not produce gestures of their own (Cook et al. 2013; Church et al. 2004; Perry et al. 1995;
Valenzeno et al. 2003). For example, the pointing and tracing gestures that teachers use to indicate the symmetry of shapes help preschoolers learn the concept of bilateral symmetry
(Valenzeno et al. 2003). This effect might grow out of gesture’s ability to ground the abstract
language of the lesson in the concrete physical environment (Valenzeno et al. 2003). But,
seeing gesture can also support learning in the absence of physical objects. Ping and Goldin-
Meadow (2008) gave 5- to 7-year-olds instruction in Piagetian conservation, either with
gestures or without gestures; in addition, the teacher’s gestures, which represented the relative
widths and heights of the two glasses, were produced either in the presence of the objects to
which they referred or in the absence of those objects. Children who received instruction with
gesture improved more than children who received instruction without gesture, whether or not
objects were present. This finding suggests that gesture instruction supports learning, not only
by focusing learners’ attention to objects but also by conveying ideas through its representa-
tional form.
Finally, including gesture in instruction allows teachers to provide students with multiple
strategies at the same time. Singer and Goldin-Meadow (2005) found that children learned
more from a math lesson in which a teacher simultaneously presented two correct strategies,
one in speech and another in gesture (speech + gesture), compared to a lesson in which the
teacher presented the same two strategies entirely in speech, which, of course, had to be
produced sequentially (speech → speech). Gesture’s power may come, at least in part, from its
ability to co-occur with speech. If so, presenting the gesture strategy after the spoken strategy
(speech → gesture) should be less effective than presenting the gesture strategy along with the
spoken strategy (speech + gesture). Congdon et al. (2015) indeed found that speech → gesture
instruction was significantly less effective for both generalization and retention than speech +
gesture instruction and was, in fact, no better than speech → speech instruction.
Gesture is not only a useful tool for school-aged children learning about abstract concepts like mathematics but can also support learning in infancy and early childhood. Babies as young as
4.5 months direct their attention to dynamic points (Rohlfing et al. 2012) and at 1 year benefit
from synchronous speech-pointing combinations in word-learning contexts (de Villiers Rader
and Zukow-Goldring 2012). Infants begin producing gestures themselves around 9 months,
even before they begin speaking (Bates 1976) and at 1 year use points to inform others, as well
as to retrieve information from adults (Tomasello et al. 2007; Kovács et al. 2014).
Just as with older children, the gestures that infants and young children produce provide insight
into what they know, even before they can express that knowledge in words. Children will
begin to reference items in gesture when they are on the verge of being able to produce those
items in speech, and they produce supplemental speech + gesture combinations (e.g., pointing to a cup and saying "daddy" to mean "daddy's cup") right before they begin to create two-word utterances in spoken language (Iverson and Goldin-Meadow 2005). Finally, children
who have not yet learned all of their number words are more accurate when labeling sets of
items in gesture than in speech (Gunderson et al. 2015).
Like school-aged children, young children can benefit from explicit gesture interventions. Eighteen-month-olds who are given pointing training increase the
number of gesture meanings they express during spontaneous interactions with their
caregivers, which, in turn, increases their spoken language vocabulary (LeBarton
et al. 2015). Toddlers are more likely to learn the concept of "under" if given
instruction with gesture than if given instruction with pictures or just words
(McGregor et al. 2009). Finally, iconic gesture instruction (but not pointing instruc-
tion) teaches 2-year-old children how to operate a novel toy, suggesting that by
2 years of age, children can benefit from the representational structure of gesture in
a learning context (Novack et al. 2015). Learning from gesture is thus a pervasive
phenomenon across the lifespan.
There remain many open questions regarding the relative effectiveness of gesture in
instruction, and as such, work is still needed to delineate the situations in which
gesture is, and is not, helpful in a learning context. For example, recent studies
have found that the effect of producing specific gestures during instruction may
interact with prior knowledge; gesture can be less useful in a domain (and, in some
cases, can even be detrimental) for children with low competence in that domain
(Post et al. 2013; Wakefield and James 2015). In addition, gesture’s usefulness may
change with age—children’s ability to learn from representational gesture during
instruction may be constrained by their ability to interpret symbolic forms. In fact,
Novack et al. (2015) found that although 2-year-olds were able to learn the function
of a novel toy from watching an iconic gesture demonstration, they learned more
from watching an incomplete-action demonstration, suggesting that the capacity to
learn from gesture, relative to other types of instruction, may change with age.
However, acknowledging that many open questions remain, we believe that the evidence
currently available showing that gesture can play a formative role in learning across develop-
ment should give teachers and educators the confidence to introduce gesture into their
classrooms in several ways.
First, teachers can give students opportunities to gesture by asking them to explain their
answers to problems (which tends to spontaneously elicit gesture) and by explicitly encour-
aging them to use their hands during these explanations. Teachers also need to pay attention to
the information students convey in gesture. If teachers pay attention only to what students say
with their mouths, they will miss the knowledge that students display uniquely in their hands.
Since the information students convey in gesture is often at the cutting edge of their
knowledge, if teachers attend to those gestures, they will be better able to adapt their
instruction to the skills of their students.
Second, teachers can gesture themselves when teaching. Teachers’ gestures can
guide children’s attention and scaffold verbal information. Teachers’ gestures can
also have a trickle-down effect that might lead to increased student gesturing.
Research has found that when instructors gesture during a lesson, children are more
likely to gesture as well, which, in turn, leads them to profit from the lesson (Cook
and Goldin-Meadow 2006). By gesturing themselves, not only do teachers improve
the quality of a lesson but they also create a classroom culture that includes
gesturing.
Finally, gestures can be particularly helpful in a classroom setting when other
tools are not available. Gestures can be used to represent ideas that are challenging
to demonstrate (e.g., using the hands to represent molecules that are otherwise too
small to see, Stieff and Raje 2010). Gestures can also be used by children in place
of physical objects, like manipulatives. Unlike manipulatives, which can be cum-
bersome and likely have to remain in the classroom, hands travel with the learner.
Children can make use of their gestures wherever they are—in the classroom, at
home, and even on a test.
Gestures are portable, flexible, and well suited to a wide range of learning contexts. They
can be teaching tools for teachers as well as learning tools for students. If recog-
nized as more than mere hand-waving, gestures have the potential to support
learning across development.
Acknowledgments This work was supported by NIH grant number R01-HD047450 and NSF grant number BCS-0925595 to Goldin-Meadow, NSF grant number SBE-0541957 (Spatial Intelligence and Learning Center, Goldin-Meadow is a co-PI), and a grant from the Institute of Education Sciences (R305B090025) to S. Raudenbush in support of Novack.
References
Alibali, M. W., & Goldin-Meadow, S. (1993). Gesture-speech mismatch and mechanisms of learning: what the
hands reveal about a child’s state of mind. Cognitive Psychology, 25, 468–523. doi:10.1006/cogp.1993.
1012.
Alibali, M., Flevares, L., & Goldin-Meadow, S. (1997). Assessing knowledge conveyed in gesture:
do teachers have the upper hand? Journal of Educational Psychology, 89, 183–193. doi:10.
1037/0022-0663.89.1.183.
Alibali, M. W., Nathan, M. J., Wolfgram, M. S., Church, R. B., Jacobs, S. A., Johnson Martinez, C., & Knuth, E.
J. (2014). How teachers link ideas in mathematics instruction using speech and gesture: a corpus analysis.
Cognition and Instruction, 32, 65–100. doi:10.1080/07370008.2013.858161.
Bates, E. (1976). Language and context: the acquisition of pragmatics. New York: Academic Press.
Beaudoin-Ryan, L., & Goldin-Meadow, S. (2014). Teaching moral reasoning through gesture. Developmental Science, 17, 984–990. doi:10.1111/desc.12180.
Broaders, S., Cook, S. W., Mitchell, Z., & Goldin-Meadow, S. (2007). Making children gesture reveals implicit
knowledge and leads to learning. Journal of Experimental Psychology: General, 136, 539–550. doi:10.1037/
0096-3445.136.4.539.
Church, R. B., & Goldin-Meadow, S. (1986). The mismatch between gesture and speech as an index of
transitional knowledge. Cognition, 23, 43–71. doi:10.1016/0010-0277(86)90053-3.
Church, R. B., Ayman-Nolley, S., & Mahootian, S. (2004). The role of gesture in bilingual education: does
gesture enhance learning? International Journal of Bilingual Education & Bilingualism, 7, 303–320. doi:10.
1080/13670050408667815.
Congdon, E. L., Novack, M. A., Brooks, N., Hemani-Lopez, N., O'Keefe, L., & Goldin-Meadow, S. (2015). Better together: simultaneous presentation of speech and gesture in math instruction supports generalization and retention. Manuscript submitted for publication.
Cook, S. W., & Goldin-Meadow, S. (2006). The role of gesture in learning: do children use their hands to change
their minds? Journal of Cognition and Development, 7, 211. doi:10.1207/s15327647jcd0702_4.
Cook, S. W., Duffy, R. G., & Fenn, K. M. (2013). Consolidation and transfer of learning after observing hand
gesture. Child Development, 84, 1863–1871. doi:10.1111/cdev.12097.
de Villiers Rader, N., & Zukow-Goldring, P. (2012). How the hands control attention during early word learning.
Gesture, 10, 202–221. doi:10.1075/gest.10.2-3.05rad.
Glenberg, A. M., Brown, M., & Levin, J. R. (2007). Enhancing comprehension in small reading groups using a
manipulation strategy. Contemporary Educational Psychology, 32, 389–399. doi:10.1016/j.cedpsych.2006.
03.001.
Goldin-Meadow, S. (2003). Hearing gesture: how our hands help us think. Cambridge: Harvard University
Press.
Goldin-Meadow, S., & Singer, M. A. (2003). From children’s hands to adults’ ears: gesture’s role in the learning
process. Developmental Psychology, 39, 509–520. doi:10.1037/0012-1649.39.3.509.
Goldin-Meadow, S., Nusbaum, H., Kelly, S. D., & Wagner, S. (2001). Explaining math: gesturing lightens the
load. Psychological Science, 12, 516–522. doi:10.1111/1467-9280.00395.
Goldin-Meadow, S., Cook, S. W., & Mitchell, Z. (2009). Gesture gives children new ideas about math. Psychological Science, 20, 267–271. doi:10.1111/j.1467-9280.2009.02297.x.
Gunderson, E. A., Spaepen, E., Gibson, D., Goldin-Meadow, S., & Levine, S. C. (2015). Gesture as a window onto children's number knowledge. Cognition, 144, 14–28. doi:10.1016/j.cognition.2015.07.008.
Hostetter, A. B. (2011). When do gestures communicate? A meta-analysis. Psychological Bulletin, 137, 297–
315. doi:10.1037/a0022128.
Hu, F. T., Ginns, P., & Bobis, J. (2015). Getting the point: tracing worked examples enhances learning. Learning
and Instruction, 35, 85–93. doi:10.1016/j.learninstruc.2014.10.002.
Iverson, J. M., & Goldin-Meadow, S. (1998). Why people gesture as they speak. Nature, 396, 228. doi:10.1038/
24300.
Iverson, J. M., & Goldin-Meadow, S. (2005). Gesture paves the way for language development.
Psychological Science, 16, 368–371. doi:10.1111/j.0956-7976.2005.01542.x.
James, K. H., & Swain, S. N. (2011). Only self-generated actions create sensori-motor systems in the developing brain. Developmental Science, 14, 1–6. doi:10.1111/j.1467-7687.2010.01011.x.
Johnson-Frey, S. H. (2004). The neural basis of complex tool use in humans. Trends in Cognitive Sciences, 8,
227–237. doi:10.1016/j.tics.2003.12.002.
Kovács, Á. M., Tauzin, T., Téglás, E., Gergely, G., & Csibra, G. (2014). Pointing as epistemic request: 12-month-
olds point to receive new information. Infancy, 19, 543–557. doi:10.1111/infa.12060.
LeBarton, E. S., Goldin-Meadow, S., & Raudenbush, S. (2015). Experimentally induced increases in early
gesture lead to increases in spoken vocabulary. Journal of Cognition and Development, 16, 199–220. doi:10.
1080/15248372.2013.858041.
McGregor, K. K., Rohlfing, K. J., Bean, A., & Marschner, E. (2009). Gesture as a support for word learning: the
case of under. Journal of Child Language, 36, 807–828. doi:10.1017/S0305000908009173.
McNeil, N. M., & Uttal, D. H. (2009). Rethinking the use of concrete materials in learning: perspectives from
development and education. Child Development Perspectives, 3, 137–139. doi:10.1111/j.1750-8606.2009.
00093.x.
Novack, M. A., Congdon, E., Hemani-Lopez, N., & Goldin-Meadow, S. (2014). From action to abstraction: using the
hands to learn math. Psychological Science, 25, 903–910. doi:10.1177/0956797613518351.
Novack, M. A., Goldin-Meadow, S., & Woodward, A. (2015). Learning from gesture: how early does it happen?
Cognition, 142, 138–147. doi:10.1016/j.cognition.2015.05.018.
Perry, M., Church, R. B., & Goldin-Meadow, S. (1988). Transitional knowledge in the acquisition of concepts.
Cognitive Development, 3, 359–400. doi:10.1016/0885-2014(88)90021-4.
Perry, M., Berch, D., & Singleton, J. (1995). Constructing shared understanding: the role of nonverbal input in
learning contexts. Journal of Contemporary Legal Issues, 6, 213–235.
Ping, R. M., & Goldin-Meadow, S. (2008). Hands in the air: using ungrounded iconic gestures to teach children
conservation of quantity. Developmental Psychology, 44, 1277–1287. doi:10.1037/0012-1649.44.5.1277.
Ping, R., & Goldin-Meadow, S. (2010). Gesturing saves cognitive resources when talking about nonpresent
objects. Cognitive Science, 34, 602–619. doi:10.1111/j.1551-6709.2010.01102.x.
Ping, R., Goldin-Meadow, S., & Beilock, S. (2014). Understanding gesture: is the listener's motor system involved? Journal of Experimental Psychology: General, 143, 195–204. doi:10.1037/a0032246.
Post, L. S., van Gog, T., Paas, F., & Zwaan, R. A. (2013). Effects of simultaneously observing and making
gestures while studying grammar animations on cognitive load and learning. Computers in Human Behavior,
29, 1450–1455. doi:10.1016/j.chb.2013.01.005.
Rohlfing, K. J., Longo, M. R., & Bertenthal, B. I. (2012). Dynamic pointing triggers shifts of visual attention in
young infants. Developmental Science, 15, 426–435. doi:10.1111/j.1467-7687.2012.01139.x.
Singer, M. A., & Goldin-Meadow, S. (2005). Children learn when their teacher’s gestures and speech differ.
Psychological Science, 16, 85–89. doi:10.1111/j.0956-7976.2005.00786.x.
Smith, L. (2005). Cognition as a dynamic system: principles from embodiment. Developmental Review, 25, 278–298. doi:10.1016/j.dr.2005.11.001.
Sommerville, J. A., Woodward, A. L., & Needham, A. (2005). Action experience alters 3-month-old infants’
perception of others’ actions. Cognition, 96, B1–B11. doi:10.1016/j.cognition.2004.07.004.
Stieff, M., & Raje, S. (2010). Expert algorithmic and imagistic problem solving strategies in advanced chemistry.
Spatial Cognition & Computation, 10, 53–81. doi:10.1080/13875860903453332.
Tomasello, M., Carpenter, M., & Liszkowski, U. (2007). A new look at infant pointing. Child Development, 78,
705–722. doi:10.1111/j.1467-8624.2007.01025.x.
Valenzeno, L., Alibali, M. W., & Klatzky, R. (2003). Teachers’ gestures facilitate students’ learning: a lesson in
symmetry. Contemporary Educational Psychology, 28, 187–204. doi:10.1016/s0361-476x(02)00007-3.
Wakefield, E. M., & James, K. H. (2015). Effects of learning with gesture on children's understanding of a new language concept. Developmental Psychology. Advance online publication. doi:10.1037/a0039471.
Wakefield, E. M., Congdon, E., Novack, M. A., Goldin-Meadow, S., & James, K. H. (2015). Learning math by
hand: the neural effects of gesture-based instruction in 8-year-old children. Manuscript submitted for
publication.
Wiedenbauer, G., & Jansen-Osmann, P. (2008). Manual training of mental rotation in children. Learning and
Instruction, 18, 30–41. doi:10.1016/j.learninstruc.2006.09.009.