Computer Games in Education
PS70CH22_Mayer ARI 8 November 2018 14:2
Contents
INTRODUCTION TO COMPUTER GAMES FOR EDUCATION
  Objective and Rationale for Scientific Research on Computer Games for Education
  Theoretical Framework for Learning with Computer Games for Education
  Three Genres of Scientific Research on Computer Games for Education
VALUE-ADDED RESEARCH ON COMPUTER GAMES IN EDUCATION
  Description and Example of Value-Added Research
  Objective of Value-Added Research
  Review of Value-Added Research
  Limitations and Future Directions
COGNITIVE CONSEQUENCES RESEARCH ON COMPUTER GAMES IN EDUCATION
Annu. Rev. Psychol. 2019.70:531-549. Downloaded from www.annualreviews.org
Access provided by University of Tampere on 11/01/19. For personal use only.
In contrast to this optimistic view of computer games for education, some game researchers
take a more cautious approach by raising the issue of the need for evidence to test these claims
(Blumberg 2014, Honey & Hilton 2011, O’Neil & Perez 2008, Tobias & Fletcher 2011, Wouters
& van Oostendorp 2017). For example, in a consensus report from the National Research Council,
Honey & Hilton (2011, p. 21) concluded: “There is relatively little research evidence on the
effectiveness of simulations and games for learning.” In another review, Mayer (2011b, p. 281)
lamented: “Many strong claims are made for the educational value of computer games, but there
is little strong evidence to back up those claims.”
In opting to take an evidence-based approach in this article, I seek to lay out the requirements
of scientific research methodologies that can address fundamental questions about game-based
learning and to summarize and systematize the existing research base on game-based learning. In
short, my approach to understanding computer games for education is to focus on evidence and
evidence-based theory rather than claims and promises.
The history of educational technologies—ranging from motion pictures in the early 1900s,
to radio in the 1930s, to educational television in the 1950s, to programmed instruction in the
1960s—is rife with cycles of extravagant claims followed by large-scale implementation in schools
followed by disappointing results and abandonment of the technology (Cuban 1986, Saettler 2004).
Time will tell whether computer games will also fit into this well-worn script for cutting-edge
educational technologies; however, today, we come equipped with research methods, evidence,
and learning theories that can help us avoid the patterns of the past. In this case, we are better
situated to use the tools of science to guide how we use the latest educational technology. This is
the ultimate goal of this article.
Specifically, this review examines three basic research questions concerning computer games
for education:
1. Which game features produce learning?
2. Do people learn anything useful from playing a computer game?
3. Do people learn academic material better from computer games or from conventional media?
In taking an evidence-based approach, this review addresses these questions by systematically
reviewing research evidence and evidence-based theories of how people learn.
Figure 1
Cognitive model of multimedia learning. This model summarizes the cognitive processes and mental representations involved in multimedia learning. (In the figure, a multimedia presentation enters sensory memory, moves to working memory, and is integrated with prior knowledge activated from long-term memory.)
active cognitive processing during learning. Examples of this active cognitive processing include
attending to relevant incoming material (i.e., selecting), mentally organizing it into a coherent
representation (i.e., organizing), and connecting the incoming material with relevant existing
knowledge activated from long-term memory (i.e., integrating).
In Figure 1, the words and pictures in instructional material (such as a computer game deliv-
ered on the screen of a desktop computer, tablet, smartphone, or game console) enter the learner’s
visual sensory memory (through the eyes) and auditory sensory memory (through the ears), where
they quickly fade. If the learner attends to this fleeting information, some of it is transferred to
working memory for further processing (selecting). In working memory, the learner can mentally
organize the incoming words into a verbal model and the incoming images into a pictorial model
(organizing) and connect these verbal and pictorial representations with each other and with rele-
vant prior knowledge activated from long-term memory (integrating). The resulting constructed
knowledge is encoded into long-term memory for storage.
When people play an educational computer game, they can allocate their limited processing
capacity among three kinds of cognitive processing: (a) Extraneous processing is cognitive pro-
cessing that does not serve the instructional objective and is caused by poor instructional design
(such as when a game has many distracting features), (b) essential processing is cognitive pro-
cessing needed to mentally represent the relevant material and is caused by the complexity of
the to-be-learned content, and (c) generative processing is cognitive processing aimed at trying
to make sense of the material and is caused by the player’s motivation to exert effort. Com-
puter games may be particularly helpful in fostering generative processing but are susceptible
to creating extraneous processing. The goal of the instructional design of educational computer
games is to minimize extraneous processing, guide essential processing, and foster generative
processing.
To better understand game-based learning, two important additional elements need to be
added to this cognitive model of multimedia learning: motivation and metacognition. First, mo-
tivation refers to the learner’s willingness to exert effort to learn the material and is defined as an
internal state that initiates and maintains goal-directed behavior (Mayer 2011a, 2014). In Figure 1,
motivation is the force behind the activation of each of the arrows—that is, the force that initiates
and maintains the processes of selecting, organizing, and integrating. An advantage of educational
computer games is their supposed motivating power. An educational game is motivating to the
extent that people opt to play it, persist in playing it, and exert effort to master it.
There are five classes of motivational theories that contribute to explaining the motivational
power of games (Mayer 2014, Wentzel & Miele 2016):
1. Interest and value theories: People exert effort to learn when they are interested in and find
personal importance in the learning task or material.
2. Self-efficacy and attribution theories: People exert effort to learn when they see themselves
as competent for the task and believe that their efforts will lead to success.
3. Goal orientation theory: People exert effort to learn when their goal is to master the learning
task or material.
4. Self-determination and intrinsic motivation theories: People exert effort to learn when they
feel control over the learning task and when they experience internal rewards.
5. Social cue and embodiment theories: People exert effort when they experience a social
partnership with the instructor and when they can use their whole body during learning.
Although progress is being made in incorporating motivational and affective processes into
the cognitive theory of multimedia learning, as reflected in Moreno & Mayer’s (2007) Cognitive
Affective Model of Learning with Media (CAMLM), more work is needed to build an integrated
model of game-based learning.
Second, metacognition refers to learners’ awareness and control of their cognitive processing
during learning (Mayer 2011a). Per Figure 1, metacognition refers to monitoring the cognitive
processes of selecting, organizing, and integrating, as well as coordinating these processes and ad-
justing them as needed to achieve the learning goal. A suggested advantage for game-based learning
is that players become self-regulated learners who take responsibility for monitoring and control-
ling their cognitive processing during game play. Research on game-based learning may be able to
shed light on how to better incorporate metacognitive processes into cognitive theories of learning.
Table 1 lists three genres of experimental studies on the instructional effectiveness of com-
puter games for education: value-added research, cognitive consequences research, and media
comparison research. In value-added research, the primary research question concerns which
game features cause improvements in student learning, that is, which features are the active in-
gredients responsible for fostering learning. To answer this research question, the control group
plays a base version of an educational game, whereas the experimental group plays the same game
with one feature added. We consider the added feature to be promising if the experimental group
achieves a higher posttest score on the academic material provided in the game than the control
group, with an effect size greater than or equal to 0.4. A looming challenge in value-added research
is to control for time on task when one group may be primed to engage in more activities than
another, thereby threatening the requirement of experimental control.
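The 0.4 decision rule above can be made concrete with a small effect-size calculation. The sketch below is illustrative only: the `cohens_d` helper and the sample posttest scores are hypothetical, not drawn from any study reviewed here.

```python
import statistics

def cohens_d(experimental, control):
    """Cohen's d: difference in group means divided by the pooled standard deviation."""
    n1, n2 = len(experimental), len(control)
    s1, s2 = statistics.stdev(experimental), statistics.stdev(control)
    pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(experimental) - statistics.mean(control)) / pooled_sd

# Hypothetical posttest scores (percent correct), for illustration only.
base_game    = [55, 60, 58, 62, 57, 59, 61, 56]   # control: base version of the game
with_feature = [66, 70, 64, 72, 68, 67, 71, 69]   # experimental: same game plus one feature

d = cohens_d(with_feature, base_game)
promising = d >= 0.4  # the review's threshold for calling a feature promising
```

With made-up data like this, the threshold check is trivial; in practice the review aggregates such effect sizes across at least five experiments before declaring a feature promising.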
In cognitive consequences research, the primary research question concerns whether playing an
off-the-shelf computer game (i.e., experimental group) causes improvements in a targeted cognitive
skill relative to a control group that engages in a control activity such as playing a game that does not
appear to require the targeted skill (active control group) or not playing any game (passive control
group). We consider the game to be promising if the experimental group shows a greater pretest-
to-posttest gain on a targeted cognitive skill than the control group, with an effect size greater than
or equal to 0.4. A looming challenge in cognitive consequences research is to choose an appropriate
control activity and thus respect the requirement of experimental control. Cross-sectional studies
comparing expert and novice players violate the requirement of random assignment, making it
difficult to make causal attributions about any differences in cognitive skill performance.
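The cognitive consequences criterion compares pretest-to-posttest gains rather than posttests alone, which can be sketched the same way. Again, the `gain_d` helper and the sample scores are hypothetical illustrations, not taken from any cited study.

```python
import statistics

def gain_d(exp_pre, exp_post, ctrl_pre, ctrl_post):
    """Effect size on gain scores: difference in mean gains over the pooled SD of gains."""
    exp_gain = [post - pre for pre, post in zip(exp_pre, exp_post)]
    ctrl_gain = [post - pre for pre, post in zip(ctrl_pre, ctrl_post)]
    n1, n2 = len(exp_gain), len(ctrl_gain)
    s1, s2 = statistics.stdev(exp_gain), statistics.stdev(ctrl_gain)
    pooled = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(exp_gain) - statistics.mean(ctrl_gain)) / pooled

# Hypothetical scores on a test of the targeted cognitive skill, for illustration only.
game_pre, game_post = [40, 45, 42, 48, 44], [55, 58, 54, 60, 57]  # plays the target game
ctrl_pre, ctrl_post = [41, 44, 43, 47, 45], [44, 47, 45, 50, 48]  # active control game

d = gain_d(game_pre, game_post, ctrl_pre, ctrl_post)
```

Using gains rather than raw posttests is what makes the choice of control activity so consequential: an active control that also improves on the pretest shrinks the difference in gains.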
In media comparison research, the primary research question concerns whether people learn
academic content better from playing games than from conventional media. To answer this ques-
tion, we compare the learning outcomes of an experimental group, which plays a game that contains
the targeted academic material, to those of a control group, which receives the same academic
material via conventional media such as a textbook lesson, an online narrated animation, or a
face-to-face slideshow presentation. We consider the game to be more effective than conven-
tional instruction if the experimental group outperforms the control group on learning outcome
measures, with an effect size greater than or equal to 0.4. However, even if the experimental and
control groups produce equivalent learning outcome performance, this may be considered support
for game-based learning in light of the proposal that students are more likely to initiate and persist
with playing a game than viewing a conventional lesson. A looming challenge in media comparison
research is to equate the experimental and control groups in terms of instructional content and
instructional method, which reflects the requirement of experimental control (Clark 2001).
Figure 2
Screenshot from Design-a-Plant, an educational computer game aimed at teaching environmental science principles about plant growth in different environments.
For each genre of game research, it is worthwhile to examine the boundary conditions for the
effects, such as whether the size of the effect differs for different kinds of learners, academic content,
or learning contexts. The next three sections of this review examine the evidence generated by
each of the three genres of educational computer game research.
outcome scores by at least an average of 0.4 standard deviations across at least five experiments).
Based on a review by Mayer (2014), Table 2 lists five promising features.
First, across nine out of nine experiments all conducted in the same lab with Design-a-Plant
(Moreno & Mayer 2002, Moreno et al. 2001), students learned better in a computer game when
words were spoken rather than printed on the screen, yielding an average effect size of 1.4.
Second, across eight out of eight experiments (Cordova & Lepper 1996; Moreno & Mayer 2000,
2004; Wang et al. 2008), students learned better in a computer game when words were presented
in conversational style (e.g., using first- and second-person constructions involving terms like “I,”
“you,” or “your”) than formal style (e.g., using third-person constructions), yielding an average
effect size of 1.5.
Third, across seven out of seven experiments (de Jong et al. 1999, Fiorella & Mayer 2012,
Leutner 1993, Mayer et al. 2002, Swaak et al. 1998), providing students with pregame information
such as the names and descriptions of the key concepts in the lesson resulted in better learning
posttest scores, yielding an average effect size of 0.8.
Fourth, across 12 out of 15 experiments (Adams & Clark 2014, Cameron & Dwyer 2005,
Goldberg & Cannon-Bowers 2015, Leutner 1993, Mayer & Johnson 2010, Moreno & Mayer
2016, Serge et al. 2013, ter Vrugte et al. 2015, Van Eck & Dempsey 2002, Vandercruysse et al.
2016), providing advice or explanative feedback during the game resulted in better learning posttest
scores, yielding an average effect size of 0.7.
Fifth, across 13 of 16 experiments (Adams & Clark 2014, Clark et al. 2016, Fiorella & Mayer
2012, Hsu & Tsai 2013, Hsu et al. 2016, Johnson & Mayer 2010, Lee & Chen 2009, Mayer &
Johnson 2010, Moreno & Mayer 2005, O’Neil et al. 2014, Pilegard & Mayer 2016, ter Vrugte
et al. 2015), providing prompts for players to write or select explanations during the game resulted
in better learning posttest scores, yielding an average effect size of 0.5. The effect was stronger
for college students than for younger students, presumably because of the challenges of engaging
in self-explanation.
We can also consider three unpromising features, i.e., features that do not boost learning
outcome scores by at least an average of 0.4 standard deviations across at least five experiments.
First, one game feature that has attracted research attention is the level of realism. In a review,
Wouters & van Oostendorp (2017) reported a strong effect size greater than 1 favoring cartoon-like
representations rather than photo-realistic representations based on 10 comparisons. Similarly,
Mayer (2014) reported that, in five out of six comparisons, students learned better when games
were rendered on a desktop computer screen than in immersive virtual reality, with a median effect
size of −0.1. A straightforward conclusion is that realism added for its own sake is not a promising
game feature when the goal is to improve learning outcomes.
Other game features that have attracted research attention include collaboration (i.e., playing
in dyads or groups versus playing individually) and narrative theme (i.e., playing a game that
has a strong story line versus playing one that does not). However, in a review, Wouters & van
Oostendorp (2017) reported a negligible effect size of under d = 0.2 for collaboration based on
18 comparisons and for narrative theme based on 9 observations. A straightforward conclusion is
that the research evidence base does not yet justify adding collaboration or narrative theme when
the goal is to greatly improve learning, although these features do not appear to harm learning
and may be helpful for certain kinds of learners.
based on different instructional goals. Work is also needed to pinpoint the conditions under which
each feature is most helpful, e.g., for which kinds of learners or instructional objectives. Finally,
work is needed to help explain how the features work; that is, we need research on the cognitive and
motivational processes underlying game-based learning. This line of research requires techniques
for measuring cognitive and motivational processes during learning, including eye tracking, phys-
iological measurements, and cognitive neuroscience measures such as electroencephalography or
functional magnetic resonance imaging. In summary, continued value-added game research offers
the potential to create a powerful research base, pinpoint boundary conditions for each promising
feature, and help us understand how features affect the learning process.
Figure 3
Screenshot from Portal, a game that can be used to teach physics principles about force and motion.
These are examples of cognitive consequences research, the goal of which is to determine
whether playing an off-the-shelf (or custom-designed) computer game can improve educationally
relevant cognitive skills or competencies. In one case, game playing does not appear to produce
positive effects, but in another it does, so in this section I investigate when game playing has
positive consequences.
Figure 4
Screenshots from All You Can ET, an educational computer game aimed at improving students’ executive
function skill in switching between cognitive tasks.
computer games affect which kinds of cognitive skills. Cognitive consequences research has im-
plications for the choice of computer games that have positive impacts on educationally relevant
skills performed outside the game environment. According to the theory of specific transfer of
general skill, the best chance for positive cognitive consequences of game playing occurs when
the cognitive test evaluated outside the game environment taps a skill that was repeatedly exer-
cised within the game in a variety of contexts and at increasing levels of challenge (Anderson &
Bavelier 2011, Mayer 2014, Sims & Mayer 2002, Singley & Anderson 1989). In short, in cognitive
consequences research, we look for computer games that appear to require players to exercise an
important cognitive skill.
least five experiments): brain training games with perceptual attention, brain training games with
spatial cognition, spatial puzzle games with perceptual attention, spatial puzzle games with spatial
cognition, real-time strategy games with perceptual attention, and real-time strategy games with
executive function.
Brain training games such as Lumosity and CogMed contain a suite of mini-games intended to
improve performance on basic cognitive tests such as memory, attention, and spatial cognition.
However, reviews conclude that there is no convincing evidence to show that these kinds of
brain training games are successful (Bainbridge & Mayer 2018, Melby-Lervåg & Hulme 2012,
Simons et al. 2016). One problem may be that these games involve a collection of different games
aimed at different cognitive skills rather than promoting repeated practice on a single, focused
skill.
In contrast, there are several initial studies showing that playing brain training games that are
focused on specific executive function skills (such as switching from one task to another) can have
strong and consistent effects (Anguera et al. 2013, Nouchi et al. 2012, Parong et al. 2018); this
is an area that warrants further study. A potentially fruitful line of research involves examining
the cognitive consequences of playing computer games that are focused on a single cognitive
skill and that require repeated practice of that skill in varying contexts and at increasing levels of
challenge.
Figure 5
Screenshot from Decimal Point, an educational computer game aimed at teaching decimal fractions and
decimal arithmetic.
how to overcome a different possible misconception. By the time players have made their way
through Decimal Point Amusement Park, they have had a lot of practice in solving decimal
problems.
McLaren et al. (2017) found that sixth graders who were asked to play Decimal Point showed
significantly greater improvement on an immediate test (d = 0.4) and a delayed test (d = 0.4)
than did a group that learned from a conventional computer-based tutorial lesson covering the
same problems but in a different format than an arcade game. Importantly, students who played
the game reported much higher levels of enjoyment (d = 0.9), suggesting that they might be more
likely to initiate playing the game on their own and persist with it than students who received the
conventional computer-based tutorial lesson. This is an example of media comparison research
because we compare the learning outcomes of students who learn academic content from a game
to those of students who learn the same content from conventional media.
Table 4  Three disciplines in which playing games may be more effective than conventional instruction

Discipline          Experiments in which the effect is observed    Effect size
Science             12 out of 16 experiments                       0.7
Mathematics         4 out of 6 experiments                         0.5
Second language     4 out of 5 experiments                         1.0
First, the most-studied educational discipline is science, in which learning by playing games
produced higher test scores than learning from conventional lessons in 12 out of 16 experiments,
with an average effect size of 0.7 (Adams et al. 2012, Anderson & Barnett 2011, Barab et al.
2009, Brom et al. 2011, Evans et al. 2008, Hickey et al. 2009, Hwang et al. 2012, Moreno et al.
2001, Parchman et al. 2000, Ricci et al. 1996, Swaak et al. 2004, Wrzesien & Raya 2010). The
conventional lessons included online tutorials, slideshow presentations, printed lessons, and face-
to-face lectures. In each of the four instances in which a conventional medium was more effective
than playing a game, the conventional medium involved computer-based instruction such as a
hypertext, a computer-based slideshow, or a computer-based tutorial. This suggests that caution
is necessary in assuming that computer games are always the most effective form of computer-based
science learning.
Second, in reviewing mathematics studies, games resulted in better learning than conventional
media in four out of six experiments, with an average effect size of 0.5 (Chang et al. 2012, Din
& Calao 2001, McLaren et al. 2017, Papastergiou 2009, Sindre et al. 2009, Van Eck & Dempsey
2002). The control groups received classroom instruction, computer-based lessons, or paper-based
worksheets. The effects tended to be greater for children than for college students.
Third, in four out of five experiments involving learning a second language, students learned
better from games than from traditional media, yielding an average effect size of 1.0 (Liu & Chu
2008, Neri et al. 2008, Segers & Verhoeven 2003, Suh et al. 2010, Yip & Kwan 2006). However,
the control group in each study involved classroom instruction, which may be hard to equate to
game-based learning in terms of content and method.
Insufficient numbers of experiments are available in social studies and English language arts,
although the existing evidence favors games in both disciplines (Mayer 2014).
activity than they are with conventional media. Finally, research is needed to determine how to
incorporate games most effectively into the regular classroom context.
CONCLUSION
In conclusion, this review provides examples of the benefits of applying the science of learning
to education. This review demonstrates the progress being made by value-added research, which
addresses the question of which game features promote learning; cognitive consequences research,
which addresses the question of what is learned by playing games; and media comparison research,
which addresses the question of whether people learn better from games than from conventional
media. The benefit to practice is that psychology can offer education ways to improve the instruc-
tional effectiveness of educational games (Mayer 2016). The benefit to theory is that education
can prompt psychology to enrich cognitive theories of learning to explain a broader set of learning
situations and to incorporate motivational processes (Mayer 2014). I will consider this review a
success to the extent that it stimulates research on game-based learning that is methodologically
sound, theoretically grounded, and educationally relevant.
DISCLOSURE STATEMENT
The author is not aware of any affiliations, memberships, funding, or financial holdings that might
be perceived as affecting the objectivity of this review.
ACKNOWLEDGMENTS
Preparation of this review was supported by grant N0001416112046 from the Office of Naval
Research and grant R305A150417 from the Institute of Education Sciences. This review is partially
based on Mayer (2014).
LITERATURE CITED
Adams DM, Clark DB. 2014. Integrating self-explanation functionality into a complex game environment:
keeping the gaming in motion. Comput. Educ. 73:149–59
Adams DM, Mayer RE, MacNamara A, Koenig A, Wainess R. 2012. Narrative games for learning: testing
the discovery and narrative hypotheses. J. Educ. Psychol. 104:235–49
Adams DM, Pilegard C, Mayer RE. 2016. Evaluating the cognitive consequences of playing Portal for a short
duration. J. Educ. Comput. Res. 54:173–95
Anderson AF, Bavelier D. 2011. Action game play as a tool to enhance perception, attention, and cognition.
In Computer Games and Instruction, ed. S Tobias, JD Fletcher, pp. 307–33. Charlotte, NC: Inf. Age Publ.
Anderson J, Barnett M. 2011. Using video games to support pre-service elementary teachers learning of basic
physics principles. J. Sci. Educ. Technol. 20:347–62
Anguera JA, Boccanfuso J, Rintoul JL, Al-Hashimi O, Faraji F, et al. 2013. Video game training enhances
cognitive control in older adults. Nature 501(7465):97–101
Bainbridge K, Mayer RE. 2018. Shining the light of research on Lumosity. J. Cogn. Enhanc. 2:43–62
Barab SA, Scott B, Siyahhan S, Goldstone R, Ingram-Goble A, et al. 2009. Transformational play as a curricular
scaffold: using videogames to support science education. J. Sci. Educ. Technol. 18:305–20
Bediou B, Adams DM, Mayer RE, Tipton E, Green CS, Bavelier D. 2018. Meta-analysis of action video game
impact on perceptual, attentional, and cognitive skills. Psychol. Bull. 144(1):77–110
Blumberg FC, ed. 2014. Learning by Playing. Oxford, UK: Oxford Univ. Press
Boot WR, Kramer AF, Simons DJ, Fabian M, Gratton G. 2008. The effects of video game playing on attention,
memory, and executive control. Acta Psychol. 129:387–98
Brom C, Preuss M, Klement D. 2011. Are educational computer micro-games engaging and effective for
knowledge acquisition at high schools? A quasi-experimental study. Comput. Educ. 57:1971–88
Cameron B, Dwyer F. 2005. The effect of online gaming, cognition and feedback type in facilitating delayed
achievement of different learning objectives. J. Interact. Learn. Res. 16:243–58
Chang K-E, Wu L-J, Weng S-E, Sung Y-T. 2012. Embedding game-based problem-solving phase into
problem-posing system for mathematics learning. Comput. Educ. 58:775–86
Clark DB, Virk SS, Barnes J, Adams DM. 2016. Self-explanation and digital games: adaptively increasing
abstraction. Comput. Educ. 103:28–43
Clark RE. 2001. Learning from Media. Charlotte, NC: Inf. Age Publ.
Cohen J. 1988. Statistical Power Analysis for the Behavioral Sciences. Mahwah, NJ: Lawrence Erlbaum. 2nd ed.
Cordova DI, Lepper MR. 1996. Intrinsic motivation and the process of learning: beneficial effects of contex-
tualization, personalization, and choice. J. Educ. Psychol. 88:715–30
Cuban L. 1986. Teaching and Machines: The Classroom Use of Technology since 1920. New York: Teach. Coll.
Press
de Jong T, Martin E, Zamarro J, Esquembre F, Swaak J, van Joolingen WR. 1999. The integration of computer
simulation and learning support: an example from the physics domain of collisions. J. Res. Sci. Teach.
36:597–615
Din FS, Calao J. 2001. The effects of playing educational video games on kindergarten achievement. Child
Study J. 31:95–102
Evans KL, Yaron D, Leinhardt G. 2008. Learning stoichiometry: a comparison of text and multimedia formats.
Chem. Educ. Res. Pract. 9:208–18
Feng J, Spence I, Pratt J. 2007. Playing an action video game reduces gender differences in spatial cognition.
Psychol. Sci. 18(10):850–55
Fiorella L, Mayer RE. 2012. Paper-based aids for learning with a computer-based game. J. Educ. Psychol.
104:1074–82
Gee JP. 2007. Good Video Games and Good Learning. New York: Peter Lang
Goldberg B, Cannon-Bowers J. 2015. Feedback source modality effects on training outcomes in a serious
game: pedagogical agents make a difference. Comput. Hum. Behav. 52:1–11
Green CS, Bavelier D. 2003. Action video game modifies visual selective attention. Nature 423:534–38
Green CS, Bavelier D. 2006a. Effects of action video game playing on the spatial distribution of visuospatial
attention. J. Exp. Psychol. Hum. Percept. Perform. 32:1465–78
Green CS, Bavelier D. 2006b. Enumeration versus multiple object tracking: the case of action video game
players. Cognition 101:217–45
Green CS, Bavelier D. 2007. Action-video-game experience alters the spatial resolution of vision. Psychol. Sci.
18:88–94
Hattie J. 2009. Visible Learning. Abingdon, UK: Routledge
Hickey DT, Ingram-Goble AA, Jameson EM. 2009. Designing assessments and assessing designs in virtual
educational environments. J. Sci. Educ. Technol. 18:187–208
Honey MA, Hilton ML, eds. 2011. Learning Science through Computer Games and Simulations. Washington,
DC: Natl. Acad. Press
Hsu CY, Tsai CC. 2013. Examining the effects of combining self-explanation principles with an educational
game on learning science concepts. Interact. Learn. Environ. 21(2):104–15
Hsu CY, Tsai CC, Wang HY. 2016. Exploring the effects of integrating self-explanation into a multi-user
game on the acquisition of scientific concepts. Interact. Learn. Environ. 24(4):844–58
Hwang G-C, Wu P-H, Chen C-C. 2012. An online game approach for improving students’ learning perfor-
mance in web-based problem-solving activities. Comput. Educ. 59:1246–56
Johnson CI, Mayer RE. 2010. Adding the self-explanation principle to multimedia learning in a computer-
based game-like environment. Comput. Hum. Behav. 26:1246–52
Lee C, Chen M. 2009. A computer game as a context for non-routine mathematical problem solving: the
effects of type of question prompt and level of prior knowledge. Comput. Educ. 52:530–42
Leutner D. 1993. Guided discovery learning with computer-based simulation games: effects of adaptive and
non-adaptive instructional support. Learn. Instr. 3:113–32
Li R, Polat U, Makous W, Bavelier D. 2009. Enhancing the contrast sensitivity function through action video
game training. Nat. Neurosci. 12(5):549–51
Liu T, Chu Y. 2008. Using ubiquitous games in an English listening and speaking course: impact on learning
outcomes and motivation. Comput. Educ. 55:630–43
Loftus GR, Loftus EF. 1983. Mind at Play: The Psychology of Video Games. New York: Basic Books
Mayer RE. 2011a. Applying the Science of Learning. Boston: Pearson
Mayer RE. 2011b. Multimedia learning and games. In Computer Games and Instruction, ed. S Tobias,
JD Fletcher, pp. 281–306. Charlotte, NC: Inf. Age Publ.
Mayer RE. 2014. Computer Games for Learning: An Evidence-Based Approach. Cambridge, MA: MIT Press
Mayer RE. 2016. What should be the role of computer games in education? Policy Insights Behav. Brain Sci.
3(1):20–26
Mayer RE, Johnson CI. 2010. Adding instructional features that promote learning in a game-like environment.
J. Educ. Comput. Res. 42:241–65
Mayer RE, Mautone PD, Prothero W. 2002. Pictorial aids for learning by doing in a multimedia geology
simulation game. J. Educ. Psychol. 94:171–85
McGonigal J. 2011. Reality Is Broken: Why Games Make Us Better and How They Can Change the World. New
York: Penguin Press
McLaren BM, Adams D, Mayer R, Forlizzi J. 2017. Decimal point: an educational game that benefits mathe-
matics learning more than a conventional approach. Int. J. Game-Based Learn. 7(1):36–56
Melby-Lervåg M, Hulme C. 2012. Is working memory training effective? A meta-analytic review. Dev. Psychol.
49(2):270–91
Moreno R, Mayer RE. 2000. Engaging students in active learning: the case for personalized multimedia
messages. J. Educ. Psychol. 92:724–33
Moreno R, Mayer RE. 2004. Personalized messages that promote science learning in virtual environments.
J. Educ. Psychol. 96:165–73
Moreno R, Mayer RE. 2005. Role of guidance, reflection, and interactivity in an agent-based multimedia
game. J. Educ. Psychol. 97:117–28
Moreno R, Mayer RE. 2007. Interactive multimodal learning environments. Educ. Psychol. Rev. 19:309–26
Moreno R, Mayer RE, Spires HA, Lester J. 2001. The case for social agency in computer-based teaching: Do
students learn more deeply when they interact with animated pedagogical agents? Cogn. Instr. 19:177–213
Moreno R, Mayer RE. 2002. Learning science in virtual reality environments: role of methods and media.
J. Educ. Psychol. 94:598–610
Nelson RA, Strachan I. 2009. Action and puzzle video games prime different speed/accuracy tradeoffs.
Perception 38(11):1678–87
Neri A, Mich O, Gerosa M, Giuliani D. 2008. The effectiveness of computer assisted training for foreign
language learning by children. Comput. Assist. Lang. Learn. 21:393–408
Nouchi R, Taki Y, Takeuchi H, Hashizume H, Akitsuki Y, et al. 2012. Brain training game improves exec-
utive functions and processing speed in the elderly: a randomized controlled trial. PLOS ONE 7(1):e29676
Okagaki L, Frensch PA. 1994. Effects of video game playing on measures of spatial performance: gender effects
in late adolescence. J. Appl. Dev. Psychol. 15:33–58
O’Neil HF, Chung G, Kerr D, Vendlinski T, Bushchang R, Mayer RE. 2014. Adding self-explanation prompts
to an educational game. Comput. Hum. Behav. 30:23–28
O’Neil HF, Perez RS, eds. 2008. Computer Games and Team and Individual Learning. Amsterdam: Elsevier
Papastergiou M. 2009. Digital game-based learning in high school computer science education: impact on
educational effectiveness and student motivation. Comput. Educ. 52:1–12
Parchman SW, Ellis JA, Christinaz D, Vogel M. 2000. An evaluation of three computer-based instructional
strategies in basic electricity and electronics training. Mil. Psychol. 12:73–87
Parong J, Mayer RE, Fiorella L, MacNamara A, Homer BD, Plass JL. 2018. Learning executive function skills
by playing focused video games. Contemp. Educ. Psychol. 51:141–51
Pellegrino JW, Hilton ML. 2012. Education for Life and Work: Developing Transferable Knowledge and Skills in
the 21st Century. Washington, DC: Natl. Acad. Press
Phye GD, Robinson DH, Levin J, eds. 2005. Empirical Methods for Evaluating Educational Interventions.
Amsterdam: Elsevier
Pilegard C, Mayer RE. 2016. Improving academic learning from computer-based narrative games. Contemp.
Educ. Psychol. 44:12–20
Pilegard C, Mayer RE. 2018. Game over for Tetris as a platform for cognitive skill training. Contemp. Educ.
Psychol. 54:29–41
Powers KL, Brooks PJ. 2014. Evaluating the specificity effects of video game training. In Learning by Playing:
Frontiers of Video Gaming in Education, ed. F Blumberg, pp. 302–29. Oxford, UK: Oxford Univ. Press
Prensky M. 2006. Don’t Bother Me Mom—I’m Learning. St. Paul, MN: Paragon House
Ricci KE, Salas E, Cannon-Bowers JA. 1996. Do computer-based games facilitate knowledge acquisition and
retention? Mil. Psychol. 8:295–307
Saettler P. 2004. The Evolution of American Educational Technology. Charlotte, NC: Inf. Age Publ.
Segers E, Verhoeven L. 2003. Effects of vocabulary training by computer in kindergarten. J. Comput. Assist.
Learn. 19:557–66
Serge SR, Priest HA, Durlach PJ, Johnson CI. 2013. The effects of static and adaptive performance feedback
in game-based training. Comput. Hum. Behav. 29(3):1150–58
Shavelson RJ, Towne L, eds. 2002. Scientific Research in Education. Washington, DC: Natl. Acad. Press
Simons DJ, Boot WR, Charness N, Gathercole SE, Chabris CF, et al. 2016. Do brain training programs
work? Psychol. Sci. Public Interest 17(3):103–86
Sims VK, Mayer RE. 2002. Domain specificity of spatial expertise: the case of video game players. Appl. Cogn.
Psychol. 16:97–115
Sindre G, Natvig L, Jahre M. 2009. Experimental validation of the learning effect for a pedagogical game on
computer fundamentals. IEEE Trans. Educ. 52:10–18
Singley MK, Anderson JR. 1989. The Transfer of Cognitive Skill. Cambridge, MA: Harvard Univ. Press
Suh S, Kim SW, Kim NJ. 2010. Effectiveness of MMORPG-based instruction in elementary English education
in Korea. J. Comput. Assist. Learn. 26:370–78
Swaak J, de Jong T, van Joolingen WR. 2004. The effects of discovery learning and expository instruction on
the acquisition of definitional and intuitive knowledge. J. Comput. Assist. Learn. 20:225–34
Swaak J, van Joolingen WR, de Jong T. 1998. Supporting simulation-based learning: the effects of model
progression and assignments on definitional and intuitive knowledge. Learn. Instr. 8:235–52
ter Vrugte J, de Jong T, Wouters P, Vandercruysse S, Elen J, van Oostendorp H. 2015. When a game supports
prevocational math education but integrated reflection does not. J. Comput. Assist. Learn. 31(5):462–80
Tobias S, Fletcher JD, eds. 2011. Computer Games and Instruction. Charlotte, NC: Inf. Age Publ.
Van Eck R, Dempsey J. 2002. The effect of competition and contextualized advisement on the transfer of
mathematics skills in a computer-based instructional simulation game. Educ. Technol. Res. Dev. 50:23–41
Vandercruysse S, ter Vrugte J, de Jong T, Wouters P, van Oostendorp H, et al. 2016. The effectiveness of a
math game: the impact of integrating conceptual clarification as support. Comput. Hum. Behav. 64:21–33
Wang N, Johnson WL, Mayer RE, Rizzo P, Shaw E, Collins H. 2008. The politeness effect: pedagogical
agents and learning outcomes. Int. J. Hum. Comput. Stud. 66:98–112
Wang P, Liu HH, Zhu XT, Meng T, Li HJ, Zuo XN. 2017. Action video game training for healthy adults: a
meta-analytic study. Front. Psychol. 7:907
Wentzel KR, Miele DB, eds. 2016. Handbook of Motivation at School. Abingdon, UK: Routledge. 2nd ed.
Wouters P, van Oostendorp H. 2017. Overview of instructional techniques to facilitate learning and motivation
of serious games. In Instructional Techniques to Facilitate Learning and Motivation of Serious Games, ed.
P Wouters, H van Oostendorp, pp. 1–16. Berlin: Springer
Wrzesien M, Raya MA. 2010. Learning in serious virtual worlds: evaluation of learning effectiveness and
appeal to students in the E-Junior project. Comput. Educ. 55:178–87
Wu S, Cheng CK, Feng J, D’Angelo L, Alain C, Spence I. 2012. Playing a first-person shooter video game
induces neuroplastic change. J. Cogn. Neurosci. 24(6):1286–93
Yip FWM, Kwan ACM. 2006. Online vocabulary games as a tool for teaching and learning English vocabulary.
Educ. Media Int. 43:233–49