COLLEGE TEACHING
2020, VOL. 68, NO. 3, 150–160
https://doi.org/10.1080/87567555.2020.1768357

The Room Itself is Not Enough: Student Engagement in Active Learning Classrooms

Kelsey J. Metzger (University of Minnesota Rochester) and David Langley (University of Minnesota Twin Cities)
ABSTRACT
The primary purpose of this paper is to describe the variety of active engagements that characterize student behaviors in active learning classrooms (ALCs) across an undergraduate degree program. The number of different engagement types observed during a single class meeting varied between two and eight across 23 different courses. Three forms of engagement accounted for nearly 75% of all observed time, regardless of subgroup (e.g., consistent in STEM vs. non-STEM courses and in lower division vs. upper division courses). In addition, unique patterns of student engagement characterized the pedagogical "signature" of a given course. We conclude that intensive class observation focused on student engagement not only has value for comprehensive undergraduate program review but also serves as a lever that invites faculty reflection on pedagogical practice toward course improvement.

KEYWORDS
Curriculum; educational environments; pedagogy; student behavior; teaching methods
Introduction
Student engagement has occupied the attention of
researchers in higher education for decades. Chickering
and Gamson's (1987) well-known "seven principles for
good practice in undergraduate education” contained
classroom behaviors that hinged on student engagement.
Pascarella and Terenzini’s (2005) exhaustive review of
the literature concluded that the level of student involvement in academic work directly influenced knowledge
acquisition and general cognitive development. Two
groundbreaking surveys–the National Survey of Student
Engagement (NSSE) and the Student Experience in the
Research University (SERU)–reaffirmed student engagement as a key proxy for student learning. Based on the
NSSE results, George Kuh and colleagues (Kuh 2008;
Kuh, O’Donnell, and Schneider 2017) identified High
Impact Practices (HIPs) as key levers for student success. These practices – e.g., research with faculty, internships, and culminating senior experiences – induce high
levels of student engagement and are known to have
especially broad and significant positive impacts on student learning (Kilgo, Ezell Sheets, and Pascarella 2015).
Both research and the classroom experience of teachers
leave little doubt that student involvement in the subject
matter is the “grand meta principle” for student learning
(Cross 1993). Fink (2003) views active engagement as an
umbrella term that has three components: a) gathering
information and ideas, b) experiences that include direct
involvement (doing) or indirect involvement (observing,
listening), and c) reflecting on what is being learned
and how one learns. Fink’s definition implies that student engagement can be directed by intentional instructional strategies.
In the classroom, “active learning” is a phrase used
to indicate instructional strategies that introduce student activity into traditional lecture courses and promote student engagement (Prince 2004; Bonwell and Eison 1991). Extensive literature has demonstrated the
significant positive impact on student learning and
retention that results from the implementation of active
learning pedagogies in the undergraduate classroom
(cf. Freeman et al. 2014). Active learning strategies have enabled students to outperform the performance predicted from previous performance measures (Cotner et al. 2013) and have decreased achievement gaps between URM students and majority students (Eddy and Hogan 2014). Active
learning strategies are associated with changes in student behaviors that correspond with increased academic success in the course (e.g., distributed studying),
and attitudinal shifts in which students in active learning classes perceive their class as a community (Eddy
and Hogan 2014). Moreover, active learning strategies
University of Minnesota Rochester, Center for Learning Innovation, 111 South Broadway,
COLLEGE TEACHING
enable students to interact closely and in meaningful
ways with faculty members in the classroom, increasing
the perception of instructor accessibility and promoting
relationship-building (Metzger 2015).
The enactment of these components by instructors is
not confined to a particular learning environment;
indeed, as Kuh, O’Donnell, and Schneider (2017) have
emphasized, HIPs are found both inside and outside the
classroom. It seems reasonable to assume, however, that
certain environments can be structured to leverage the
advantages of student engagement. We turn to one of
those environments below – the active learning classroom – and examine how its affordances have been
exploited to support active engagement by students in a
multidisciplinary undergraduate degree program.
Active learning classrooms as sites for leveraging
student engagement
Active learning classrooms (ALCs) are learning spaces
intentionally designed to promote student-centered,
collaborative, and technology-enhanced instruction
and learning (Beichner 2014). These spaces have a few
typical features:
… round or curved tables with moveable seating that
allow students to face each other and thus support
small-group work. The tables are often paired with
their own whiteboards for brainstorming and
diagramming. Many tables are linked to large LCD
displays so students can project their computer screens
to the group, and the instructor can choose a table’s
work to share with the entire class. Wireless Internet
plays an important role in retrieving resources and
linking to content management systems, and depending
upon the size of the room, table microphones can be
critical so that every student’s voice can be broadcast
across the room. (Baepler et al. 2016, 10)
Numerous studies, with many focusing on students
in STEM disciplines, have demonstrated significant,
positive student outcomes in ALCs, operationalized as
meaningful student interactions, decreased failure
rates, and increased student satisfaction (Gaffney et al.
2008; Whiteside, Brooks, and Walker 2010). In particular, ALCs can affect the ways in which faculty and
students relate to one another; i.e., the social context
of the classroom can undergo beneficial change compared to traditional learning spaces (Walker and
Baepler 2017). Significant relative learning gains have
also been demonstrated when using active learning
strategies in low-technology SCALE-UP mock-up
classrooms (i.e., low-tech ALC design with moveable
rectangular tables and portable whiteboards) (Soneral
and Wyse 2017) and when active learning strategies are
implemented in traditional, non-ALC settings (Stoltzfus
and Libarkin 2016; Soneral and Wyse 2017). The use
of interactive learning strategies has also been found to
decrease or eliminate the achievement gap between
male and female students in undergraduate physics
(Lorenzo, Crouch, and Mazur 2006). Although the current level of deployment of ALCs in higher education
is considered “experimental,” ALCs are predicted to
rise in pervasiveness to the level of “mainstream”
deployment by 2022 and were determined to be the
top strategic technology in 2017 by the EDUCAUSE
Center for Analysis and Research (2017).
Despite demonstrated significant, positive student
outcomes in ALCs, it does not directly follow that
increasing the number of ALC spaces in higher education will lead to the seamless adoption of instructional
strategies that promote more or deeper student engagement. As Daniel Wilson (2015) has observed, “space is
simply a context for human behavior.” Affordances
such as round tables and whiteboards merely present
opportunities for instructional design; learner-centered
teaching and subsequent student engagement are intentional acts. The physical architecture of ALCs is not a
causative mechanism that directly and invariantly
results in improved student learning (Brooks 2012). It
may be better to view the architecture and affordances
of ALCs as a potential “nudge” (Thaler and Sunstein
2009) for decision-making, i.e., encouraging faculty
action but ultimately leaving the choice to individual
faculty. A recent analysis of observations from over
2000 classes taught by more than 500 STEM faculty
across 25 institutions concluded that “ … simply providing infrastructure or small class size does not necessarily change instructional practices, as about half of
the classes with flexible seating and about half of the
small- and medium-size courses were classified as
didactic.” (Stains et al. 2018, 1468). While architecture
and affordances provide a nudge, the value of student-centered classrooms such as ALCs hinges on the
alignment between a teacher’s epistemology and the
enactment of student-centered pedagogies (Lasry,
Charles, and Whittaker 2014). As Stoltzfus and Libarkin
(2016) conclude, “ … while SCALE-UP-type classrooms
may facilitate implementation of active learning, it is
the active learning and not the SCALE-UP infrastructure that enhances student performance” (1).
Student engagement and the University of
Minnesota Rochester
The vision for the University of Minnesota Rochester since its inception in 2006 has been to provide a personalized, integrated,
technology-enhanced, and evidence-based curriculum
in health sciences. The University of Minnesota
Rochester (UMR) offers two undergraduate degrees: a
Bachelor of Science in Health Sciences (BSHS), and a
junior-admitting Bachelor of Science in Health
Professions (BSHP), which is delivered in partnership
with the Mayo Clinic School of Health Sciences. The
design and implementation of the undergraduate
health sciences curriculum was vested in applying
innovative and emerging best practices in support of
student learning and development.
The academic home of the BSHS degree is a multidisciplinary unit, the Center for Learning Innovation
(CLI), which includes faculty from Biochemistry,
Biology, Chemistry, History, Literature, Mathematics,
Philosophy, Physics, Psychology, Public Health,
Sociology, Spanish, and Writing. At the time of writing,
this unit comprises 12 tenured/tenure-track faculty and
21 lecturers/teaching specialists whose primary academic
experiences, training, and research were in non-education
disciplines (e.g., biology, sociology, literature, philosophy). All faculty participate in teaching and support a
“high contact” faculty model to assist students both in
and outside of the classroom. Tenured/tenure-track faculty are primarily responsible for continuous and longitudinal assessment of student learning, which is often
collaborative and focuses on the realization of course,
curriculum, and/or institutional level learning and developmental outcomes. In position recruitment materials,
student-centered teaching is an explicitly stated ideal and
expectation for all faculty in the unit. For many faculty,
however, their UMR appointment is their first full-time
teaching opportunity in higher education following graduate school or professional training.
In the Spring of 2017, UMR engaged in an extensive
academic program review with a specific emphasis on
pedagogy and curriculum. One part of the review
involved a systematic program of observation of student
engagement in courses across the BSHS curriculum. An
important consideration of these observations was that
all courses in the BSHS degree program are taught in
ALCs (Supplementary document 1: Example ALC layout
at UMR). This physical and curricular context provided
an extraordinary opportunity to examine a program-wide curriculum built on the assumption that instructional strategies promoting active learning engagement enable significant learning and development of students.
Data collection
Data acquisition on student performance and other
attributes is not only common at UMR but also an expectation students accept upon enrollment. From its initiation as a campus, UMR has been committed to continuous investigation of student learning across all
courses and activities in the BSHS degree program.
While the current classroom observation project was
aligned with this institutional culture and program-wide approach to interrogating variables that contribute
to student learning, the goals of the classroom observation project were not primarily research-driven.
Instead, the initial and primary goals were to a)
acknowledge and honor both creative and routine
work that signified student engagement in undergraduate coursework, b) provide feedback to faculty
on the type and range of student engagement to
improve pedagogy, and c) gain an initial perspective
on the varied and common forms of engagement
across the full degree program. To fulfill these goals, our
focus was to identify behaviors on a macro/categorical
level (e.g., discussion, problem solving) rather than on
specific classroom activities (e.g., the use of jigsaws,
brainstorming on a whiteboard).
A number of structured instruments have been
developed to facilitate classroom observations of student engagement and to determine the extent to
which faculty use student-centered approaches, e.g., the Classroom Observation Protocol for Undergraduate STEM (COPUS) (Smith et al. 2013) and the Teaching Dimensions Observation Protocol (TDOP) (Hora, Oleson, and Ferrare 2012). Both offered a finer grain
of analysis than needed for meeting the essential goals
of the current project, which were formative and
descriptive rather than quantitative and evaluative.
A broad literature base was consulted to derive different forms of classroom engagement that might be
present in university courses. For example, the protocol
for COPUS lists student behaviors such as listening,
problem solving, discussing, and others as codes for student engagement. Svinicki and McKeachie (2014) list
reading as another form of active engagement. Fink
(2003) includes activities such as observing and reflecting under the banner of active learning, but the identification of “doing” activities in authentic settings opens
the door to many additional categories such as performing (e.g., music, dance, theater), creating/constructing artifacts (e.g., studio arts), and designing or
planning (e.g., architecture, clothing design). A full list
of the ten possible student engagement formats used in
our observational protocol is shown in Table 1.
Not all of these formats were expected to be present
in observations at UMR, given its focus on the health
sciences and the limited number of degree programs.
Nevertheless, most forms are well known to instructors and include easily observable behaviors, as well as behaviors that require more inference about their connection to engagement. Clear definitions were generated, along with broad examples that signified likely behavior in each domain of engagement (Table 1).
Table 1. Categories of engagement formats included in observation protocol.

Creating/Constructing: building an artifact or product related to professional practice. Examples: (1) constructing a physical model of a virus in microbiology lab; (2) stitching together a new garment in an apparel design course; (3) completing a project in the studio arts (e.g., painting, sculpture, drawing, ceramics).

Designing/Planning: preparing a scheme or approach to address a problem. Examples: (1) designing a mini-experiment to test a theory in abnormal psychology; (2) using AutoCad to design a new structure in an architecture course; (3) designing a prosthetic limb (digitally) prior to its fabrication in a biomedical engineering course.

Discussing: verbal or digital dialoguing with one or more individuals. Examples: (1) talking with peers in a think/pair/share exercise in a literature course; (2) small group discussion in a philosophy course on the meaning of a text; (3) practicing language skills with peers in a Spanish course.

Investigating/Problem Solving: conducting a detailed inquiry or exploration through a systematic process. Examples: (1) combining two reactants to determine the reaction rate in a general chemistry lab; (2) locating information via a web search in a public health course; (3) using interviewing to conduct a qualitative study in a sociology class; (4) diagnosing the severity of a contusion in a clinical nursing course.

Listening/Processing: attentional focus on auditory information to gain knowledge or understanding. Examples: (1) listening to a math instructor explain the sequence for solving an equation; (2) listening to a lecture in a history course; (3) listening to a procedure being outlined by the instructor in an immunology class.

Observing: attentional focus on visual information directed at an object or process. Examples: (1) watching a physics instructor solve an equation on the whiteboard; (2) watching a video on children's play in a child development course; (3) observing an instructor demonstrate a new routine in a modern dance class.

Performing/Demonstrating/Presenting: goal-oriented action aimed at achieving a task within an applied setting. Examples: (1) physical movement in a dance course or other kinesiology activity class (e.g., tennis, basketball); (2) instrumental performance in a music course (e.g., piano, violin, trumpet); (3) acting in a theater class; (4) carrying out a public speaking assignment in a speech class; (5) practicing the delivery of anesthesia on a simulator in dentistry.

Reading/Studying: information acquisition by directing attention to text-based content. Examples: (1) reading a text (book, article) to acquire information in a gender and sexuality course; (2) reading or studying procedures for carrying out a biology lab assignment.

Reflecting: active, persistent, and careful consideration of any belief or form of knowledge in light of the grounds that support it (Dewey 1933). Examples: (1) considering your beliefs about an ethical dilemma prior to discussion in an ethics of medicine course; (2) thinking about how your MBTI score has been influencing group dynamics in your psychology lab section; (3) contemplating how your choice of college major has been affected by your family of origin.

Writing (diagramming, drawing, keyboarding): use of written or digital methods to analyze, transcribe, or transform information. Examples: (1) drawing a figure or table on a whiteboard; (2) writing out the steps to solve a mathematical problem; (3) prose writing in completing a report or as an exercise in a writing studio course; (4) taking notes on a lecture in a communications course.
All classroom observations were conducted by an
educational program specialist employed by the
University of Minnesota Center for Educational
Innovation (CEI). Classroom observations were scheduled with individual instructors of courses. An early
observation took place within the first 7–8 weeks of
the semester and was followed by a second observation during the latter half of the semester. Two observations were completed for each class involved in the
study, with a single exception in which only one early observation took place. In some instances, the early and late observation sessions were taught by the same faculty member. In other instances
in which a course was taught by a team of instructors,
a different faculty member taught (or acted as lead instructor) during the early and the late observation sessions.
An observational protocol was designed to help
systematize the data collection. The observer was
seated in the classroom in a position to view the
majority of students in the class. At 15-second intervals, the observer noted the primary engagement behavior (the central focus of student behavior, following the intent of the instructor) of half or more of the students at that time, and any secondary behavior simultaneously occurring (the "means" to the "end," the end being the
primary engagement). For example, if an instructor
prompted students to spend time on a practice problem, “problem solving” is the primary engagement
behavior; to accomplish that, students might discuss
with one another, write on a board, observe others
writing on a board, etc., which would be the secondary engagement behaviors.
At the conclusion of a given 15-second interval, a
“1” was entered into an Excel spreadsheet in the
appropriate column that designated the primary form
of engagement. This procedure continued until the
end of a 50-minute or 75-minute class. When the class
session concluded, the amount of time in a primary
engagement behavior was summed, and the overall
percent of course time spent in these formats was
compiled into a pie chart.
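The arithmetic behind this summary is simple enough to sketch in code. The Python fragment below is a minimal illustration of the interval tally and percentage computation (not the spreadsheet actually used in the project); the 15-second interval and the category labels come from the protocol above, while the function and variable names are purely illustrative.

```python
from collections import Counter

INTERVAL_SECONDS = 15  # one tally per 15-second interval, as in the protocol

def summarize_session(primary_codes):
    """Turn a sequence of per-interval primary engagement codes into
    minutes and percent of class time spent in each engagement format."""
    tallies = Counter(primary_codes)
    total_intervals = sum(tallies.values())
    return {
        code: {
            "minutes": count * INTERVAL_SECONDS / 60,
            "percent": 100 * count / total_intervals,
        }
        for code, count in tallies.most_common()
    }

# Hypothetical 50-minute session: 200 intervals x 15 seconds.
session = (["Listening/Processing"] * 90
           + ["Discussing"] * 60
           + ["Investigating/Problem Solving"] * 50)
for code, stats in summarize_session(session).items():
    print(f"{code}: {stats['minutes']:.1f} min ({stats['percent']:.0f}%)")
```

Run on the hypothetical session above, this prints 22.5 min (45%) of Listening/Processing, 15.0 min (30%) of Discussing, and 12.5 min (25%) of Investigating/Problem Solving, which is the same percent-of-class-time breakdown the observer compiled into pie charts.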
Finally, the observer engaged in debriefing dialog
with the instructor(s) and occasionally elicited the
instructor’s estimate of time spent in engagement formats prior to revealing the observational data and pie
chart summary. Debrief dialogues with instructors
varied in length and content, but were never used to
assess the quality or efficacy of the teaching and learning activities in the class session, or to question the
decision of the instructor to design the session as they
had. Rather, debrief dialogues were used only to
prompt reflection from the instructor, e.g., "Is this kind of engagement what you want?" Notes on the
responses of instructors were taken and used to help
construct the qualitative portion of the current paper.
In the current project, we focus on the primary behaviors for quantitative analysis, with secondary behaviors providing additional context for debriefing discussions with faculty.
We stress again that the formative and descriptive
goals of the project remained foremost. Nevertheless,
we adopted particular systematic practices for this
project, consistent with the intent of research instruments. Clear definitions and examples of engagement,
a systematic recording interval, and the ability to summarize the observations in a meaningful way across
all courses remained essential features of the observation endeavor.
Results
In total, 45 observations across 23 different courses
were conducted, including 28 observations of 14 lower
division courses (1000–2000 course number designators) and 17 observations of nine upper division
courses (3000–4000 course number designators).
Examined from the standpoint of distinguishing
STEM or non-STEM courses, the observations can be
described as 21 observations in 11 STEM courses and
24 observations in 12 non-STEM courses. An observation was conducted for each course in the first half of the semester, with a second observation of the same course in the second half of the semester (with the single exception noted above). Of the ten
possible engagement behaviors included in the observation protocol, eight were observed, while two –
Creating/Constructing and Performing/Presenting –
were never observed. The number of different engagements observed during any one class meeting ranged
from two to eight, with an average number of engagements across all courses of 4.9. This finding was generally consistent across upper division and lower
division courses (average 4.6 in lower division, 5.3 in
upper division courses) and across STEM and non-STEM courses (4.6 in STEM courses, 4.3 in non-STEM courses). The engagement profiles resulting
from the early and late session observations were occasionally consistent (Figure 1(a)). However, variability in student engagement profiles was generally observed between the early and late semester observations (Figure 1(b)), between early and late observations of two different instructors in the same class (Figure 1(c)), and certainly between different courses.

Figure 1. Engagement behavior from selected courses. a) Pie charts comparing very similar profiles from the same instructor in the early and late observation periods. b) Pie charts contrasting very different profiles from the same instructor in the early and late observation periods. c) Pie charts demonstrating very different profiles for Instructor A and Instructor B across the early and late observation periods of the same course.

Figure 2. Summary of top engagement formats observed across all courses and observation sessions: the mean percent of time spent in the top five student engagements observed across lower division courses (n = 14, 28 observations), upper division courses (n = 9, 17 observations), STEM courses (n = 11, 21 observations), non-STEM courses (n = 12, 24 observations), and overall (n = 23, 45 observations).
Despite the variability in engagements observed,
three engagement behaviors - Listening/Processing,
Discussing, and Problem Solving - accounted for
approximately 74% of all student time observed.
Across all courses, the most frequently and extensively observed student engagement behavior was
Listening/Processing, which comprised, on average,
36% of all student time, with a range during single
class meetings from 0% to 79%. Listening/Processing
represented seven of the top ten highest percentages
of engagement across all courses, and was the highest percentage of engagement time in 24 of 45 class
meetings observed. A summary of the top five student engagement formats observed across the
courses and class meetings observed is displayed in
Figure 2.
Table 2. Most frequently observed student engagements.

                  Lower Division   Upper Division   STEM    Non-STEM   Overall
Listening         27/28            17/17            20/21   24/24      44/45
Discussion        27/28            17/17            20/21   24/24      44/45
Reading           20/28            16/17            14/21   22/24      36/45
Writing/Drawing   17/28            14/17            16/21   15/24      31/45
Observing         18/28            12/17            16/21   14/24      30/45

Note: cells show the number of class observations in which each engagement format was observed, out of the total number of classroom observations.
These general trends in amount and types of student
behaviors were consistent across both STEM and non-STEM courses, as well as across both upper division
and lower division courses (Table 2). The finding that
engagement profiles are similar across upper and
lower division courses is consistent with the results of
a recent large-scale investigation characterizing student engagement in undergraduate STEM that found
no significant differences in instructional styles used across different course levels (i.e., 1000 level courses vs.
4000 level courses), concluding that similar
engagements are used across the curriculum (Stains
et al. 2018). Stains et al. describe three groups of classroom instruction/engagement profiles based on
COPUS data using analyses of instructor and student
behavior, categorized into three different profile types:
didactic instruction, in which lecture comprises 80% or
more of class time; interactive lecture, in which instructors supplement lecture with more student-centered
activities such as group activities and clicker questions
with group work; and student-centered instruction, in
which “instructors incorporated student-centered strategies into large portions of their classes” (Stains et al.
2018, 1469). When these three profile types were examined for differences across upper and lower division
courses, no significant difference was found by Stains et al. (χ²(8, N = 1927) = 11.0, p = 0.20). Taken
together, the findings of the current project coupled
with the results of Stains et al. indicate that course level
is not predictive of the type of student engagement
observed. This conclusion is in contrast with the perception of many faculty, who predicted that the student
engagement profiles of upper division courses would be
markedly different from lower division courses: at the
outset of our study, multiple faculty members predicted
that lower division courses would be characterized by
more frequent listening/processing, while upper division
courses would be characterized by more student-centered engagements such as discussion and problem solving.
Although Listening/Processing was the most consistently and frequently observed student engagement
across the courses in our study, the majority of student
engagement observed (>60% of the total time) was in
other engagement types. Using the profile benchmarks described above, the overall nature of the curriculum as a whole could be described as student-centered, with individual class meetings varying along the spectrum from
didactic to highly student-centered. Our finding that
Listening and Discussion are central and predominant
behaviors in ALCs and other classroom settings
employing student-centered approaches, although seemingly paradoxical, is consistent with observational data
reported elsewhere (Cotner et al. 2013; Smith et al.
2013; Lane and Harris 2015; Stains et al. 2018). Given
the prevalence and importance of communication
between faculty and students, the classroom setting–a
highly verbal environment–places demands on instructors and students to listen well and speak informatively.
Classroom observation is a tool to present a mirror to
faculty regarding student behavior and invite faculty
reflection: “Are these forms of engagement helping students master the content and develop as individuals?”
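To make these benchmarks concrete, the sketch below shows one way a session's observed time profile could be mapped onto the Stains et al. (2018) profile types. Only the 80%-of-class-time lecture cutoff for the "didactic" label comes from that paper; the 50% cutoff below is an assumed placeholder, since the source draws the boundary between interactive lecture and student-centered instruction qualitatively, and all names in the sketch are illustrative.

```python
# Illustrative mapping from a session's engagement-time percentages to the
# COPUS-derived profile labels of Stains et al. (2018). Only the 80% lecture
# cutoff is from that source; STUDENT_CENTERED_CUTOFF is an assumption.
STUDENT_CENTERED_CUTOFF = 50  # assumed placeholder, not from Stains et al.

def classify_session(percent_by_format):
    """Label a session from its percent-of-time engagement profile."""
    lecture_like = percent_by_format.get("Listening/Processing", 0)
    if lecture_like >= 80:
        return "didactic"          # lecture is 80% or more of class time
    if 100 - lecture_like >= STUDENT_CENTERED_CUTOFF:
        return "student-centered"  # large portions are student-centered
    return "interactive lecture"   # lecture supplemented with activities

# Example using this study's average listening share (36%); the other
# percentages are illustrative.
print(classify_session({"Listening/Processing": 36, "Discussing": 22,
                        "Investigating/Problem Solving": 16}))
# -> "student-centered"
```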
Feedback and discussion between professional
developer and faculty
Providing feedback to faculty regarding the nature of
student engagement in their class sessions was a key
goal of the project. Despite our focus on descriptively
characterizing student engagement across the curriculum, the responses from faculty regarding the observation data from their class meetings provide insight
into how such data can be a powerful tool to elicit
reflection on pedagogical choices and inform such
decisions in the future. Such reflection can be elicited
on an individual instructor/course basis or on a larger
scale to encourage reflection about a course sequence
or the totality of a degree program. In the context of
our project, feedback sessions between the professional developer conducting the classroom observation
and the individual faculty member or team of faculty
members for the course illuminated consistent themes.
In particular, faculty responses to the observation data
could be examined relative to 1) intentions for eliciting student engagement for the session and 2) faculty
perceptions of the student engagement actually elicited
during the session. An additional but less prevalent
theme was a misalignment between faculty's desired student engagement and the type of engagement they felt able to plan and implement.
When presented with the summary of student
engagement data following a class meeting (e.g.,
breakdown of the percent of time spent in categories
represented in a pie chart figure), instructors had
varying initial responses. In some cases, faculty perceptions of how students were engaged in their class –
and the relative time spent in those forms of engagement – closely matched the observational data.
Typically, these faculty had highly structured plans for
the class session and connected a clear pedagogical
rationale with the engagement strategies chosen.
These faculty cited the high value they place on face-to-face class time with students and a desire to make the most of the limited time together through intentionally planned engagement formats.
In other cases, faculty perceptions of the engagement profile of the class session did not align with the
observational data. A focus on the “what” – content –
rather than the “how” – engagement – influenced
instructor choices. In addition, faculty cited limitations on their ability to facilitate particular classroom engagement strategies, such as a lack of student motivation to complete the pre-class preparation (i.e., in a 'flipped' classroom approach) that would enable more student-centered engagement strategies in class.
At times the faculty member’s intentions for which
types of student engagement behaviors would be elicited during a particular class session did not align
with what occurred in reality. In discussion of why a
disconnect between intentions and the observational
data may occur, faculty described perceiving a particular classroom activity as highly engaging (e.g., oral
presentations), but realizing in retrospect (prompted
by the observation data) that only a fraction of the
students were actually demonstrating that particular
highly engaging behavior at any given moment while
the majority of their classmates were listening. In
these cases, viewing the observational data of actual engagement helped faculty revisit their intentions and reconsider whether such classroom practices supported their intended learning goals.
Another reason cited for a mismatch between
instructor intentions and the observational data was
simply mismanaged time - faculty described an intention to limit the amount of lecture to prioritize time
spent in group problem solving, but ended up spending
more time lecturing than anticipated, which shortened
the amount of time for other activities. In cases where
observational data did not match faculty intentions in
spite of having a highly structured plan of engagement,
faculty attributed the discrepancy to the way(s) in which
students responded during the session. For example,
student misconceptions or questions that arise during a class meeting may require more time than anticipated, during which the instructor provides more didactic instruction to clarify the student question. Since a hallmark of student-centered instruction is responding to
students in real time, adapting to the needs of the students informed by evidence gathered on the fly can be
an attribute of effective, just-in-time instruction rather
than a shortcoming of not adhering to an a priori plan.
For circumstances in which faculty were involved
in co-teaching, the observational data provided evidence that the teaching team could use to discuss
their intentions for student engagement, the reality of
what was being observed in particular sessions, and
how they could use the information to more effectively engage in co-planning. This was particularly
important for teaching teams in which the individuals comprising the team tended toward very different engagement profiles. While a diversity of engagement strategies was universally seen as positive, faculty also
expressed the desire to provide a consistent message
to students about the student-centered nature of the
course(s) that would be reflected in their student
engagement strategies.
Discussion
This classroom observation project, like others, is well situated within a larger framework of change
in the paradigm of higher education from a place that
delivers instruction to a place that produces learning
(Barr and Tagg 1995). Contrary to widely held perceptions that university faculty engage in rigid patterns of
largely instructor-centered teaching behaviors that do
not change over time, recent work has demonstrated
that changes in pedagogy are pervasive (Beyer, Taylor,
and Gillmore 2013, 9). Moreover, the pedagogical
changes that faculty make are most often the result of
what faculty members observe from their students’
behaviors or performance in their courses, rather than
from outside directives or mandates (Beyer, Taylor,
and Gillmore 2013, 10). In other words, many faculty
in higher education are in the process of shifting from
teaching using instructor-centered approaches to
teaching in ways that are responsive to their students
(i.e., student-centered approaches) as a result of
insight gleaned from what students do.
Thus, classroom observation data with a focus on
student engagement can be a useful, non-threatening
lever to promote faculty reflection on course design
and how teaching practices invite, promote, facilitate,
and sustain student behavior. Using systematic observations of student classroom engagement can provide
units or programs with evidence to inform and shape a
teaching culture focused on student behaviors, in which instructors can reflect thoughtfully on what is occurring and on specific changes to pedagogy that support the unit's student learning and development goals.
Given that the observations in this study took place
exclusively in ALC contexts, we can also infer some
general conclusions about the impact of classroom
environment on teaching and learning practices. The
variability of student engagement behaviors observed
within and between courses across the BSHS curriculum demonstrates that ALC environments do not result in consistent behavioral outcomes, but rather that such spaces can invite and facilitate a variety of
teaching and learning behaviors. From our results, it
is apparent that different instructors can use the same
physical spaces and affordances in very different ways;
the instructional approaches chosen can in turn
impact the effectiveness of the instruction in such
spaces (Lasry, Charles, and Whittaker 2014).
It has been convincingly and repeatedly demonstrated, however, that actively engaging students in
well-designed learning experiences can lead to significant positive learning and development (cf. Freeman
et al. 2014). Thus, the task facing educators in higher
education is not whether to pursue student-centered pedagogies – regardless of specific classroom
environment affordances – but rather to determine
which student-centered approaches are the most
effective for a given context (e.g., for a particular disciplinary learning objective, distinct population of
learners, etc.). Faculty teaching in ALCs, or striving to
incorporate more student-centered pedagogies in traditional learning spaces, need iterative support in the
form of initial training, practice, feedback, and reflection to effectively facilitate student-centered
learning experiences. While ALCs invite many kinds
of student-centered engagement simply through their
physical design and technological amenities, the room
itself is not enough (Baepler et al. 2016, 71).
Institutions that wish to promote student-centered
classroom engagement can communicate institutional
support for such high impact practices by investing
financially in ALCs or by retrofitting traditional classroom spaces as mock-ALCs (Soneral and Wyse 2017),
and by incorporating an explicit focus on effective student-centered teaching in evaluation procedures. To best
facilitate effective use of ALCs and student-centered
approaches, faculty development and support for
reformed teaching should be provided systemically.
Institutions or units in the dreaming or planning stages
for acquiring active learning classrooms would do well to consider how they can support faculty to promote effective student engagement and learning gains.
Disclosure statement
No potential conflict of interest was reported by the authors.
Funding
This work was supported by the University of Minnesota Rochester.
ORCID
Kelsey J. Metzger: http://orcid.org/0000-0001-6927-7786
References

Baepler, P., J. D. Walker, D. C. Brooks, K. Saichaie, and C. Petersen. 2016. A Guide to Teaching in the Active Learning Classroom: History, Research, and Practice. Sterling, VA: Stylus.

Barr, R. B., and J. T. Tagg. 1995. "From Teaching to Learning: A New Paradigm for Undergraduate Education." Change: The Magazine of Higher Learning 27 (6):12–26. doi: 10.1080/00091383.1995.10544672.

Beichner, R. 2014. "History and Evolution of Active Learning Spaces." New Directions for Teaching and Learning 2014 (137):9–16. doi: 10.1002/tl.20081.

Beyer, C., E. Taylor, and G. M. Gillmore. 2013. Inside the Undergraduate Teaching Experience: The University of Washington's Growth in Faculty Teaching Study. Albany, NY: State University of Albany Press.

Bonwell, C. C., and J. A. Eison. 1991. "Active Learning: Creating Excitement in the Classroom." ASHE-ERIC Higher Education Report. Washington, DC: School of Education and Human Development, George Washington University.

Chickering, A. W., and Z. F. Gamson. 1987. "Seven Principles for Good Practice in Undergraduate Education." AAHE Bulletin 39 (7):3–7.

Cotner, S., J. Loper, J. D. Walker, and D. C. Brooks. 2013. ""It's Not You, It's the Room": Are the High-Tech, Active Learning Classrooms Worth It?" Journal of College Science Teaching 42 (6):82–8. doi: 10.2505/4/jcst13_042_06_82.

Cross, K. P. 1993. "Reaction to 'Enhancing the Productivity of Learning' by D. B. Johnstone." AAHE Bulletin 46 (4):7.

Dewey, J. 1933. How We Think: A Restatement of Reflective Thinking to the Educative Process. Boston: D. C. Heath. (Original work published in 1910).

Eddy, S. L., and K. A. Hogan. 2014. "Getting under the Hood: How and for Whom Does Increasing Course Structure Work?" CBE—Life Sciences Education 13 (3):453–68. doi: 10.1187/cbe.14-03-0050.

EDUCAUSE Center for Analysis and Research. 2017. "Higher Education's Top 10 Strategic Technologies for 2017." https://library.educause.edu/resources/2017/1/higher-educations-top-10-strategic-technologies-for-2017.

Freeman, S., S. L. Eddy, M. McDonough, M. K. Smith, N. Okoroafor, H. Jordt, and M. P. Wenderoth. 2014. "Active Learning Increases Student Performance in Science, Engineering, and Mathematics." Proceedings of the National Academy of Sciences of the United States of America 111 (23):8410–5. doi: 10.1073/pnas.1319030111.

Gaffney, J. D. H., E. Richards, M. B. Kustusch, L. Ding, and R. Beichner. 2008. "Scaling up Educational Reform." Journal of College Science Teaching 37:48–53.

Hora, M., A. Oleson, and J. Ferrare. 2012. Teaching Dimensions Observation Protocol User's Manual. Madison, WI: University of Wisconsin–Madison, Wisconsin Center for Education Research.

Kilgo, C. A., J. K. Ezell Sheets, and E. T. Pascarella. 2015. "The Link between High-Impact Practices and Student Learning: Some Longitudinal Evidence." Higher Education 69 (4):509–25. doi: 10.1007/s10734-014-9788-z.

Kuh, G., K. O'Donnell, and C. G. Schneider. 2017. "HIPs at Ten." Change: The Magazine of Higher Learning 49 (5):8–16. doi: 10.1080/00091383.2017.1366805.

Kuh, G. D. 2008. "High-Impact Educational Practices: What They Are, Who Has Access to Them, and Why They Matter." Change: The Magazine of Higher Learning. https://provost.tufts.edu/celt/files/High-Impact-Ed-Practices1.pdf.

Lane, E. S., and S. E. Harris. 2015. "A New Tool for Measuring Student Behavioral Engagement in Large University Classes." Journal of College Science Teaching 44 (6):83–91. doi: 10.2505/4/jcst15_044_06_83.

Lasry, N., E. Charles, and C. Whittaker. 2014. "When Teacher-Centered Instructors Are Assigned to Student-Centered Classrooms." Physics Education 10:010116.

Lorenzo, M., C. H. Crouch, and E. Mazur. 2006. "Reducing the Gender Gap in the Physics Classroom." American Journal of Physics 74 (2):118–22. doi: 10.1119/1.2162549.

Metzger, Kelsey J. 2015. "Collaborative Teaching Practices in Undergraduate Active Learning Classrooms: A Report of Faculty Team Teaching Models and Student Reflections from Two Biology Courses." Bioscene 41 (1):3–9.

Pascarella, E., and P. Terenzini. 2005. How College Affects Students. Volume 2, A Third Decade of Research. 1st ed. San Francisco, CA: Jossey-Bass.

Smith, M. K., F. H. Jones, S. L. Gilbert, and C. E. Wieman. 2013. "The Classroom Observation Protocol for Undergraduate STEM (COPUS): A New Instrument to Characterize University STEM Classroom Practices." CBE—Life Sciences Education 12 (4):618–27. doi: 10.1187/cbe.13-08-0154.

Soneral, P., and S. Wyse. 2017. "A SCALE-UP Mock-Up: Comparison of Student Learning Gains in High- and Low-Tech Active-Learning Environments." CBE—Life Sciences Education 16 (1):ar12. doi: 10.1187/cbe.16-07-0228.

Stains, M., J. Harshman, M. K. Barker, S. V. Chasteen, R. Cole, S. E. DeChenne-Peters, M. K. Eagan, J. M. Esson, J. K. Knight, F. A. Laski, et al. 2018. "Anatomy of STEM Teaching in North American Universities." Science 359 (6383):1468–70. doi: 10.1126/science.aap8892.

Stoltzfus, J. R., and J. Libarkin. 2016. "Does the Room Matter? Active Learning in Traditional and Enhanced Lecture Spaces." CBE—Life Sciences Education 15 (4):ar68. doi: 10.1187/cbe.16-03-0126.

Svinicki, M., and W. McKeachie. 2014. McKeachie's Teaching Tips: Strategies, Research, and Theory for College and University Teachers. 14th ed. Belmont, CA: Wadsworth.

Thaler, R., and C. Sunstein. 2009. Nudge: Improving Decisions about Health, Wealth, and Happiness. New York: Penguin Books.

Walker, J. D., and P. Baepler. 2017. "Measuring Social Relations in New Classroom Spaces: Development and Validation of the Social Context and Learning Environments (SCALE) Survey." Journal of Learning Spaces 6 (3):34–41.

Whiteside, A., D. C. Brooks, and J. D. Walker. 2010. "Making the Case for Space: Three Years of Empirical Research on Learning Environments." EDUCAUSE Quarterly 33. https://er.educause.edu/articles/2010/9/making-the-case-for-space-three-years-of-empirical-research-on-learning-environments.

Wilson, D. 2015. "Learning Spaces Week at Harvard." Harvard Initiative for Learning & Teaching. https://hilt.harvard.edu/teaching-learning-resources/learning-spaces-week-at-harvard/.