Reed College
ONLINE LEARNING LITERATURE REVIEW
September 2012 – Bao Quoc Phan
EFFECTIVENESS & OUTCOMES
In 2009, the US Department of Education1 produced a report of a meta-analysis (of
experimental results published between 1996 and 2008), conducted by SRI International2
under contract, titled Evaluation of Evidence-Based Practices in Online Learning: A Meta-
Analysis and Review of Online Learning Studies (revised in 2010, referred to as the DOE/SRI
Intl. report from this point). The authors concluded that online education is modestly
superior to traditional education in terms of student outcomes; however, the
improvements were realized mostly in study contexts that blended online instruction with
in-person contact, and the additional instructional elements in those contexts, rather than
the medium of instruction by itself, may have been responsible for the gains (Means,
Toyama, Murphy, Bakia, & Jones, 2010, p. xviii (Executive Summary, Conclusions) & p. 51).
Nonetheless, certain media outlets seized on this opportunity to praise the merits of web-
based learning and to advocate for its wider implementation3. In response, three
other research organizations—the National Bureau of Economic Research (June 2010,
Cambridge, MA), the Community College Research Center at Teachers College–Columbia
University (July 2010, New York, NY), and Ithaka S+R4 (May 18, 2012, New York, NY)—
published papers that critiqued the meta-analysis report, built upon the existing
experimental research body, complemented the report with analyses of more
recent studies, and gave recommendations for future research. Even though the authors
disagreed on certain points, they agreed on two main findings:
1
Specifically the Policy and Program Studies Service, of the Office of Planning, Evaluation, and
Policy Development.
2
See discussion in comments on Means et al., pg. 4 of this report.
3
As cited in Jaggars & Bailey, 2010, p. 2: "… popular media discussions of the findings (e.g., Lohr,
2009; Lamb, 2009; Stern, 2009) focused on the report's seemingly clear-cut generalization that 'on
average, students in online learning conditions performed better than those in face-to-face courses'
(U.S. Department of Education, Office of Planning, Evaluation, and Policy Development, 2009, p. ix)"
4
See discussion in comments on Bowen & Lack, pg. 11.
1) That the medium of instruction by itself appears to have very little impact on
student outcomes (Figlio, Rush, Yin, 2010, p. 21). Positive effects observed in hybrid
formats (those that blend online with face-to-face instruction) may be attributable
to other curricular features and pedagogical practices rather than the medium
(Means et al., pp. 37–49, 51–53).
2) That the current state of research in online learning is very weak: the
field has produced very few rigorously conducted experimental studies, and
even fewer that were both carefully designed and tested in relevant contexts.5
This suggests that institutional decisions to move classes online have been motivated
mostly by economic concerns rather than by evidence of improved student
outcomes.6 More extensive and rigorous research is needed to establish whether
online instruction is superior, inferior, or comparable to traditional instruction in
teaching effectiveness, and to guide policy making at the postsecondary level.
Well-designed online courses can be effective. Ithaka S+R particularly advocates so-
called Interactive Learning Online systems (examples of which include courses developed
for use in Carnegie Mellon University's Open Learning Initiative), which can provide
instant feedback and dynamically adapt the course to individual students' needs. A quote
from the authors of the study, Interactive Learning Online at Public Universities: Evidence
from Randomized Trials (Bowen, Chingos, Lack, Nygren, May 2012), explains this idea best:
"By 'ILO' we refer to highly sophisticated, interactive online courses in which machine-guided
instruction can substitute for some (though not usually all) traditional, face-to-face
instruction. Course systems of this type take advantage of data collected from large numbers
of students in order to offer each student customized instruction, as well as allow
instructors to track students' progress in detail so that they can provide their students with
more targeted and effective guidance." (pg. 9) As long as courses remain student-centered
and professors retain authority and responsibility for guiding and interacting with
students, there is reason to be cautiously optimistic about these learning systems, which
could demonstrate tangible improvement over large lecture-based introductory
courses in learner–teacher and learner–learner interaction.
5
See discussion in comments on Jaggars & Bailey (pg. 7) and Figlio et al. (pg. 9 of this report).
Jaggars & Bailey assert that only 7 studies were conducted in typical college course settings, while
Figlio argues that none of those studies is rigorous enough.
6
Figlio et al., pg. 2. Jaggars & Bailey do assert that student access is one of the top reasons to
consider increases in online course offerings (pg. 2); a quick look at Figlio et al., Bowen & Lack (pg.
11–12), and Bowen et al. (pg. 5–8, and Appendix B, pg. 37–42), however, reveals a considerable focus
on the economic side.
intriguing proposition. From Reed's perspective, due diligence is still needed at the
moment in evaluating credit transfer for online courses; nonetheless, at the very least there
is no evidence to suggest that students are worse off taking an online course in place of an
on-campus course provided by the same college or university.
REFERENCES
Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2009, rev. 2010). Evaluation of
Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of
Online Learning Studies. US Department of Education, Office of Planning,
Evaluation, and Policy Development. Washington, DC. Retrieved from
http://www2.ed.gov/rschstat/eval/tech/evidence-based-practices/finalreport.pdf.
Jaggars, S. S., & Bailey, T. (2010). Effectiveness of Fully Online Courses for College
Students: Response to a Department of Education Meta-Analysis. Community
College Research Center (CCRC), Teachers College, Columbia University. New
York, NY. Retrieved from http://ccrc.tc.columbia.edu/Publication.asp?UID=796.
Figlio, D. N., Rush, M., & Yin, L. (2010). Is it Live or is it Internet? Experimental Estimates of
the Effects of Online Instruction on Student Learning. Working Paper 16089.
National Bureau of Economic Research (NBER). Cambridge, MA. Retrieved from
http://www.nber.org/papers/w16089.pdf.
Bowen, W. G., & Lack, K. A. (2012). Current Status of Research on Online Learning in
Postsecondary Education. Ithaka S+R, ITHAKA. New York, NY. Retrieved from
http://www.sr.ithaka.org/sites/default/files/reports/ithaka-sr-online-learning-postsecondary-education-may2012.pdf.
Bowen, W. G., Chingos, M. M., Lack, K. A., & Nygren, T. I. (2012). Interactive Learning
Online at Public Universities: Evidence from Randomized Trials. Ithaka S+R,
ITHAKA. New York, NY. Retrieved from
http://www.sr.ithaka.org/sites/default/files/reports/sr-ithaka-interactive-learning-online-at-public-universities.pdf.
1) EVALUATION OF EVIDENCE-BASED PRACTICES IN ONLINE LEARNING
Barbara Means, Yukie Toyama, Robert Murphy, Marianne Bakia, and Karla Jones
US Department of Education & SRI International
Published June 2009
Revised September 2010
http://www2.ed.gov/rschstat/eval/tech/evidence-based-practices/finalreport.pdf
ABSTRACT
A systematic search of the research literature from 1996 through July 2008
identified more than a thousand empirical studies of online learning. Analysts
screened these studies to find those that:
COMMENT
The authors found 1,132 studies published between 1996 and 2008 that pertain to
online learning (pg. 11). However, of these 1,132 empirical studies, only 176 were
conducted using quasi-experimental (non-randomized, with or without statistical
control) or randomized experimental designs. Furthermore, only 99 of those 176
studies actually compare internet-based instruction (either fully online or a
combination of online and face-to-face instruction, referred to as a "hybrid" or
"blended" approach) with in-person instruction, and a mere 45 of them contain
sufficient data to extract for the meta-analysis (pg. 11–14).
For such a fast-growing field as online education, the body of research on its
effectiveness is very thin and lacking in rigor. If we consider only studies
conducted in full-term undergraduate or graduate credit-bearing course settings,
the number of studies is reduced even further, to 7 (Jaggars and Bailey, 2010, pg.
4). Ithaka S+R researchers remarked in their literature review, "Current Status of
Research on Online Learning in Postsecondary Education (May 2012, pg. )," that
"sadly, this rapidly growing field has been built largely on the proverbial 'wing and
a prayer.'"
Despite the weakness of the research corpus, we should not dismiss the findings
of the DOE/SRI Intl. meta-analysis. A broad and thorough reading of the report
(not just of the narrow generalization—that internet-based instruction is
more effective than in-person instruction—since most of the studies are not
representative of typical college courses) provides insights into good pedagogical
practices that might account for improved student outcomes in either traditional
or online learning. For example, students in blended learning environments
(online instruction combined with some face-to-face interaction) often receive
additional learning time and resources (pg. 29, 51), such as individualized course
elements (pg. 44). Most effective among those course features is the "inclusion of
mechanisms to prompt students to reflect on their level of understanding as they
are learning online" (pg. 44–45, 48), which reinforces the notion that critical and
reflective learning is most effective, even in an online environment.
What does this report imply for acceptance of online course credits at Reed? I do
not believe there is a short and straight answer. What we can draw from it is that
well-designed internet-based courses can be at least as effective as their in-person
counterparts. And since not all online courses are well designed, caution is still
necessary in evaluating them on a case-by-case basis. At a minimum, there is no
hard evidence to suggest that courses conducted entirely online are, by virtue of
the medium, significantly superior or inferior to courses conducted on a college
campus. Since we already evaluate courses from other accredited institutions
thoughtfully for transfer credit, it is not out of the question to apply a similar
level of scrutiny to internet-based credit-bearing courses conducted by the
same institutions.
SRI International is a contract research firm that focuses on areas of technology.7
We may assume that the company has an interest in promoting the use of technology;
however, the analysis appears balanced and not heavily biased, despite the
misleading media coverage that ensued.
7
From Wikipedia (http://en.wikipedia.org/wiki/SRI_International):
SRI International (SRI), founded as Stanford Research Institute, is a nonprofit research
institute headquarted (sic) in Menlo Park, California. The trustees of Stanford University
established SRI in 1946 as a center of innovation to support economic development in the
region. SRI is now one of the largest contract research institutes in the world.
[…]
SRI's focus areas include biomedical sciences, chemistry and materials, computing, Earth and
space systems, economic development, education and learning, energy and environmental
technology, security and national defense, as well as sensing and devices.
See more at http://www.sri.com/about
2) EFFECTIVENESS OF FULLY ONLINE COURSES FOR COLLEGE STUDENTS:
RESPONSE TO A DEPARTMENT OF EDUCATION META-ANALYSIS
Shanna Smith Jaggars and Thomas Bailey
Community College Research Center, Teachers College, Columbia University
July 2010
http://ccrc.tc.columbia.edu/Publication.asp?UID=796
COMMENT
The authors of this paper give balanced and thoughtful criticism of the DOE
meta-analysis. Of the 45 studies cited in the DOE report, 28 compare fully
online with in-person course sections, and only 7 of those were conducted with
undergraduate or graduate students in term-length courses (pg. 4). That is a very
small number of rigorous and relevant studies in a relatively large body of
research, one that has produced mixed results regarding the efficacy of online
learning.
Access may be part of the reason Reed students would want to take an
online course. If a Reed student would like to work over the summer and
take college courses to complete a group requirement, it may be
prohibitively expensive to take those courses on a college campus
(taking into consideration the lost wages from having to work fewer
hours; in addition, an international student or one working out-of-state
would have to pay the non-resident surcharge).
However, if the student could take a course online, provided that they
would achieve similar outcomes, they could shop around for courses
that fit their budget (which may cost as little as $157/credit hour, such
as those provided by Brigham Young University—see
http://is.byu.edu/site/courses/tuition.cfm), could have much more
flexibility with their schedule, and would not have to cut as many hours
from their work shifts to fit in the chosen courses. Another scenario:
a student (international or domestic) could work or volunteer in
another country while taking an online course provided by Oregon
State or the University of Massachusetts, if there is no alternative in that
country.
3) IS IT LIVE OR IS IT INTERNET? EXPERIMENTAL ESTIMATES OF THE EFFECTS OF
ONLINE INSTRUCTION ON STUDENT LEARNING
David N. Figlio, Mark Rush, and Lu Yin
National Bureau of Economic Research (NBER)
June 2010
http://www.nber.org/papers/w16089.pdf
ABSTRACT
This paper presents the first experimental evidence on the effects of live versus
internet media of instruction. Students in a large introductory microeconomics
course at a major research university were randomly assigned to live lectures versus
watching these same lectures in an internet setting, where all other factors (e.g.,
instruction, supplemental materials) were the same. Counter to the conclusions
drawn by a recent U.S. Department of Education meta-analysis of non-experimental
analyses of internet instruction in higher education, we find modest evidence that
live-only instruction dominates internet instruction. These results are particularly
strong for Hispanic students, male students, and lower-achieving students. We also
provide suggestions for future experimentation in other settings.
COMMENT
In this working paper, the authors criticize the DOE report, mostly on the basis of the
research that forms the foundation of the meta-analysis. They point out the lack of
rigor in the design of the studies, the irrelevance of the study contexts, and the lack of
control for student characteristics.
To demonstrate the kind of rigorous design that they propose, the authors
conducted an experiment in a Principles of Microeconomics class taught at a large
research university, attempting to isolate the effect of the medium (online vs. in-
person) of delivery of course lectures. The experiment, however, could not escape
the problems of sample size and confounding that plagued previous studies, which
the authors attributed to the restrictions imposed by the university's Institutional
Review Board on recruiting participants (pg. 8–9) and to the possibility of
contamination (for example, students in the in-person section could borrow a
friend's account to get access to the online lectures, which were normally available to
all students regardless of which section they registered for (pg. 14)). The comparison
of in-person lectures with one particular mode of online video presentation also
appears narrow and not applicable to the large variety of online courses currently
available, which may include interactive and multimedia features and provide for
more interaction between students via online threaded discussions. In addition,
the grading is based entirely on three exams, and there are no discussion
sessions (pg. 7). Altogether, these characteristics make the course incomparable to
many offerings elsewhere, leading to limited external validity.
In short, Figlio and his co-authors have provided accurate and useful criticisms of
the DOE meta-analysis, called for more extensive research, and provided one such
attempt, albeit one that is flawed and not very relevant. In addition, we get a
chance to learn how a large public university conducts an introductory course
(in this case, Introduction to Microeconomics). Notable characteristics of the
course are:
That between 1,600 and 2,600 students register for the course each term
(pg. 6). The course is offered in both online and lecture formats. All
registered students in this course are allowed access to the online
video lectures, regardless of their sectional status (pg. 7).
That the lecture section has 190 seats. Attendance is not enforced. A vast
majority of students, even those who registered for the lecture section,
end up watching the lectures online (pg. 6–7).
That grading is based entirely on three multiple-choice exams. There are
neither writing assignments nor in-class discussions (pg. 7).
4) CURRENT STATUS OF RESEARCH ON ONLINE LEARNING IN POSTSECONDARY
EDUCATION
William G. Bowen and Kelly A. Lack
Ithaka S+R, ITHAKA8
May 2012
http://www.sr.ithaka.org/sites/default/files/reports/ithaka-sr-online-learning-postsecondary-education-may2012.pdf
ABSTRACT
The review yields little evidence to support broad claims that online or hybrid
learning is significantly more effective or significantly less effective than courses
taught in a face-‐to-‐face format. At the same time, it highlights the need for further
research on this topic, with particular attention paid to differences in outcomes
among different student populations and different sets of institutional users. The
importance of research of this kind will only grow as even more sophisticated,
interactive online systems continue to be developed, and the current budgetary
constraints and enrollment pressures on postsecondary institutions strengthen the
case for improving productivity.
COMMENT
8
From http://www.sr.ithaka.org/people/about-‐us:
Ithaka S+R is a research and consulting service that helps academic, cultural, and publishing
communities in making the transition to the digital environment. We pursue projects in
programmatic areas that are critical to the advancement of the academic community.
Ithaka S+R is part of ITHAKA, a not-‐for-‐profit organization that also includes JSTOR and Portico.
A highlight of the document is a table summarizing all the studies cited in a
clear and concise format (pg. 14–15). Overall, it is a good supplement to the DOE
report and worth reading if we are looking at the overall state of research in online
education.
We should note that Ithaka S+R is a not-for-profit research organization that focuses
on technology in education, and its mission statement (see footnote 8) suggests
that it has a rather large stake in the advancement of technology and web-based
instruction in higher education. In addition, Henry S. Bienen, chairman of ITHAKA
(the parent organization of Ithaka S+R), is both President Emeritus of
Northwestern University and vice chairman of the board of directors of Rasmussen
College Inc., a for-profit postsecondary institution with a large catalog of online
offerings. Ithaka S+R papers appear to be balanced and thoughtfully crafted;
nonetheless, we should keep in mind that the organization may be advocating for
online education, and should pay attention to potential bias.
5) INTERACTIVE LEARNING ONLINE AT PUBLIC UNIVERSITIES: EVIDENCE FROM
RANDOMIZED TRIALS
William G. Bowen, Matthew M. Chingos, Kelly A. Lack, and Thomas I. Nygren
Ithaka S+R
May 2012
http://www.sr.ithaka.org/sites/default/files/reports/sr-ithaka-interactive-learning-online-at-public-universities.pdf
ABSTRACT
Online learning is quickly gaining in importance in U.S. higher education, but little
rigorous evidence exists as to its effect on student learning outcomes. In "Interactive
Learning Online at Public Universities: Evidence from Randomized Trials," we
measure the effect on learning outcomes of a prototypical interactive learning
online (ILO) statistics course by randomly assigning students on six public
university campuses to take the course in a hybrid format (with machine-guided
instruction accompanied by one hour of face-to-face instruction each week) or a
traditional format (as it is usually offered by their campus, typically with 3–4 hours
of face-to-face instruction each week).
We find that learning outcomes are essentially the same—that students in the
hybrid format "pay no price" for this mode of instruction in terms of pass rates, final
exam scores, and performance on a standardized assessment of statistical literacy.
These zero-difference coefficients are precisely estimated. We also conduct
speculative cost simulations and find that adopting hybrid models of instruction in
large introductory courses has the potential to significantly reduce instructor
compensation costs in the long run.
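The phrase "zero-difference coefficients are precisely estimated" simply means that the estimated gap between the two formats is close to zero and that its standard error is small. A toy simulation can sketch the idea; note that the score distribution, sample sizes, and numbers below are invented for illustration and are not taken from the study.

```python
import random
import statistics

def trial_effect(control, treatment):
    """Difference in mean scores (treatment - control) and its standard error."""
    diff = statistics.mean(treatment) - statistics.mean(control)
    # Standard error of a difference in means (independent samples)
    se = (statistics.variance(treatment) / len(treatment)
          + statistics.variance(control) / len(control)) ** 0.5
    return diff, se

# Simulate a trial in which the medium truly makes no difference:
# both formats draw final-exam scores from the same distribution.
random.seed(42)
control = [random.gauss(70, 10) for _ in range(300)]    # traditional format
treatment = [random.gauss(70, 10) for _ in range(305)]  # hybrid/ILO format

diff, se = trial_effect(control, treatment)
print(f"estimated effect: {diff:+.2f} points (SE {se:.2f})")
```

With hundreds of students per arm, the standard error is under one exam point, so an estimate near zero genuinely rules out any large effect in either direction; this is what distinguishes a "precisely estimated zero" from a study that is merely too small to detect anything.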
COMMENT
9
The authors also discuss the process of conducting the experiment, and the lessons they learned
from it, in Appendix C, pg. 43–52.
The inclusion of a “control” (students taking the normal course) and a
“treatment” (students taking the redesigned, “hybrid” course) group
(pg. 2);
The large sample size (605 participants, out of a total of 3,046 course
enrollees, spread over 6 public institutions (pg. 11));
The bulk of discussion in this area is, however, not about how assessment is
conducted in online courses. Instead, it is mainly about how assessment SHOULD be
conducted in online courses. Therefore, the number of studies useful for our discussion is
rather limited. Moreover, the use of different assessment strategies is at the discretion of
the instructors, so it is hardly uniform (Liu, 2007; Kim et al., 2011).
REFERENCES
Liu, S. (2007). Assessing Online Asynchronous Discussion in Online Courses: An
Empirical Study. The Teaching Colleges and Community Worldwide Online
Conference (TCC). http://tcc.kcc.hawaii.edu/previous/TCC%202007/liu.pdf
Vonderwell, S., Liang, X., & Alderman, K. (2007). Asynchronous Discussions and Assessment
in Online Learning. Journal of Research on Technology in Education 39(3), 309–
328. International Society for Technology in Education (ISTE). Eugene, OR.
http://www.eric.ed.gov/PDFS/EJ768879.pdf
Kim, N., Smith, M. J., & Maeng, K. (2011). Assessment in Online Distance Education: A
Comparison of Three Online Programs at a University. Online Journal of Distance
Learning Administration (OJDLA).
http://www.westga.edu/~distance/ojdla/spring111/kim111.html
Runyon, D., & Holzen, R. V. (2003). Effective Assessment Techniques for Online Courses.
EDUCAUSE. Presentation.
http://www.educause.edu/ir/library/powerpoint/EDU03150.pps
1) ASSESSING ONLINE ASYNCHRONOUS DISCUSSION IN ONLINE COURSES: AN
EMPIRICAL STUDY
Shijuan Liu
Department of Instructional Systems Technology, Indiana University
2007
Refereed conference proceeding, The Teaching Colleges and Community Worldwide Online
Conference (TCC)10
http://tcc.kcc.hawaii.edu/previous/TCC%202007/liu.pdf
10
From TCCHawaii.org/about
The Teaching Colleges and Community Worldwide Online Conference (TCC) is a virtual
conference held annually online. This event is designed for faculty, staff and administrators
in higher education worldwide to share their expertise and engage in productive forums
about innovations and practices that accompany the use of technology in teaching and
learning.
2) ASYNCHRONOUS DISCUSSIONS AND ASSESSMENT IN ONLINE LEARNING
Selma Vonderwell (Cleveland State University), Xin Liang and Kay Alderman (The
University of Akron)
Spring 2007
Hosted on ERIC (Educational Resources Information Center)
http://www.eric.ed.gov/PDFS/EJ768879.pdf
3) ASSESSMENT IN ONLINE DISTANCE EDUCATION: A COMPARISON OF THREE
ONLINE PROGRAMS AT A UNIVERSITY
N. Kim, M. J. Smith, and K. Maeng
Online Journal of Distance Learning Administration (OJDLA), 2011
http://www.westga.edu/~distance/ojdla/spring111/kim111.html
ABSTRACT
The purpose of this study was to investigate whether or not the principles of
assessment in online education are reflected in the assessment activities used by
the developers and administrators of actual online distance courses. Three online
distance education programs provided at a large mid-west university were
analyzed: the School of Continuing Studies – undergraduate distance program, the
School of Business – distance MBA program, and the School of Education –
distance graduate program. The results of the study showed that the assessment
activities of online distance courses do not strictly follow the principles suggested
in the literature.
4) EFFECTIVE ASSESSMENT TECHNIQUES FOR ONLINE COURSES
Darla Runyon (Northwest Missouri State University), Roger Von Holzen (Northwest
Missouri State University)
Presentation, Jan 2003
From EDUCAUSE
http://www.educause.edu/ir/library/powerpoint/EDU03150.pps
ABSTRACT
One question from faculty members teaching an online course for the first time is,
"How do you do online exams?" This presentation will provide participants with a
wide range of practical examples of effective assessment techniques that may be
employed across a variety of online course subject areas.
5) THE ROLE OF CRITICAL THINKING IN THE ONLINE LEARNING ENVIRONMENT
Kelly Bruning
International Journal of Instructional Technology and Distance Learning
May 2005
http://www.itdl.org/Journal/May_05/article03.htm
Note: The topic in this article is expanded in the following issue (June 2005) of this journal,
included in the reading binder.