Sage Publications, Inc., American Educational Research Association Review of Educational Research
Review of Educational Research
Spring 2006, Vol. 76, No. 1, pp. 93-135
Tallent-Runnels et al.
We completed our search for literature in two stages. First, we reviewed litera-
ture that we found in electronic databases using online course descriptors. In the
second stage we searched for articles cited in some of the articles that we had read
and searched Internet journals and tables of contents of journals. We used ERIC,
PsycINFO, ContentFirst, Education Abstracts, and WilsonSelect. Our search
descriptors included online course and instruction, cyberspace course, computer-
based course/instruction/learning, distance education, e-learning, online teaching,
web-based teaching, Internet/teaching/instruction, computer assisted instruction,
computer software instruction, telecourses, instructional technology in education,
virtual learning, and distributed learning. We reviewed 91 articles and discarded
15 because they concerned general distance education rather than online courses
in particular. The Appendix lists the authors, date of publication, purpose of the
study, participants, and research method for each article reviewed, as well as the
aspect (course environment, learners' outcomes, learners' characteristics, and
institutional and administrative factors) of our review for which each study was
used. All of the 76 studies cited in the Appendix are included in this literature
review.
Teaching Courses Online: A Review of the Research
somewhat exploratory, whereas the more recent quantitative studies were more
experimental and causal-comparative in design.
Aspects Examined
During our initial review of the literature, we began to note aspects of online
courses under study. We organized this list following traditional curriculum and
instructional design theories and processes (e.g., Anderson & Krathwohl, 2001;
Beck, McKeown, Worthy, Sandora, & Kucan, 1996; Linn & Gronlund, 2000; Per-
rone, 1994; Taba, 1967; Tyler, 1950). We ultimately chose four major themes to
define our literature review. These included course environment, learners' outcomes,
learners' characteristics, and institutional and administrative factors.
Course Environment
Classroom Culture
Some researchers studied the ways that online courses managed classroom inter-
actions. Ahern and El Hindi (2000) found that asynchronous discourse mimicked the
dynamics of real-time, multivoiced discussions. Their IdeaWeb format was trans-
parent in peer-to-peer discourse, allowing self-management of discussions by stu-
dents without constraints or proprietary rules. Conversely, Kanuka and Anderson
(1998), in their mixed-methods research with online forums (computer-mediated
conferences), found a lack of fluidity and conversational language. They raised
concerns about inconsistent and unchallenged ideas, concluding that these online
interactions provided little negotiated meaning or new knowledge construction.
Davidson-Shivers, Tanner, and Muilenburg (2000) compared the substantive qual-
ity of synchronous and asynchronous discourse to determine whether one discus-
sion environment produced more content-related participation than the other. They
found that chats provided a direct, immediate environment for responses, whereas
listserv responses were delayed but more focused and purposeful. These researchers
reasoned that in asynchronous discussions, students had more time to think about
their responses and that the increased thinking time improved the depth and quality
of responses.
A few researchers referred to online students as a community of learners.
Winograd (2000) explored the effect of a moderator on online conferences, devel-
oping a theory that even a low degree of moderation allowed a group to form a
community, as determined by the elements of camaraderie, support, and warmth.
Knupfer, Gram, and Larsen (1997) surveyed faculty members from four universities
to learn about graduate students' reactions to online discussions. Their results
suggested that it is important to establish a community of learners. Research-based
suggestions included establishing study groups early, modeling and reinforcing
effective communication, identifying potential problems, and designing a plan for
dealing with these potential problems. Bielman, Putney, and Strudler (2000) also
reviewed the construction of an online community. They noted that learners
included emoticons, such as smiley faces, in their online communications with one
another in an effort to compensate for the missing visual and nonverbal communi-
cation cues.
Certainly, the research regarding online environment points to the importance
of learner-focused course design. Knupfer, Gram, and Larsen's (1997) study of
graduate students' reactions to online discussions found that instructors, although
they believed that organization, collaboration, and flexibility were key components
of online discussions, failed to recognize the importance of the students' feelings,
reactions, and responses.
The online environment may offer a unique social advantage as compared to the
traditional classroom. In Sullivan's (2002) research, 42% of the females surveyed
commented on the advantage of anonymity in a networked learning environment.
However, Brown and Liedholm (2002), in their comparison of students in face-to-face,
online, and hybrid microeconomics courses, found that female students scored sig-
nificantly (5.7 percentage points) lower than males in the online course, though
there was no significant difference in the learning outcomes between the online and
hybrid formats.
Structural Assistance
Researchers exploring this aspect of online instruction were interested in how
online course designs such as scaffolds and management systems might guide or
assist student learning. Christel (1994) developed an experimental comparison of
multiple versions of an interactive, digital video course. Students had control at all
times over the virtual rooms in which they would work. These were the auditorium,
training room, library, office, or conference room. In this virtual environment,
Christel found that a motion-video interface enhanced recall more than still slides
did and recommended that pedagogically important course information be presented
via video. Mayer, Heiser, and Lonn (2001) found that concurrent narration
and animation had a redundancy effect that caused learners to split their visual
attention and lower their transfer performance. It seems that when the cognitive
load is high, understanding of complex concepts is hindered.
Greene and Land (2000) explored instructional scaffolds to support cognitive func-
tioning. They found that guiding questions (professor-developed, procedural scaf-
folding) helped students to focus and develop their projects. These students needed
real-time, back-and-forth discussion with their instructors that helped them to better
understand their course projects and begin thoughtful consideration earlier. Student-
to-student interaction, specifically over shared prior experiences, influenced students'
ideas and encouraged them to expand, formalize, and refine their reasoning.
Some researchers noted that self-pacing was an important feature of online learn-
ing. Schrum (1995) completed formative research regarding the impact of pacing
on developing and presenting an online course. He found that students appreciated
being able to move through the course at their own pace. More successful students
moved through the course more quickly than less successful students. However,
Hantula's (1998) review of student evaluations from an organizational psychology
course determined that asynchronous course features required a higher degree of
self-management on the part of the students. Mayer and Chandler (2001) explored
the benefits of computer-user interactivity to determine the pace of the presentation.
Their results supported the cognitive load theory that a modest amount of inter-
activity promotes deeper learning from a multimedia explanation.
A few researchers tested the structural assistance of specific course features.
Cooper (1999) provided online resources and course materials in folders for each
week of the course. Cooper's research showed that students valued timely course
announcements, lecture notes, and chapter questions and answers. Bee and Usip
(1998) presented supplementary materials, tutorials, and general course informa-
tion online. They found that students who used these materials achieved improved
course performance and improved knowledge of cyberspace to a greater extent
than those who did not use the materials.
Success Factors
Some researchers were interested in how online courses guided student success.
Edwards and Fritz (1997) determined that the effectiveness of online learning is
influenced by student access to material, recommending that online information
may replace the traditional text format for those students who accept and learn well
from the online format. Schrum and Hong (2001) completed a survey with 70 insti-
tutions and found eight dimensions that affect student success: access to tools,
technology experience, learning preferences, study habits, goals, purposes,
lifestyles, and personal traits.
Faux and Black-Hughes (2000) compared traditional, online, and hybrid sections
of an undergraduate course in social work to determine the effectiveness of online
learning. Their results showed the most improvement (from pretest to posttest) for
students in the traditional, face-to-face section. Further, Faux and Black-Hughes
found that 41.7% of the students did not feel comfortable learning from the Inter-
net in their online course. Students wanted more instructor feedback and auditory
stimulation; they wanted to listen to, rather than read about, historical material.
Though this study was limited (with only 33 student participants), the results raise
concern regarding (a) course design according to instructor convenience rather than
student preferences, and (b) students' willingness to take responsibility for their own
learning. Brown and Liedholm (2002), in a similar comparative course study (with
710 students), noted that performance differences might be attributed to differences
in student effort. Students in the face-to-face class spent 3 hours in class each week,
while the online and hybrid course students reported spending less than 3 hours per
week on the course. Brown and Kulikowich's (2004) results, however, comparing
online and standard lecture course outcomes of graduate-level statistics students,
indicated no significant differences in posttests according to group membership.
Trinidad and Pearson (2004) measured the learning preferences of the students
with the Online Learning Environments Survey (OLES), as developed by Fraser,
Fisher, and McRobbie, to determine the effect of problem-based learning (PBL) in
an online course. They found that students' actual and preferred scores on the
OLES were closely matched and concluded that PBL provides a practical strategy
for online learning instruction.
for online learning instruction. Other research by Pearson and Trinidad (2004) sug
98
This content downloaded from 204.187.23.196 on Mon, 12 Sep 2016 09:28:15 UTC
All use subject to http://about.jstor.org/terms
Teaching Courses Online: A Review of the Research
gested that the OLES might be a helpful tool for educators to enhance online learn-
ing environments and to determine which course aspects should be considered for
revision.
Young (2004) examined the characteristics of outstanding online teachers in the
School for All. Young's analysis defined three major categories of online peda-
gogical models: the single-teacher model, the co-teacher model, and the cluster-
course model. These results provide evidence that online pedagogy allows alternative
instructional approaches, but the results do not establish which model worked most
effectively.
Interaction Systems
Researchers were also interested in the relevance of online interaction to learning
tasks, that is, how much of the information that learners exchanged in the online
interaction was related to the learning tasks in which they were involved.
Davidson-Shivers, Tanner, and Muilenburg (2000) recorded students' online dis-
cussions in two modes, synchronous (chat) and asynchronous (listserv). The com-
ments were coded into substantive categories (related to study topics and contents)
and nonsubstantive categories (not related to study topics and contents). Although
the frequency of the substantive category was higher than that of the nonsubstan-
tive category with both interaction modes, no inferential statistics were reported to
substantiate differences between the categories and the modes.
This study has raised an interesting question that warrants future research. To
address classroom teachers' concerns about how much class time is truly spent on
students' learning, researchers developed constructs of engaged time, or time on
task (Rosenshine, 1979). Of the yearly 1,000 hours of instruction in regular class-
rooms mandated by most states for elementary and secondary schools, only 300 to
400 hours are devoted to high-quality academic learning in which students are
engaged in learning activities. The rest of the school hours are spent on non-
academic activities, such as recess, lunch, transitions, and other off-task activities
(Weinstein & Mignano, 1997). Instructors in the online environment may face a
similar problem but in a different format. On the one hand, online instruction elim-
inates the need for recess, lunch, and transitions. Instructors in online courses do
not have to deal as much with discipline problems, which usually account for a large
proportion of off-task time. We could expect the ratio of engaged time in the online
environment to be greater than that observed in regular classrooms. On the other
hand, chat rooms, e-mail, and online discussions provide students with the conve-
nience of social interaction in cyberspace, which is not available in the regular
classroom. Students could easily be distracted from their academic learning and
become involved in nonacademic interaction with their classmates. Furthermore,
as indicated in other sections of this review, students as well as instructors in online
classrooms often experience technical problems with courses via the Internet, espe-
cially those who are novices in computer technology. People devote a great amount
of teaching or study time to learning new skills that they must possess to be suc-
cessful in the online learning environment (Davidson-Shivers et al., 2000; Richards
& Ridley, 1997; Warschauer, 1998; Wells, 2000). Although the time spent on the
learning curve of computer-related skills is necessary, it may not be directly related
to learning of course content. The Davidson-Shivers et al. study indicated the
importance of the distinction between the substantive and nonsubstantive interaction
in the online environment. While some researchers use the frequency of logging
on to course websites or the length of logged-on time as measures of students'
engagement in online learning (Ahern & Durrington, 1995; Taraban, Maki, &
Rynearson, 1999), this study indicated that the quality, rather than the quantity,
of the time spent in online courses might be a more accurate index of students'
engagement. In any case, this issue has not received the attention it deserves from
researchers. In a promising development, a recent study (Daroszewski, 2004)
showed that educators were taking advantage of the technology to encourage stu-
dents to share learning experiences online. Nursing students posted online journals
weekly, discussing their clinical work experiences for two quarters, and were
required to read and comment on their classmates' journaling entries weekly. A
postevaluation of the practice showed that students perceived that sharing clinical
experiences enhanced their learning and promoted mentoring, critical thinking, and
socialization.
Other researchers (Kanuka & Anderson, 1998) were interested in the depth of
online interactions. Applying a model of mass communication, the researchers
contended that participants in a communication process construct knowledge
through a five-stage process. In Stage 1, participants share their information and
opinions. In Stage 2, participants discover and explore dissonance and inconsis-
tency in the information and opinions shared. In Stage 3, participants negotiate and
co-construct knowledge. In Stage 4, participants further test and modify newly
constructed knowledge. In Stage 5, the final stage, participants explicitly phrase
agreements, statements, and applications of new knowledge. With data obtained
from an online forum with 11 participants and coded into the stages, researchers
found that students' interactions in the online environment were primarily at the
lower levels of communication (sharing information and discovering dissonance)
and rarely developed into a higher level of communication where negotiation, co-
construction, and agreement occurred. Although the study was done with a small
sample of participants, it provided a promising model for future research on the
relationship between the quality of interaction and construction of knowledge.
The findings of Kanuka and Anderson (1998) were echoed by those of Thomas's
(2002) study. Thomas examined undergraduate students' interactions in online dis-
cussions on two themes in an environmental studies course aligned with a five-level
taxonomy of cognitive engagement: prestructural, unistructural, multistructural,
relational, and extended abstract (Biggs & Collis, 1982). The researcher found that,
for both themes, students' cognitive engagement peaked at the multistructural
level, which was defined as "the learner picks up more and more relevant or cor-
rect features, but she does not integrate them" (Thomas, 2002, p. 255). Thomas
believed that factors such as unfamiliarity with the field and pressure to spend time
on learning activities other than online discussion contributed to the learners' lack
of high-level cognitive engagement.
This study revealed a problem with learning in the online environment. If learn-
ing is viewed from the Vygotskian perspective as a constructive or co-constructive
process (Vygotsky, 1978), then the shallow level of participation documented in
this study, in which participants merely share and acknowledge differences in their
views, is not sufficient to make construction or co-construction possible. Students
learn only when their current view of knowledge is challenged, reformed, and
synthesized through their interaction with others
(Vygotsky, 1978). However, this occurs only when students move their participation
in the online interaction up to Stage 3, 4, or 5 in Kanuka and Anderson's
(1998) model.
We do not know exactly what causes the students' shallow participation in online
interaction. One possible explanation is the lack of instructors' guidance in online
chatting and discussion. Once again, Vygotsky provided possible solutions to
the problem. Contemporary Vygotskian theories emphasize the importance of
guided participation (Radziszewska & Rogoff, 1991; Rogoff, 1991). Instructors in
online courses, like their counterparts in regular classrooms, play a crucial role in
students' knowledge construction by scaffolding the learning process for them. If
instructors do not assume responsibility for guiding students' learning, their learn-
ing could be inefficient or ineffective. Research on discovery learning in traditional
classrooms could benefit online instruction, because it shows that when discovery
learning was conducted in a random and unstructured manner, students were more
likely to construct misunderstandings or wander in a time-consuming process of
investigation without arriving at conclusions (Hammer, 1997; Schauble, 1990).
When discovery learning activities were carefully planned and structured, students
were led to make correct interpretations of information and produce solutions to
problems presented to them (Hickey, 1997; Minstrell & Stimpson, 1996; White &
Frederiksen, 1998). Thus it is the online instructor's responsibility to organize online
interactions that are sufficiently structured to benefit students' learning.
Some researchers (McIssac, Blocher, Mahes, & Vrasidas, 1999) believed that cer-
tain characteristics of the online environment would enhance the interaction between
students and between students and their instructors. Believing that interaction is "the
single most important activity in a well-designed distance education experience,"
McIssac, Blocher, Mahes, and Vrasidas (1999, p. 122) qualitatively examined
archived messages exchanged between doctoral students during chat time in six
Web-based courses and interviewed them after the courses to learn about their
experiences in the online interaction. The researchers found that students' positive
experiences during the interaction online could be promoted by the instructors'
providing prompt feedback, participating in the interaction, encouraging social
interaction, and employing collaborative learning strategies.
Ahern and Durrington (1995) manipulated two variables, anonymity of partic-
ipants (salient versus anonymous) and interface of online discourse (graphic-based
versus text-based) to explore whether different online communication tools and
formats influenced people's participation in online interaction. The interaction pat-
tern was operationally defined as frequency of visits, number of messages, num-
ber of words, and time spent. The authors found that when the communication was
anonymous, students were more likely to establish "highly structured" communi-
cation patterns, that is, to spend more time and write longer messages. They also
found that anonymity and the graphic-based interface enhanced students' engage-
ment in highly structured interpersonal interactions. When students could choose
to address their comments to either a group or an individual in the computer-
mediated discussion, students who used pen-names with the graphic-based inter-
face (allowing them to visually trace previous discourse) were significantly more
likely to choose individuals over groups as their audience.
Most of the literature reviewed on online interaction provided descriptions of the
various formats used, the instructors' experiences, and participants' reactions. Some
research studies evaluated the level of interactions and determined the critical
components of online interactions (Mikulecky, 1998). Though there are numerous
options available for online interactions, those described most often in the litera-
ture included e-mail, listservs, and chat.
Althaus (1997) conducted a study to examine whether supplementing a face-to-
face discussion with computer-mediated discussions would enhance academic per-
formance. Through a correlational study with 142 undergraduates, he pointed out
that online discussions, which do not usually occur in real time, avoid some of the
undesirable characteristics of face-to-face discussions in the classroom. Face-to-
face discussions must occur at the same time and place, and students bid against
each other for an opportunity to speak. This can create crowding or disruption to
the flow of discussion. In online discussion, students are able to log on and join the
discussion when it is convenient, and they have more time to read messages, reflect
on them, and compose thoughtful responses. He concluded that students who were
actively involved in the computer-mediated discussions earned higher grades than
other students.
Mikulecky (1998) compared class discussions in an online graduate course on
adolescent literature to those of a face-to-face version of the same course. There
were 22 graduate students in the online course and 18 students in the face-to-face
format. The face-to-face group included 7 graduate students and a mixture of 11 post-
baccalaureate and undergraduate seniors. Electronic interchanges were found to be
as helpful as face-to-face classes and were characterized by the following patterns:
(a) rich descriptive presentations of situations, dilemmas, and solutions; (b) detailed,
thoughtful responses and counterresponses to fellow students, including sugges-
tions for further professional development; (c) comments to link to one's own
experiences as well as to spur and synthesize new thoughts; (d) sharing of troubling
professional experiences and provision of support to others; and (e) occasional
debate. Graduate students seemed to benefit from the online, asynchronous discus-
sions. However, the lack of immediate feedback from the instructors allowed stu-
dents to procrastinate in entering their responses or to withdraw from the discussion.
Blignaut and Trollip (2003) noted the importance of instructor "presence" in an
online course and hypothesized that, in the online world, presence requires action.
They also analyzed faculty discussion postings by looking at postings across three
online business courses, and developed a taxonomy of instructor participation. They
defined differences between administrative, affective, corrective, informative, and
Socratic responses. Though they did not evaluate instructors' facilitation strategies,
the research results clearly point to broad differences in instructor participation
online, which may be attributed to differences in cognitive and teaching styles.
From the results of a mixed-method study with 110 undergraduate students,
Wilson and Whitelock (1998) concluded that the number of online interactions
needs to be kept relatively high in discussions, and some dramatic tension should
be created to motivate participation. They also suggested that involving students
in the process of getting to know each other also affects collaborative engagements.
On the other hand, Frey, Faul, and Yankelov (2003) found, in their assessment of
Web-assisted strategies, that students perceive e-mail communication with the
instructor as the most valuable strategy; these study participants did not value highly
the strategies designed to facilitate communication among students (creation of
home pages, accessible e-mail addresses, and discussion groups). Young (2004)
and Keefe (2003) found a high degree of interactivity and student participation to
be critical components of online instruction. Keefe's comparative study (2003)
found that students performed better and were more satisfied in the face-to-face
environment than in the online environment. Keefe suggested that the lack of inter-
action experience in the online section may have contributed to this difference.
However, no evidence was offered to support this conclusion.
Berge (1999) pointed out that the instructional design, rather than the delivery
system (e-mail, chat sessions, listserv, and the like), affects the quality of online
discussions and the learning that takes place. Instructors need to choose from the
systems available and select those that will best meet the instructional goals of
the course. Critical to understanding online interactions is realizing that they
span a continuum from teacher-centered to student-centered participation. A
study of the use of a graphic interface program for online interaction, the IdeaWeb,
indicated that graduate students in a teacher certification program socially con-
structed their ideas about teaching and learning in a more peer-oriented discourse
without direction from the instructor (Ahern & El Hindi, 2000). Christopher,
Thomas, and Tallent-Runnels (2004) developed a rubric along the lines of the
Bloom taxonomy (Anderson & Krathwohl, 2001) to assess the thinking levels of
discussion prompts and responses. While these researchers found that unguided
discussions fell into the middle level (organize, classify, apply, compare, and con-
trast) of the taxonomy, they suggested that more direct guidance from the course
instructor might have encouraged development of higher levels of thinking in the
responses (synthesize and evaluate). This guidance might take the form of instruc-
tors adding information to the discussion and asking follow-up questions.
Im and Lee (2003/2004) conducted a comparison study of synchronous and
asynchronous discussion with 40 preservice students in an online university
course. They found that synchronous discussions were more useful for promoting
social interaction and asynchronous discussions were more useful for task-oriented
communication. Based on these results, Im and Lee suggested that synchronous
and asynchronous discussions should be used for different educational purposes in
online courses.
Hansen and Gladfelter (1996) concluded that online pedagogy comes naturally
to some instructors but may be perplexing to others. In their focus group study of
online seminar participants, they concluded that online instructors could not expect
to create a stimulating collaborative online learning environment while thinking
merely of textbook chapters and lectures. Rather, online instructors must create an
atmosphere of respect and safety so that informed debate and collaborative prob-
lem solving can flourish.
Evaluation System
Although this review of the literature did not reveal much discussion of evalu-
ation in online courses, it is an important issue to consider in online teaching and
learning. Managing student assignments, providing feedback to students, and
assessing students' learning are all key factors in any course, whether face-to-face
or online. While the online format presents some challenges to instructors, it also
may encourage the development of new learning and teaching techniques. Levin,
Levin, and Waddoups (1999) conducted a study of a Master of Education program
that is offered entirely online. Their mixed-method study included surveys sent to
103
This content downloaded from 204.187.23.196 on Mon, 12 Sep 2016 09:28:15 UTC
All use subject to http://about.jstor.org/terms
Tallent-Runnels et al.
the online students to ask them about their perceptions of the courses. Levin et al.
also developed case studies during the first three semesters of the program, fol-
lowing four of the online students in the program. Their findings suggest that
instead of being restricted to face-to-face learning environments for evaluation,
instructors could make use of the various options available for learning, teaching,
and assessment through innovative online education. In their recently developed
CTER (Curriculum, Technology, and Education Reform) online program, Levin
et al. have employed multiple assessment techniques, including assessment by
classmates and the professor, by other educators (fellow teachers, graduate stu-
dents, professors from other universities), and self-assessment.
Learners' Outcomes
Cognitive Domain
One question that people who are involved in online teaching and learning want to answer is whether online instruction produces as much learning as traditional instruction does. To answer the question, researchers have compared learners' academic performance in online courses with academic performance in regular classrooms. This research was done primarily in causal comparative (no manipulation
Teaching Courses Online: A Review of the Research
of information and made it more personally relevant than regular classroom deliv-
ery. Students' self-regulation in learning was measured and categorized into three
levels and used as another independent variable. Students' declarative knowledge
was measured by a multiple-choice test, and procedural knowledge was measured
by 20 computer applications in an authentic situation. McManus found that an
advance organizer helped students when materials were presented with low or
medium levels of nonlinearity but had a detrimental effect on learning when infor-
mation was presented with a high level of nonlinearity. He also found an interac-
tion effect between self-regulation and nonlinearity. That is, with low levels of
nonlinearity of presentation, low and medium self-regulated learners performed bet-
ter than highly self-regulated learners. With medium levels of nonlinearity of pre-
sentation, the three types of self-regulated learners performed equally well. With
high levels of nonlinearity of presentation, low self-regulated learners did better than medium
and high self-regulated learners. It seems that the attribute-treatment-interaction
paradigm, such as the one used in the study by McManus, is a useful approach that
allows researchers to study how individual learner differences and characteristics
of the online learning environment interact with each other to influence learning.
Some researchers were interested in behavioral patterns in the online environ-
ment (Ahern & Durrington, 1995; Davidson-Shivers, Tanner, & Muilenburg,
2000; Kanuka & Anderson, 1998). Because the Internet can be used as a conve-
nient mode of communication between students and between students and instruc-
tors in online courses, most research has been focused on learners' interaction
patterns in the online environment.
Some researchers took advantage of the fact that computers can automatically
record interactions between the user and the machine to study students' learning
behaviors in an online environment. Taraban, Maki, and Rynearson (1999) observed
how students in face-to-face and online classes spent their studying time differ-
ently for classes and for tests. They found that the distribution of study time in the
two conditions was almost identical. Although students knew that an ideal student
should use distributed practice to spread study time over several occasions, they
exclusively studied just before examinations. It seems the convenience of taking
online courses, where students can study whenever they want to, does not change
students' undesirable habit of cramming before being tested.
Maki and Maki (2001) attempted to change students' undesirable last-minute
cramming habit by providing reward-attached online aids to undergraduate stu-
dents taking a psychology course online. In this experimental study, the researchers
manipulated various learning activities that were rewarded. Some students were
rewarded with an opportunity to earn points from mini-quizzes if they viewed "fre-
quently asked questions" (FAQs) posted online, while other students were rewarded
with the opportunity to earn points from the quizzes if they viewed chapter out-
lines posted online. As old behavioral principles show, students' behavior was con-
tingent upon the reward (Skinner, 1938). Students in the condition where viewing
FAQs might be rewarded did view the FAQs more often than did students who
were not rewarded for viewing FAQs. Students in the condition where viewing
chapter outlines might be rewarded viewed the outlines more often than did stu-
dents not rewarded for viewing chapter outlines.
Ridley and Husband (1998) compared GPAs (grade point averages) of students
who completed courses in both traditional and online formats to investigate a
persistent concern about academic integrity in online learning. They argued that
"remote learners connected to the faculty only through computer networks may have
greater opportunity than ever to turn in work that is not their own" (p. 185). They
hypothesized that cheating in the online courses should be detectable by its effects
on grades: Students in the online courses should have higher GPAs than those in
courses taken in traditional, face-to-face classrooms. What they found was just the
opposite. Students' GPAs in courses in the traditional, face-to-face format were
higher than those in online courses. They concluded that the concern about academic
integrity was either exaggerated or unfounded. However, because of confounding
variables uncontrolled in the comparison, the conclusion was not convincing. The
GPAs could be based on different courses that students took or different tests that
instructors used to measure students' learning in the two instructional environments.
Higher scores could also be the result of superior quality of instruction in courses
taught in regular classrooms. The conclusion that the integrity of online learning was
not exceptionally vulnerable seems especially suspect as the prevalence of online
cheating and plagiarism becomes a major concern for faculty and administrators
engaged in online instruction (McAlister, Rivera, & Hallam, 2002; Olt, 2002).
Affective Domain
In addition to learning outcomes in the cognitive domain, researchers were also
interested in learning outcomes in the affective domain, such as students' attitudes,
satisfaction, and perceptions of the online environment. Some researchers used
descriptive research methods to report students' experiences in online courses
(Althaus, 1997; Edwards & Fritz, 1997; Hansen & Gladfelter, 1996; Richards &
Ridley, 1997; Sullivan, 2002). These researchers were interested in students' per-
ceptions of their own learning experience and perceptions of various learning
activities used in online instruction. College students who were participants in the
studies generally showed positive perceptions of learning outcomes and the learn-
ing environment of online courses and wished that the same or similar online
materials and activities were available in other courses.
More often, researchers have conducted correlational research to investigate the relationships among characteristics of learners, features of the online learning environment, and satisfaction of the learners (Bee & Usip, 1998; Gunawardena & Duphorne, 2001; Mortensen & Young, 2000; Swan, Shea, Fredericksen, Pickett, Pelz, & Maher, 2001; Wells, 2000). Learners' prior experiences in computer-
related activities such as e-mail and Internet use, their learning styles, and the qual-
ity of their social interactions in an online environment were variables commonly
investigated. Not surprisingly, people with more prior experience and training in
computer-related activities felt more satisfied and comfortable with their experience
in the online environment.
Correlational research also yielded profiles of users and nonusers of online
instruction (Althaus, 1997; Bee & Usip, 1998; Richards & Ridley, 1997; Roblyer,
1999). Richards and Ridley showed that although discrepancies in prior experi-
ences with technology were evident between the user and nonuser groups, those
differences were not a major factor in people's decision to take online courses.
Most students took online courses because they were the only alternative when
they had constraints on their course schedules. Bee and Usip (1998) found that both
users and nonusers agreed that the Internet was a valuable supplement to class lectures and that e-mail provided a convenient way of communication, but only the
experienced online course users believed that online instruction would improve
their academic performance. Roblyer (1999) found that users of online instruction
valued most the autonomy to determine the pace and timing of the learning process,
whereas nonusers valued more the interaction between students and the instructor
in traditional, face-to-face classrooms.
Learners' Characteristics
the needs of the learners with the content, is essential for student success. Schrum recommended that pedagogical, organizational, and institutional issues be considered when starting to deliver courses online.
the lesson, satisfaction and engagement improve. Numerous studies point to pac-
ing as one of the most important incentives for students in choosing online instruc-
tion (Richards & Ridley, 1997; Roblyer, 1999; Wilson & Whitelock, 1998). Students
like the opportunity to choose both when and where to learn. Nonetheless, this does
not negate the importance of good instructional design, as Wilson and Whitelock
(1998) pointed out in their study. They indicated that instruction needs some dra-
matic tension from week to week in order to sustain a high level of participation.
Further, they suggested that the instructor needs to facilitate student access to
needed technologies, create a sense of engagement, foster the sharing of informa-
tion, and promote individual gratification. Finally, pedagogical, organizational,
and institutional issues must be considered. For example, the identification of stu-
dent roles and a specific conceptualization of the teacher's role are essential in the
redesign of course delivery systems.
the instructor are not only complicated but also subtle. Other studies found that a learner's preferred learning style affects how he or she uses specific online tools.
Studies have documented that online learners and faculty members alike are com-
plicated and diverse. This is not to say that students and faculty are not adaptable
but that there is simply no one-size-fits-all format when it comes to delivery envi-
ronments. Instructional designers must carefully weigh the user characteristics, the
available faculty, the institutional concerns, and the delivery tool in order to create
an effective instructional experience online.
Institutional Policies
In one well-planned endeavor, researchers used survey and interview data to
determine which benchmarks for online courses recommended by several national
organizations in higher education were actually incorporated in six schools recog-
nized as distance education leaders (Phipps & Merisotis, 2000). One of these
benchmarks was establishing institutional policies for online instruction as well as for other forms of distance education delivery. Even though most of the leading institutions that participated did have university policies for online classes, some of
them had not yet established clear policies for support, course development, and
evaluation.
Institutional Support
Training was one type of support requested by faculty members (Feist, 2003;
Rockwell, Schauer, Fritz, & Marx, 1999). In a case study, Feist interviewed
10 instructors who had taught online courses. One finding was that the instruc-
tors wanted training but said that when it was offered many of them did not
take advantage of it. They preferred that it be offered in ways that they could
easily use. For example, they wanted training that could be used right away,
had built-in follow-up, fit into their schedules, matched their learning styles,
focused on curriculum, included leadership and direction from their department
chairs, and included a support person they could all use later. Similarly, in a study
of incentives and obstacles to distance education, Rockwell and her colleagues
(1999) surveyed 139 faculty members and 23 administrators in two universities
in colleges of agriculture (with return rates of 67% and 77%, respectively). They
found that both groups cited lack of training in how to teach online as an obstacle
to participation.
Several studies also showed that faculty members want technical support if they
are to teach online courses (Frith & Kee, 2003; Jennings & Bayless, 2003; Lan,
Tallent-Runnels, Thomas, Fryer, & Cooper, 2003). In a study of communication effects on student outcomes in a nursing course (Frith & Kee, 2003), faculty members said
that they needed technical support and a reliable infrastructure. They believed they
had lost students because of technical problems. Frith and Kee suggested some ways
to avoid technical problems, such as piloting a course site during its development and
beginning the course with a videotaped or face-to-face orientation. It seemed that
students who were expected to have certain skills to take the course enrolled without
those skills. In another study, undergraduate students were asked to evaluate their
instructors, identify any technical problems they had experienced, and explain how
these problems affected their learning (Lan et al., 2003). Students used a Likert
scale to rate the level of technical problems they had experienced in each aspect of
the course, such as in chat rooms, discussions, and online tests. They also rated the
importance of each of these aspects to their learning. The combined score for the
two ratings resulted in an impact score calculated by the researchers. Research
results showed a negative correlation between the impact scores and instructor eval-
uations. The more the students experienced technical problems, the lower they rated
their instructors, demonstrating a need for technical support for the courses. Finally,
in a study comparing students in two sections of the same course, one online and
one face-to-face, students in the face-to-face class (who had clear procedures to help
them solve problems) expressed more satisfaction with the course than those who
were online and had technical problems that they could not solve (Jennings &
Bayless, 2003). In the second case, we learned that it was not the presence of tech-
nology problems that caused poor student perceptions of the course but rather the
lack of help to solve the technology problems.
Faculty members expressed a need for course development assistance and a sys-
tem of evaluation and assessment of online education and faculty. Two studies
(Gibson & Herrera, 1999; Zhang, 1998) focused on instructor experiences in
course preparation, and both discussed the need for time for development of online
courses. Gibson and Herrera (1999) conducted a case study of an online bachelor's degree program, describing its development process.
The second case study (Zhang, 1998) included detailed and pertinent data collec-
tion and evaluation information. Faculty members reported that the preparation of
courses was much more time-consuming than they had expected and that they
needed released time (course time that a faculty member is released from to pur-
sue other professional endeavors) for course development. Researchers in both
studies concluded that faculty members need assistance both during the develop-
ment of the courses and during delivery.
In a third study of online course preparation, Dahl's (2003) survey of 428 faculty members showed that faculty thought they should be paid for the development of a course and agreed that online instruction takes more time than face-to-face instruction, especially in communicating with their students. The survey results, however, offered no empirical data to support these perceptions.
Another study queried students in what the author described as an instrumental
case study from a single university (Vallejo, 2001). This qualitative study used
multiple sources of data and was based on engagement theory. Students in this
study said they wanted administrative support in online courses for grade report-
ing, help with scheduling courses online, online admissions for online students,
appropriate fees for online courses, and tuition payments offered online for the
convenience of the online students and other students (Vallejo, 2001). Clearly the
research on institutional support demonstrates that both students and faculty mem-
bers want technical support. Faculty members also want compensation for course
development and training for course development.
Enrollment Effects
Researchers also examined the impact of online courses on enrollments (Ridley, Bailey, Davies, Hash, & Varner, 1997). Surveys sent to students in one university
resulted in a 61% return rate, yielding responses from 129 online course stu-
dents. In addition, data were gathered on patterns of course taking and the rela-
tionship between commuting distance and credits taken. The results revealed no
relationship between the type of courses taken (online or face-to-face) and dis-
tance of the students' homes from campus. Many distance education students
lived in the same town as the campus of the online course, but some did not.
Online courses generated a net increase of 175 credits from students who had
not enrolled in the university before. Ridley et al. deemed this increased credit
hour generation to be a positive factor in online course enrollment. However,
time flexibility seemed an important consideration. These researchers con-
cluded not only that online courses were attractive for students who lived more
than 50 miles away, but that the scheduling of the courses helped the school to
better serve students whom they had served in the past who lived within 50 miles
of the school.
require additional compensation for the work if they could get help developing and
delivering courses.
Future Research
References
Ahern, T. C., & Durrington, V. (1995). Effects of anonymity and group saliency on
participation and interaction in a computer-mediated small-group discussion. Jour-
nal of Research on Computing in Education, 28, 133-147.
Ahern, T. C., & El-Hindi, A. E. (2000). Improving the instructional congruency of a
computer-mediated small-group discussion: A case study in design and delivery.
Journal of Research on Computing in Education, 32(3), 385-400.
Jones, C. A., & Gower, D. S. (1997). Distance education goes on-line in the 21st cen-
tury. Paper presented at the annual meeting of the Mid-South Educational Research
Association, Memphis, TN.
Kanuka, H., & Anderson, T. (1998). Online social interchange, discord, and knowledge
construction. Journal of Distance Education, 13(1), 57-74.
Katz-Stone, A. (2000). Online learning. Washington Business Journal, 18(38), 35.
Keefe, T. J. (2003). Using technology to enhance a course: The importance of interaction. EDUCAUSE Quarterly, 1, 24-34.
Knupfer, N. N., Gram, T. E., & Larsen, E. Z. (1997). Participant analysis of a multi-
class, multi-state, on-line, discussion list. (ERIC Document Reproduction Service
No. ED 409-845)
Kozma, R. B. (1994). A reply: Media and methods. Educational Technology Research and Development, 42, 11-14.
Lan, W. Y., Tallent-Runnels, M. K., Fryer, W., Thomas, J. A., Cooper, S., & Wang, K. (2003). An examination of the relationship between technology problems and teaching evaluation of online instruction. The Internet and Higher Education, 6, 365-375. Available at http://authors.elsevier.com/sd/article/S1096751603000563
Lesh, S. G., Guffey, J. S., & Rampp, L. C. (2000). Changes in student attitudes regard-
ing a Web-based health profession course. New York: Institute for Education and
Social Policy. (ERIC Document No. ED 441-386)
Levin, J., Levin, S. R., & Waddoups, G. (1999). Multiplicity in learning and teaching:
A framework for developing innovative online education. Journal of Research on
Computing in Education, 32, 256-268.
Linn, R. L., & Gronlund, N. E. (2000). Measurement and assessment in education
(8th ed.). Columbus, OH: Merrill.
Maki, R. H., Maki, W. S., Patterson, M., & Whittaker, P. D. (2000). Evaluation of a
Web-based introductory psychology course: Learning and satisfaction in on-line ver-
sus lecture courses. Behavior Research Methods, Instruments, & Computers, 32,
230-239.
Maki, W. S., & Maki, R. H. (2001). Mastery quizzes on the Web: Results from a Web-based introductory psychology course. Behavior Research Methods, Instruments, & Computers, 33, 212-216.
Mayer, R. E., & Chandler, P. (2001). When learning is just a click away: Does simple user interaction foster deeper understanding of multimedia messages? Journal of Educational Psychology, 93, 390-397.
Mayer, R. E., Heiser, J., & Lonn, S. (2001). Cognitive constraints on multimedia learning: When presenting more material results in less understanding. Journal of Educational Psychology, 93, 187-198.
McAlister, M. K., Rivera, J. C., & Hallam, S. F. (2002). Twelve important questions to answer before you offer a Web based curriculum. Online Journal of Distance Learning Administration, 4(2). Retrieved February 11, 2003, from http://www.westga.edu/~distance/ojdla/summer42/mcalister42.html
McIsaac, M. S., Blocher, J. M., Mahes, V., & Vrasidas, C. (1999). Student and teacher perception of interaction in online computer-mediated communication. Educational Media International, 36, 121-131.
McManus, T. F. (2000). Individualizing instruction in a Web-based hypermedia learning environment: Nonlinearity, advance organizers, and self-regulated learners. Journal of Interactive Learning Research, 11, 219-251.
Mikulecky, L. (1998). Diversity, discussion, and participation: Comparing Web-based and campus-based adolescent literature classes. Journal of Adolescent & Adult Literacy, 42(2), 84-97.
Minstrell, J., & Stimpson, V. (1996). A classroom environment for learning: Guiding students' reconstruction of understanding and reasoning. In L. Schauble & R. Glaser (Eds.), Innovations in learning: New environments for education. Mahwah, NJ: Erlbaum.
Schauble, L. (1990). Belief revision in children: The role of prior knowledge and strategies for generating evidence. Journal of Experimental Child Psychology, 49, 31-57.
Schneider, S. P., & Germann, C. G. (1999). Technical communication on the Web: A
profile of learners and learning environment. Technical Communication Quarterly,
8(1), 37-48.
Schrum, L. (1995). Online courses: What have we learned? Paper presented at the
World Conference on Computers in Education, Birmingham, UK.
Schrum, L., & Hong, S. (2001, April). The potential for electronic educational envi-
ronments: Ensuring student success. Paper presented at the annual meeting of the
American Educational Research Association, Seattle, WA.
Shiratuddin, N. (2001). Internet instructional method: Effects on students' perfor-
mance. Educational Technology & Society, 4(3), 72-76.
Skinner, B. F. (1938). The behavior of organisms. New York: Appleton-Century-Crofts.
Smith, S. B., Smith, S. J., & Boone, R. (2000). Increasing access to teacher prepara-
tion: The effectiveness of traditional instructional methods in an online learning
environment. Journal of Special Education Technology, 15(2), 37-46.
Sonnenwald, D. H., & Li, B. (2003). Scientific collaboratories in higher education:
Exploring learning style preferences and perceptions of technology. British Journal
of Educational Technology, 34(4), 419-431.
Sullivan, P. (2002). "It's easier to be yourself when you are invisible": Female college
students discuss their online classroom experiences. Innovative Higher Education,
27, 129-143.
Swan, K., Shea, P., Fredericksen, E., Pickett, A., Pelz, W., & Maher, G. (2001, April).
Building knowledge building communities: Consistency, contact and communica-
tion in the virtual classroom. Paper presented at the annual meeting of the American
Educational Research Association, Seattle, WA.
Taba, H. (1967). Teacher's handbook for elementary social studies. Reading, MA:
Addison-Wesley.
Taraban, R., Maki, W. S., & Rynearson, K. (1999). Measuring study time distributions:
Implications for designing computer-based courses. Behavior Research Methods,
Instruments, & Computers, 31, 263-269.
Thirunarayanan, M. O., & Perez-Prado, A. (2001/2002). Comparing Web-based and
classroom-based learning: A quantitative study. Journal of Research on Technology
in Education, 34(2), 131-137.
Thomas, M. J. W. (2002). Learning within incoherent structures: The space of online
discussion forums. Journal of Computer Assisted Learning, 18, 351-366.
Trinidad, S., & Pearson, J. (2004). Implementing and evaluating e-learning environ-
ments. Paper presented at Beyond the Comfort Zone: Proceedings of the 21st
ASCILITE Conference, Perth, Australia.
Tyler, R. W. (1950). Basic principles of curriculum and instruction. Chicago: Univer-
sity of Chicago Press.
Vallejo, I. N. (2001). Quality student support services for distance education learners:
Factors contributing to student satisfaction. Paper presented at the annual meeting
of the American Educational Research Association, Seattle, WA.
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.
Waits, T., & Lewis, L. (2003). Distance education at degree-granting postsecondary
institutions, 2000-2001. Retrieved November 19, 2004, from http://nces.ed.gov/
surveys/peqis/publications/2003017/
Warschauer, M. (1998). Online learning in sociocultural context. Anthropology & Edu-
cation Quarterly, 29(1), 68-88.
Weinstein, R. S., & Mignano, A. J. (1997). Elementary classroom management:
Lessons from research and practice (2nd ed.). New York: McGraw Hill.
IN
APPENDIX
Summary of reviewed studies
This content downloaded from 204.187.23.196 on Mon, 12 Sep 2016 09:28:15 UTC
All use subject to http://about.jstor.org/terms
Brown, B. W., & Quantitative Compare learning outcomes 363 traditional co
Liedholm, C. E. (2002) and characteristics of stu- dents, 258 hybr
dents in three different students, 89
modes of instruction on students
the principles of micro-
economics
Brown, S. W., & Quantitative Compare online delivery and 121 participants (26
Kulikowich, J. M. (2004) standard lecture courses 73.6% female) e
graduat
courses
Caywood, K., & Quantitative Compare quizzes, final exam, 75 on-campus adult
Duckett, J. (2003) and rating of teaching 76 online adult stud
practice between students
taking a behavioral man-
agement course online or
on campus
Christel, M. G. (1994) Quantitative Compare multiple versions of 72 college stu
interactive digital video in digita
courses
This content downloaded from 204.187.23.196 on Mon, 12 Sep 2016 09:28:15 UTC
All use subject to http://about.jstor.org/terms
APPENDIX (Continued)
oo
This content downloaded from 204.187.23.196 on Mon, 12 Sep 2016 09:28:15 UTC
All use subject to http://about.jstor.org/terms
Gilliver, R. S., Randall, B., & Pok, Y. M. (1998) | Quantitative (quasi-experimental) | Compare students' learning with or without online supplementary readings or help sessions | 111 college students
Graff, M. (2003) | Quantitative | Determine effects of segmentation of information and provision of a website overview, coupled with students' cognitive styles, on recall | 50 psychology students
Greene, B. A., & Land, S. M. (2000) | Qualitative | Review ways that students used online scaffolding (instructional support) | 18 college students
Gunawardena, C. N., & Duphorne, P. L. (2001) | Qualitative | Examine adults' experience of learning in an online course | 50 university students
Hansen, N. E., & Gladfelter, J. (1996) | Mixed-method | Describe and evaluate 9 years of experimentation with online courses | 47 graduate students
Jennings, S. E., & Bayless, M. L. (2003) | Quantitative | Determine whether there are differences in GPA, ages of students, and student success between students in a traditional class and in an online class | Upper-level undergraduates in a traditional business course and in an online business course
Jones, C. A., & Gower, D. S. (1997) | Quantitative | Survey institutions on using distance education | All Tennessee 2-year institutions
Maki, W. S., & Maki, R. H. (2001) | Quantitative | Examine the relationship between online course activities and learning | 311 college students
Mayer, R. E., & Chandler, P. (2001) | Quantitative | Determine whether control over the pace of instruction resulted in better transfer and retention of material | 59 college students
Mayer, R. E., Heiser, J., & Lonn, S. (2001) | Quantitative | Investigate effects of redundancy and coherence of information on students' learning | College students
McIsaac, M. S., Blocher, J. M., Mahes, B., & Vrasidas, C. (1999) | Mixed-method | Examine the role of interaction in online course environments | Undergraduate students
McManus, T. F. (2000) | Quantitative | Examine effects of linearity and self-regulation on learning | 119 college students
Mikulecky, L. (1998) | Qualitative | Compare discussion formats in an online and a face-to-face class | 29 students, a mixture of baccalaureate and graduate students
Parker, D., & Gemino, A. (2001) | Quantitative | Compare teaching effectiveness measured by performance on a final comprehensive test and two subscores on conceptual knowledge (multiple-choice items) and technique application (business cases), using cumulative data throughout five consecutive semesters (two semesters online and three semesters face-to-face) | 128 online and face-to-face students
Pearson, J., & Trinidad, S. (2004) | Mixed-method | Compare online learning environment to student preferences | 10 secondary business teachers in Hong Kong
Peterson, C. L., & Bond, N. (2004) | Mixed-method | Compare student learning between online and face-to-face program delivery | 38 online and 49 face-to-face students in a teacher preparation program
Phipps, R., & Merisotis, J. (2000) | Qualitative | Review benchmarks for quality online courses | 16 faculty and administrators
Roblyer, M. D. (1999) | Mixed-method | Provide baseline data on whether attitude factors and personal characteristics exist that can predict students' choice of course delivery systems | 27 high school students, 33 community college students
Rockwell, S. K., Schauer, J., Fritz, S. M., & Marx, D. B. (1999) | Quantitative | Survey faculty's incentives and obstacles for delivering online courses | 237 faculty
Schneider, S. P., & Germann, C. G. (1999) | Qualitative | Describe the learning environments provided by interactive technology | 441 college students
Schrum, L. (1995) | Qualitative | Identify significant issues in the development and presentation of online courses | One graduate course
Schrum, L., & Hong, S. (2001) | Mixed-method | Identify and characterize successful online learners | ...
Shiratuddin, N. (2001) | Quantitative | Examine effect of Internet on students' performance | 169 college students
Smith, S. B., Smith, S. J., & Boone, R. (2000) | Quantitative | Examine effects of instructional mode and instructional method on students' learning | 58 college students
Sonnenwald, D., & Li, B. (2003) | Quantitative | Explore learning style preferences and perceptions of technology | 40 upper-level undergraduates
Sullivan, P. (2002) | Qualitative | Discover whether online courses offer a more female-friendly classroom | 21 female students
Swan, K., Shea, P., et al. (2001) | Quantitative | Investigate course design related to students' learning and satisfaction | ...
Taraban, R., Maki, W. S., & Rynearson, K. (1999) | Quantitative | Compare patterns of students' studying time in online and traditional classrooms | 99 college students
Thirunarayanan, M. O. (2002) | Quantitative | Compare learning of core concepts and ideas in a Teaching English as a Second Language course between online and face-to-face instruction | ESL students
Wilson, T., & Whitelock, D. (1998) | Mixed-method | Assess distance learners' perceptions of collaboration and group work in an online environment | 106 undergraduate distance learners
Winograd, D. (2000) | Qualitative | Explore how a trained moderator affects students in computer conferences in an online course | 30 undergraduates
Young, S. S. (2004) | Mixed-method | Investigate the teacher role in School for All (free courses taught by volunteer teachers for no credit) | Those who completed courses, 2000-2002
Zemsky, R., & Massy, W. F. (2004) | Qualitative | Chart how the market for e-learning was changing over time and what it would be like in the future | Institutions and 6 corporations, with 15 faculty and administrators from each
Zhang, P. (1998) | Qualitative | Describe actual use of distance education technologies during design and delivery of a graduate distance course | 1 instructor, 1 teaching assistant, 15 students