Distance Education

ISSN: 0158-7919 (Print) 1475-0198 (Online) Journal homepage: https://www.tandfonline.com/loi/cdie20

Online learning performance and satisfaction: do perceptions and readiness matter?

Huei-Chuan Wei & Chien Chou

To cite this article: Huei-Chuan Wei & Chien Chou (2020) Online learning performance and
satisfaction: do perceptions and readiness matter?, Distance Education, 41:1, 48-69, DOI:
10.1080/01587919.2020.1724768

To link to this article: https://doi.org/10.1080/01587919.2020.1724768

Published online: 16 Feb 2020.


Online learning performance and satisfaction: do perceptions and readiness matter?
Huei-Chuan Wei and Chien Chou
Institute of Education, National Chiao Tung University, Hsinchu, Taiwan, Republic of China

ABSTRACT
The current study proposes a comprehensive structural model to determine whether online
learning perceptions and online learning readiness affect students' online learning
performance and course satisfaction. A questionnaire was voluntarily completed by 356
undergraduate students enrolled in a cross-campus, general education, asynchronous online
course in Taiwan. The structural equation modeling analyses indicated that students'
computer/Internet self-efficacy and motivation for learning exerted a direct, positive
effect on their online discussion score and course satisfaction. Furthermore, it was found
that students' computer/Internet self-efficacy for online learning readiness had a mediated
effect not only on online learning perceptions and online discussion score but also on
online learning perceptions and course satisfaction. The findings of this research are
helpful for both academics and practitioners of online learning to design online courses
that particularly emphasize computer/Internet self-efficacy.

ARTICLE HISTORY
Received 26 October 2018
Accepted 30 January 2020

KEYWORDS
online learning perceptions; student performance; learning readiness; course satisfaction

Introduction
Owing to the rapid increase of Internet use, online learning has become one of the fastest
growing trends in educational uses of technologies (Bates, 2019; Cigdem & Yildirim, 2014). Not
only have institutions of higher education increasingly offered online courses, the number of
students enrolled in online courses is also rapidly rising. Online learning has also become an
emerging field of research. Many studies have aimed to explore the factors underlying the
success or failure of online learning (e.g., Bolliger & Halupa, 2018; Shelton et al., 2017; Yang
et al., 2017) or to investigate the critical factors influencing learner satisfaction in an online
learning environment (e.g., Dziuban et al., 2015; Liaw & Huang, 2013; Weidlich & Bastiaens,
2018). Previous studies (e.g., Horzum et al., 2015; Kuo, 2014) have determined factors deemed
to be influential on student performance or course satisfaction in online learning environ-
ments; however, little research has been performed to explore the relationships among
various factors (e.g., students’ general online learning perceptions, online learning readiness,
performance, and course satisfaction). Furthermore, researchers have usually used the final
grade or grade point average, indicating the individual’s overall learning performance, as
a dependent variable in their studies (e.g., Bernard et al., 2004; Hao, 2016; Lu et al., 2003).

CONTACT Huei-Chuan Wei juanwei.ie97g@g2.nctu.edu.tw


© 2020 Open and Distance Learning Association of Australia, Inc.

Research has noted, however, that an online course could include various learning
activities, such as initiating discussions in an online discussion forum, completing
a personal assignment or group project, or taking exams in a physical or virtual
classroom. Hence, some studies have defined students' learning performance as the
combination of different scores, for example, the actual number of student postings to
the online discussion board, examination scores, and/or assignment scores (e.g., Picciano,
2002; Wei & Chou, 2019; Wei et al., 2015). The online course in the current study consisted
of one course orientation, two paper-and-pencil examinations, weekly online discussion,
and a group project. Therefore, three scores represent students’ learning achievement:
the online discussion score, the exam score, and the group project score. In accordance
with the course design, the present study used all three scores as dependent variables
instead of using only the final score. The current study used structural equation modeling
analysis to investigate how students’ online learning perceptions and their readiness
contribute to their learning performance and course satisfaction. The findings of the
analysis should better inform educational researchers and practitioners in how to guide
students with different perceptions and levels of readiness to achieve a better online
learning experience.
Furthermore, this study has a twofold significance. Firstly, it corresponds with a series of
grand programs proposed by the Taiwan Ministry of Education in the past 10 years. These
programs have continuously improved the overall quality of universities and promoted
a multidimensional development of higher education (Hou, 2017). One of the overarching
goals is to cultivate students to be independent and self-directed learners. Probing online
learning readiness may be a key strategy to help students to enhance their active-learning
effectiveness and learn independently. Secondly, the online course in this study was an
18-week general-education undergraduate course with two credit hours, which made this
course different from massive open online courses (MOOCs) or any other informal online
courses. The course structure included careful organization and sequencing of learning
content and a variety of learning activities, assignments, and multiple forms of evaluation.
Understanding the relationships among students’ perception of online learning, their
online learning readiness, their online learning performance, and their course satisfaction
could help instructional designers and instructors to design quality online courses.

Literature review
Online learning perceptions
The concept of online learning perceptions derives from attitudes toward computers and
the Internet. Research (e.g., Alzahrani & O’Toole, 2017; Joyce & Kirakowski, 2015; Wei &
Chou, 2019) has shown that students’ attitudes toward computers are important to their
future use of such technology in instructional settings. Likewise, students’ attitudes
toward the Web and web-based instruction can influence their future use of instructional
materials provided on the Web and could ultimately affect how educationally beneficial
web-based resources are to students in an online learning environment.
Researchers from various disciplines are interested in enhancing students’ online
learning experience, and one issue of concern is that of learners’ perceptions of online
learning (e.g., Al-Samarraie et al., 2018; Sun et al., 2008). For example, Liaw et al. (2007)
reported that college students’ perceptions of the online learning environment included
four dimensions: e-learning as a self-paced learning environment, e-learning as an effec-
tive learning environment, e-learning as a multimedia instruction environment, and
e-learning as an instructor-led learning environment. Similarly, Wei and Chou (2019) not
only focused on the features of online learning (e.g., flexibility, adaptability, convenience,
interaction) and viewed these features as factors of online learning perceptions but also
applied the self-determination theory (SDT; Deci & Ryan, 1985) as a theoretical framework
to understand college learners’ perceptions of and personal motivational factors for
online learning. Hence, Wei and Chou (2019) developed 21 statements categorized into
five subdimensions to fit the framework of the SDT. Two subdimensions (learning needs,
personal learning preference) were grouped into the dimension of perceived competence,
two subdimensions (personal learning enjoyment, flexibility) were grouped into the
dimension of perceived autonomy, and one subdimension (interaction) fit the dimension
of perceived relatedness from the SDT framework.
According to the above literature, several factors can be viewed as considerations for
learners’ perceptions of online learning. For example, online learning provides learners
a more flexible and convenient learning environment to conduct self-paced and custo-
mized learning. Furthermore, online learning improves the quality and quantity of inter-
action among students, instructors, and peers through synchronous and asynchronous
communicational technologies. In sum, the features or advantages that learners perceive
in online learning environments may include flexibility, synchronous and asynchronous
interaction with peers and instructors, lack of time and place restrictions, and easy access
to diverse online content and professional knowledge. Therefore, the concept of online
learning perceptions in this study refers to learners’ recognition of the abovementioned
features or benefits of online learning. It is proposed that the more positive learners’
online learning perceptions are, the more they will perceive support and benefits from
their online learning process. However, a review of the studies on online learning
perceptions shows that the concept of perceptions toward online learning still needed to
be confirmed and that the dimensions underlying those perceptions needed to be identified.
Therefore, this study set out to develop a more appropriate framework of online learning
perceptions and a suitable instrument to measure learners' online learning perceptions.
Are students’ online learning perceptions related to their learning performance?
Research (e.g., De Paepe et al., 2018; Ke & Kwak, 2013) has studied what key components
might influence student performance. For example, studies by Bertea (2009) and Morris
(2011) have indicated that college students’ perceptions of and experiences with the
Internet can critically influence their online learning performance. Duggan et al. (2001)
explored the relationship between college students’ perceptions of the educational use of
the Internet and their Internet behaviors. The results showed that if students’ perceptions
of the educational use of the Internet were more positive, they displayed a higher
tendency to choose a class that required Internet use or one that made use of it more
frequently for educational purposes. Yang and Lin (2010) proposed that in an online
writing exchange program, learners’ Internet perceptions were closely correlated with
their online learning participation, indicating that students with more positive Internet
perceptions were more likely to post their writings or respond to others’ writings on the
online forum. In addition, Paechter and Maier (2010) investigated the aspects of online
courses that students experience as being favorable for learning. Data were collected
from 2,196 students in 29 Austrian universities. The findings suggest that students
appreciated online learning for its potential to provide well-structured learning materials
that support self-regulated learning. Another result was that students appreciated online
components for the opportunities they offered for exercising and applying one’s knowl-
edge and for applying metacognitive self-regulation strategies such as monitoring one’s
own learning progress. Similarly, Wei and Chou (2019) investigated the relationships
among learners’ online learning perceptions, learning behaviors, and learning achieve-
ments in an online course. The results showed that the flexibility dimension significantly
affected the number of times learners viewed the learning materials, and the personal
learning preference dimension directly affected learners' online discussion score.
Interestingly, that study found that learners' online learning perceptions did not have
a statistically significant influence on exam scores. The number of times learners viewed
the learning materials significantly mediated the effect of the flexibility dimension on
learners' online discussion score and exam score.
Past research, such as the abovementioned studies, has usually focussed on exploring
learners’ perceptions of online learning and has found some type of relationship between
perceptions and performance. However, a few studies have indicated that learners’
perceptions of online learning could influence learners’ course satisfaction. For instance,
Sahin and Shelley (2008) proposed that if students have the skills to use online tools and
perceive that online learning is useful and has flexibility in interpersonal communication
and information sharing, their course satisfaction will be promoted. In this
study, therefore, the dimensions of learners' online learning perceptions needed to be
rechecked and confirmed, and how those perceptions influence learners' performance
and course satisfaction needed to be explored. Accordingly, this study aimed
to identify the concept of online learning perceptions and to explore the relationships
among online learning perceptions, online learning performance, and course satisfaction.

Online learning readiness


The notion of online learning readiness was proposed by Warner et al. (1998). To concretize
readiness concepts, researchers have developed instruments for measuring online learning
readiness. For example, McVay (2000) developed a 13-item instrument to measure students’
readiness for online learning. Later, seeking to validate McVay's questionnaire, Smith
et al. (2003) suggested that readiness for online learning might be composed of two factors:
the student’s comfort with the learning resources available in a learning sequence and the
degree of self-direction. Bernard et al. (2004) developed a 38-item questionnaire (including
McVay’s original 13 items) to tap into four dimensions of readiness: beliefs about distance
education, confidence about basic prerequisite skills, self-direction and initiative, and desire
for interaction with the instructor and other students.
However, to assess readiness for online learning, other facets, such as technical efficacy
in computer use, self-control efficacy, and Internet navigation skills, must also be
addressed. For example, Dray et al. (2011) developed an instrument for students to self-
assess readiness for online learning. This instrument measured four dimensions: basic
technological skills, such as the ability to use specific applications in specific ways (e.g.,
email, Internet, spreadsheet, and documents); access to technology, including owning
technology and having Internet connectivity; the usage of technology, such as nature and
frequency of use; and the relationship with information and communications technology,
such as beliefs, values, confidence, and comfort with technology. Yu and Richardson
(2015), in particular, viewed social integration as another important factor in online
learning readiness and further proposed the student online learning readiness conceptual
model with four factors: social competencies with classmates, social competencies with
instructor, communication competencies, and technical competencies.
In addition to identifying and developing instruments to measure student readiness,
some researchers have investigated factors that affect online learning outcomes. For
example, Hung et al. (2010) developed a multidimensional instrument to assess college
students’ readiness for online learning with five dimensions: self-directed learning, motiva-
tion for learning, computer/Internet self-efficacy, learner control, and online communication
self-efficacy. Their results revealed that online student levels of readiness were high in
computer/Internet self-efficacy, motivation for learning, and online communication self-
efficacy but low in learner control and self-directed learning. Hung (2012) further examined
the relationship between online learners’ readiness and learning performance and found
that student readiness was not a powerful construct in explaining learning performance.
Keramati et al. (2011) proposed a conceptual model to examine the role of readiness in the
relationship between online learning factors and online learning outcomes. The results
showed that readiness factors act as moderating variables in the relationship between
online learning factors and online learning outcomes. Additionally, organizational readiness
factors have a significant effect on students’ online learning outcomes.
In addition, online learning readiness seems to be an important factor affecting
students’ learning performance in online courses. However, researchers’ efforts to explore
the relationship between students’ online learning perceptions and their online learning
readiness remain limited. Moreover, research has shown that readiness for online learning
can be a multifaceted concept (e.g., McVay, 2000; Hung, 2016; Hung et al., 2010; Keramati
et al., 2011) that includes factors such as computer-use skill efficacy, self-control efficacy,
and online communication self-efficacy.
To fulfill the purpose of this study and to gain a comprehensive understanding of
students’ readiness for online learning, this study adopted the framework of online
learning readiness proposed by Hung et al. (2010), which consists of self-directed learning,
motivation for learning, computer/Internet self-efficacy, learner control, and online com-
munication self-efficacy.

The relationships among online learning perceptions, online learning readiness,
and online course satisfaction
Student course satisfaction is another important measure in online courses. What is online
course satisfaction? In fact, online course satisfaction is a multidimensional construct that
includes various factors (e.g., course structure, educational activities, curriculum, instruc-
tor knowledge and facilitation, instructor presence, instructor feedback, instructional
style) (Eichelberger & Ngo, 2018; Li et al., 2016). Since the definition of online course
satisfaction is complex and multidimensional, this study explored learners’ online course
satisfaction including instructional style, learning contents and course structures, instruc-
tors and teaching assistants, discussion forum, the examinations, and the overall course.
Researchers have explored what specific factors influence student online course
satisfaction. For example, Chow and Shi (2014) found that university students’ perceptions
of e-learning, including perceived flexibility and motivation, have the most significant
impact on satisfaction with e-learning. In addition, Paechter et al. (2010) noted that
student–instructor online interaction is an important factor influencing students’ satisfac-
tion in online learning environments. Similarly, McFarland and Hamilton (2005) found that
most students are satisfied with the discussion board feature in online learning environ-
ments. In other words, students view online discussion as an important learning activity in
an online course. This viewpoint is consistent with the findings proposed by Lee et al.
(2011), who reported that students’ perceptions of teacher and peer interaction (e.g.,
providing learning assistance or answering questions) were significantly related to stu-
dents’ overall satisfaction with the online course. The consistent message from the above
studies is that students’ perceptions of online learning could influence their online course
satisfaction.
Additionally, Stokes (2003) found that student online course satisfaction was influ-
enced by the students’ degree of comfort with using the Internet. Sahin and Shelley
(2008) reported that students were more satisfied with online courses when they had the
skills to use online tools and perceived online learning as a useful and flexible way of
learning, communicating, and sharing. Similarly, Kuo et al. (2013) indicated that Internet
self-efficacy was a good predictor of student satisfaction in an online course. It is worth
noting that, in the study by Kuo et al. (2013), self-regulated learning did not contribute to
students’ course satisfaction. In other words, this finding is contrary to previous research
that found self-regulation to be a critical variable to student satisfaction.
Although the studies above mentioned that Internet self-efficacy, communication, and
interaction could influence students’ satisfaction, few studies have investigated the effect
of overall learning readiness on students’ satisfaction in online learning environments.
Yilmaz’s (2017) study proposed that students’ e-learning readiness is a significant pre-
dictor of their course satisfaction and motivation in a flipped classroom model of instruc-
tion. Hence, more research may be needed to verify the effect of overall learning readiness
on students’ course satisfaction in online learning environments.
In sum, students’ online course satisfaction is a multidimensional construct and is
influenced by many self-perceived factors involved when students consider individual
online courses, such as computer and Internet self-efficacy, interaction with instructors
or peers, overall perceptions of online courses, and overall learning readiness for
online learning. The studies described above warrant further study into how student
course satisfaction is related to students’ online learning perceptions and online
learning readiness.

Research hypotheses
Based on the foregoing literature review, the purpose of this study was to use structural
equation modeling to investigate the relationships among online learning perceptions,
online course performance, and course satisfaction by testing the mediating effect of five
online learning readiness factors (Hung et al., 2010). A representation of the model tested
in this study is illustrated in Figure 1. Furthermore, this hypothetical model, derived from
the literature review, was used to test the following hypotheses:

Figure 1. The hypothesized model.

● H1. College students’ online learning perceptions significantly and positively affect
online learning readiness.
● H2. College students’ online learning perceptions significantly and positively affect
online learning performance, which includes online discussion score, midterm exam/
final exam score (exam score), and group project score.
● H3. College students’ online learning perceptions significantly and positively affect
online course satisfaction.
● H4. College students’ online learning readiness significantly and positively affects
online learning performance, which includes their online discussion score, midterm
exam/final exam score (exam score), and group project score.
● H5. College students’ online learning readiness significantly and positively affects
online course satisfaction.
● H6. College students' online learning readiness mediates the relationship between
online learning perceptions and online learning performance.
● H7. College students’ online learning readiness mediates the relationship between
online learning perceptions and online course satisfaction.

Methodology
Participants
This study consisted of a sample of 356 undergraduate students from three universities in
Taiwan enrolled in a cross-campus online course. Of the 356 participants, 234 (65.7%)
were male, and 122 (34.3%) were female. Thirty-seven (10.4%) were freshmen, 166 (46.6%)
were sophomores, 125 (35.1%) were juniors, and 28 (7.9%) were seniors.
Regarding participants’ previous experience with online courses, 177 (49.7%) partici-
pants had enrolled in other online courses before this online course, and 176 (49.4%)
participants had never enrolled in online courses (missing values for three students).
The study was approved by the Research Ethics Committee for Human Subject
Protection of National Chiao Tung University before the research instrument was distrib-
uted to the targeted samples (Application No. NCTU-REC-107-031).

Educational setting and design


All participants in the current study were enrolled in a general-education undergraduate
course named “Internet Literacy and Ethics” in a university in Taiwan. This course was
delivered via a self-developed course management system called the e3-system. Students
were informed of the course format before they took this cross-campus course. The course
consisted of 13 lessons along with the orientation, a midterm exam, a group project,
weekly online discussion, and a final exam in an 18-week semester. Each lesson provided
hypermedia presentations, including streaming video and audio linked to assigned texts
and supplementary materials. Featuring digital learning materials such as videos and
slides, this course required students to post questions and comments on discussion
forums every week throughout the whole semester and to complete a group project.
In the first week of the course, students and instructors had a face-to-face class meet-
ing. During this opening meeting, the instructors supplied basic introductory course
information and their contact information. After the face-to-face meeting, all students
were required to post a self-introduction to the online discussion forum so that they could
get to know one another. They were also required to read the materials every week and to
post their thoughts on the content covering that weekly material in online discussion. The
instructors emphasized peer interaction, thus requiring students to share their own
knowledge, skills, and experience and to help each other by posting questions and
responding to fellow students’ questions and postings. The instructors responded to
the students and provided comments as well. Prior to the beginning of the course, the
instructors needed to design the course, including topic selection, course requirements,
and learning activities. Two instructors were responsible for designing discussion topics,
responding to individual postings, and providing general comments in the online discus-
sion forums. Moreover, in order to make sure students understood the materials, students
had to take two paper-and-pencil exams—one at midsemester (in Week 9) and the other
at the end of semester (in Week 17) in a designated classroom.

Instruments
Online learning perception scale (OLPS)
Because there was neither an existing theoretical foundation nor a related scale that the
present study could use, we developed statements based on the reviewed literature. The
initial OLPS comprised 37 statements with no reverse-scored items. All items were
reviewed by two scholars with backgrounds in online learning to ensure that students
could understand the meaning of the items. In addition, three undergraduate students
voluntarily helped check the items to make sure that college students could understand
the item expressions well. This 37-item scale was constructed with a 5-point Likert-type
response format consisting of values ranging from strongly disagree (1) to strongly agree
(5). In the OLPS, higher total scale scores indicated more positive online learning percep-
tions, while lower total scale scores indicated less positive perceptions. To re-categorize
these items into distinct factors (dimensions) and to ensure the OLPS construct validity,
we conducted a series of exploratory factor analyses (EFA). The principal axis factoring
method was used with a promax (oblique) rotation. Factors with eigenvalues greater than
1 were retained, and an item was retained when its loading was greater than 0.40. Items
that loaded on two or more factors, with a difference between loadings lower than 0.4, were
deleted. The Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy was 0.88, which
suggests that the items were suitable for factor analysis (Kaiser & Rice, 1974).
The EFA results revealed a five-factor structure: accessibility, interactivity, adapt-
ability, knowledge acquisition, and ease of loading. Hence, the initial 37 items were
reduced to 23 items. There were 4, 6, 4, 5, and 4 items in these five factors,
respectively, and together they accounted for 53.22% of variance. Table 1 shows
the factor loadings of the 23 items relative to the five factors on the OLPS.
Specifically, the dimension of accessibility referred to students’ perceptions of unlim-
ited and free access to online learning materials or resources. The dimension of
interactivity concerned students’ perceptions of interaction with instructors or peers,
including asking questions, sharing different opinions, and discussing learning topics
or issues. The dimension of adaptability aimed to investigate students’ perceptions of
their own control over their learning (e.g., deciding where and when to learn,
controlling their learning, which manifested itself as repeating or skipping some
learning content). The dimension of knowledge acquisition referred to students’
perceptions of the ability to acquire the knowledge that they seek to expand their
horizons. The ease of loading dimension referred to students’ perceptions of lower
learning stress or burden in online learning environments.
The intercorrelation of items was determined using the reliability coefficient of
Cronbach’s alpha to estimate the internal consistency of OLPS items. The reliability of
the 23-item scale was 0.90. As shown in Table 1, the reliability of each factor was as
follows: accessibility, 0.88; interactivity, 0.79; adaptability, 0.86; knowledge acquisition,
0.80; and ease of loading, 0.80. In short, the EFA and Cronbach’s alpha analysis show that
the constructed OLPS is a scale with more than decent validity and reliable internal
consistency.
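
The item-retention procedure just described (principal axis factoring with a promax
rotation, a .40 loading threshold, the cross-loading rule, KMO, and Cronbach's alpha) can
be reproduced with standard tools. Below is a minimal sketch using Python's
factor_analyzer package; the input file and variable names are hypothetical stand-ins,
since the raw OLPS responses are not published:

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer, calculate_kmo

# Hypothetical input: 356 respondents x 37 Likert items (raw data not published)
olps_items = pd.read_csv("olps_responses.csv")

# Sampling adequacy (the paper reports KMO = 0.88)
_, kmo_total = calculate_kmo(olps_items)
print(f"KMO = {kmo_total:.2f}")

# Principal axis factoring with an oblique (promax) rotation, 5 factors retained
fa = FactorAnalyzer(n_factors=5, method="principal", rotation="promax")
fa.fit(olps_items)
loadings = pd.DataFrame(fa.loadings_, index=olps_items.columns)

# Retention rules from the text: primary loading > .40, and the gap between the
# two largest absolute loadings must be at least .40 (otherwise delete the item)
abs_load = loadings.abs()
primary = abs_load.max(axis=1)
second = abs_load.apply(lambda row: row.nlargest(2).iloc[-1], axis=1)
retained = loadings[(primary > 0.40) & (primary - second >= 0.40)]

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    return k / (k - 1) * (1 - items.var(ddof=1).sum() / items.sum(axis=1).var(ddof=1))

print(f"alpha over retained items = {cronbach_alpha(olps_items[retained.index]):.2f}")
```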

Online learning readiness scale (OLRS)


The OLRS, developed by Hung et al. (2010), was employed (with their permission)
to measure college students’ readiness for online learning in the current study. This
5-point Likert scale consisted of 18 items across five factors: computer/Internet
self-efficacy, self-directed learning, learner control, motivation for learning, and
online communication self-efficacy. Confirmatory factor analysis (CFA) was performed on
the 18 items; the initial estimation was not fully satisfactory but was acceptable. The
ratio of chi-square to degrees of freedom (χ2/df) can serve as an adjunct
discrepancy-based fit index, with a value of less than 5 indicating an acceptable fit
between a hypothesized model and sample data (Schumacker & Lomax, 2004). In the present
study, the measurement model yielded χ2/df = 3.37, indicating that the proposed model may
have an acceptable fit. The other indices for model-fit evaluation were the root-mean-square
error of approximation (RMSEA) = .082, the comparative fit index (CFI) = .894, and the
standardized root-mean-square residual (SRMR) = .065.

Table 1. Factor loadings of the 23 items of the five factors on the OLPS (n = 356).
No. Item (loadings on Factor 1, Factor 2, Factor 3, Factor 4, Factor 5)
1 Online learning provides various multimedia learning resources. .909 -.060 -.029 .033 .018
2 Online learning provides various online resources. .897 -.019 -.079 .014 -.040
3 Online learning enables me to retrieve and obtain more learning resources. .817 .011 .013 .011 .052
4 Online learning enables me to share and exchange resources. .429 .142 .254 -.006 -.027
5 Online learning enables me to interact directly with other learners. .019 .764 -.038 -.045 -.024
6 Online learning can encourage interaction between instructors and students. -.086 .692 -.039 .002 .053
7 Online learning can shorten the distance between instructors and students. -.212 .647 -.045 .112 .045
8 Online learning enables me to meet more classmates or peers with the same interests or habits. .109 .587 .057 -.057 -.057
9 Online learning provides sufficient discussion opportunities. .149 .553 -.026 .017 -.020
10 Online learning provides convenient tools to communicate with other learners. .174 .431 .127 -.006 -.007
11 Online learning enables me to decide on the best time to learn. -.099 -.024 .968 -.030 .015
12 Online learning enables me to decide on the best location to learn. -.025 -.031 .926 -.015 -.035
13 Online learning enables me to repeatedly review learning materials. .063 -.002 .615 .121 .012
14 Online learning overcomes time and place constraints. .169 .027 .479 -.015 .138
15 Online learning can broaden my common knowledge base. .004 -.055 .085 .775 -.109
16 Online learning enables me to learn more about the knowledge that I desire to learn. .144 -.080 -.058 .714 .004
17 Online learning can expand my academic knowledge capacity. -.052 .055 .110 .667 -.114
18 Online learning is an effective learning style. -.019 .071 -.062 .570 .162
19 Online learning enables an abstract idea or concept to be presented in a concrete manner. .008 .081 -.096 .536 .113
20 Online learning environments lead to less pressure to catch up with a course schedule. .057 -.079 -.010 -.065 .817
21 Online learning environments are less stressful. -.022 -.076 .128 .090 .712
22 Online learning environments place less pressure on exams and assessments. .012 .058 -.063 -.077 .702
23 Online learning environments can effectively reduce learning burden. -.069 .123 .048 .074 .545
Factor correlations
Factor 1: Accessibility (α = 0.88) –
Factor 2: Interactivity (α = 0.79) .404 –
Factor 3: Adaptability (α = 0.86) .594 .291 –
Factor 4: Knowledge acquisition (α = 0.80) .568 .483 .573 –
Factor 5: Ease of loading (α = 0.80) .389 .309 .572 .538 –
Notes. Total items = 23; KMO = .88; Cronbach’s alpha = .90; Total variance explained = 53.22%

Each item had a standardized loading between 0.502 and 0.856 on the five factors, and each
loading was statistically significant. In addition, as shown in Table 2, the composite
reliabilities (CR) of five constructs ranged from 0.697 to 0.849, indicating adequate internal
consistency of the measurement model (Bagozzi & Yi, 1988). The average variance
extracted (AVE), ranging from 0.443 to 0.653, indicated that the scales have good con-
vergent validity (Bagozzi & Yi, 1988; Hair et al., 2010). Appendix A contains the ques-
tionnaire items.

Table 2. Reliability, AVE, and CR of CFA of OLRS.
Measures Items CR AVE
Computer/Internet self-efficacy (CIS) 3 0.747 0.497
Self-directed learning (SDL) 5 0.823 0.485
Learner control (LC) 3 0.697 0.443
Motivation for learning (ML) 4 0.794 0.494
Online communication self-efficacy (OCS) 3 0.849 0.653
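
The CR and AVE values in Table 2 follow the standard formulas computed from standardized
CFA loadings (Bagozzi & Yi, 1988): CR = (sum of loadings)^2 / ((sum of loadings)^2 +
sum(1 - loading^2)), and AVE is the mean of the squared loadings. A small sketch of both
formulas; the three item loadings below are hypothetical (chosen only to land near the
reported CIS row), since per-item loadings are not listed in this excerpt:

```python
import numpy as np

def composite_reliability(loadings: np.ndarray) -> float:
    """CR = (sum lambda)^2 / ((sum lambda)^2 + sum(1 - lambda^2))."""
    s = loadings.sum()
    return s**2 / (s**2 + (1 - loadings**2).sum())

def average_variance_extracted(loadings: np.ndarray) -> float:
    """AVE = mean of the squared standardized loadings."""
    return (loadings**2).mean()

# Hypothetical standardized loadings for the 3 CIS items (illustration only; the
# text reports only that all loadings fell between .502 and .856)
cis = np.array([0.75, 0.68, 0.68])
print(f"CR = {composite_reliability(cis):.3f}")        # ~0.746, near Table 2's 0.747
print(f"AVE = {average_variance_extracted(cis):.3f}")  # ~0.496, near Table 2's 0.497
```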

Online course satisfaction scale (OCSS)


Although a number of studies have explored students’ course satisfaction in online
learning environments and have developed their own course satisfaction surveys,
there was no existing theoretical foundation or related scale that the current study
could use. To collect students’ satisfaction data for online courses, we administered
a standardized web-based OCSS to the students. The relationships among OLPS,
OLRS, OCSS, and student online learning performance were then examined. This
standardized web-based survey, with good validity and reliability, has been used
for 10 years to assess students’ online course satisfaction with every online course at
our university. Before using the scale, we received the online course instructors’
permission in this study. The OCSS was conceptualized using seven items referring
to the level of general contentment with the learning experience related to instruc-
tors and course design in the course (see Table 3). This scale was measured on a five-
point Likert scale, ranging from strongly disagree (1) to strongly agree (5). Students
answered each item in the context of the course in which they were enrolled. For
example, a high rating on the item “I am satisfied with the learning content and
course structure” indicated that students expressed a high satisfaction level with the
materials and course design in this particular course.
To check the quality of the instrument focussing on students’ course satisfaction,
we conducted a principal axis factoring analysis. The KMO measure of sampling ade-
quacy was 0.91, exceeding the suggested threshold for factor analysis of 0.6
(Tabachnick & Fidell, 2019). In the EFA results, one factor was extracted with an
eigenvalue greater than 1 and effectively explained 56.29% of the total variance.
Cronbach’s alpha coefficient was used to determine the instrument’s internal reliability
after the data collection phase had concluded. The 7-item scale’s reliability was .90. As
with the results of the EFA and Cronbach’s alpha analysis, the OCSS was shown to
have more than decent validity and internal consistency reliability. Table 3 shows the
factor loadings of the 7 items relative to the one factor on the OCSS.

Table 3. Factor loadings of the 7 items of the one factor on the OCSS (n = 356).
No. Item (factor loading, corrected item-to-total correlation)
1 I am satisfied with the instructional style. .875 .867
2 I am satisfied with the learning content and course structure. .796 .816
3 I am satisfied with the instructors and teaching assistants. .774 .804
4 I am satisfied with the use of online discussion forum. .735 .765
5 I am satisfied with the group projects for the course assignment and the criteria of group projects. .721 .779
6 I am satisfied with the midterm exam and final exam. .706 .772
7 Overall, I am satisfied with this course. .618 .707
Notes. Total items = 7; KMO = .91; Cronbach's alpha = .90; Total variance explained = 56.29%

Data collection
Prior to the beginning of the study, all the students were informed about the data
collection methods and were provided with details in the informed consent form in the
first week of the online course; they were also assured that all responses were confidential.
In addition, all the students, on a voluntary basis, wrote their identification numbers on the
questionnaire so that afterward, we could match students’ questionnaire responses with
performance scores such as online discussion score, group project score, or exam score.
The two paper-and-pencil scales (OLPS and OLRS) were handed out with the midterm
exam in the classroom each semester. The OCSS was delivered during the final classroom
exam each semester. We gathered data, including students’ questionnaire responses and
online learning performance scores, in an identical course format and with identical
content for three consecutive semesters from 2017 to 2018. A total of 416 questionnaires
were distributed, and 356 responses matching the learning-outcome data were usable for
further analyses.
We used three scores to represent three types of students’ online learning perfor-
mance: the online discussion score, the midterm exam/final exam score (exam score), and
the group project score. The three scores stemmed from data covering five consecutive
semesters and rested on the same course requirements: the score for online discussion
counted as 30% of the final score, the score for the midterm and final exams counted as
50% of the final score, and the score for the group project counted as 20% of the final
score. The following information provides details regarding the three types of scores (a
short sketch of the weighting follows the list):

● Online discussion score: The measurement of online discussion performance was
based on the number and quality of messages posted. Each student was required
to make at least two posts per week, including responses to the discussion threads
posted by the instructors and by their peers; furthermore, each student had to post at
least two messages in one semester for the purpose of extending peer discussion.
Two instructors and three teaching assistants scored the quality of messages based on
two criteria, clarity and reasonableness, as announced in the first week of the course.
● Midterm exam/final exam score (exam score): This score was the average of the
midterm and final exams. All students were required to take these paper-and-pencil
exams in designated classrooms.
● Group project score: The aforementioned two instructors and three teaching assis-
tants assessed the students’ group projects based on a set of criteria covering
consistency, accuracy, completeness, creativity, and organization. These criteria
were announced during the first week of the course.
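
The grading scheme above is a fixed weighted sum (30% discussion, 50% exams, 20% group
project), although the study analyzed the three component scores separately rather than
this composite. A trivial sketch of the weighting:

```python
def final_score(discussion: float, exam: float, project: float) -> float:
    """Weighting stated in the text: 30% discussion, 50% exams, 20% group project."""
    return 0.30 * discussion + 0.50 * exam + 0.20 * project

# Hypothetical student: discussion 85, exam average 78, group project 90
print(final_score(85, 78, 90))  # 82.5
```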

Data analysis
This study examined a path model using structural equation modeling via Mplus with the
maximum likelihood estimation method (Muthén & Muthén, 2010). Online learning
performance, including the online discussion score, midterm exam/final exam score
(exam score), and group project score, and course satisfaction served as continuous
dependent measured variables. Online learning perceptions and online learning readi-
ness were viewed as an independent variable and a mediator, respectively, in this study.

The following mediational paths were identified: (a) indirect paths from online learning
perceptions to online learning performance through online learning readiness and (b)
indirect paths from online learning perceptions to course satisfaction through online
learning readiness.
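
The authors estimated the model in Mplus. As an illustration only, a structurally similar
path model can be specified in lavaan-style syntax with the open-source Python package
semopy; this sketch treats every construct as an observed composite score rather than
a latent variable, and the data file and column names are hypothetical:

```python
import pandas as pd
from semopy import Model, calc_stats

# Path model over observed composite scores (a simplification of the latent model):
# perceptions (OLP) predict the five readiness factors, and all of these predict
# the three performance scores and course satisfaction
desc = """
CIS ~ OLP
SDL ~ OLP
LC  ~ OLP
ML  ~ OLP
OCS ~ OLP
discussion   ~ CIS + SDL + LC + ML + OCS + OLP
exam         ~ CIS + SDL + LC + ML + OCS + OLP
project      ~ CIS + SDL + LC + ML + OCS + OLP
satisfaction ~ CIS + SDL + LC + ML + OCS + OLP
"""

df = pd.read_csv("composite_scores.csv")  # hypothetical file with the columns above
model = Model(desc)
model.fit(df)             # maximum likelihood estimation by default
print(model.inspect())    # path estimates and standard errors
print(calc_stats(model))  # chi-square, CFI, RMSEA, SRMR, etc.
```

In a full latent-variable version, each factor would instead be defined by measurement
statements such as CIS =~ cis1 + cis2 + cis3.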
The following fit indices were used to determine the goodness of fit of the structural
equation model (a computational sketch of these indices follows the list):

● the chi-square test of model fit (χ2 values closer to zero and not significant indicate
a better-fitting model) (Jöreskog & Sörbom, 1997; Kline, 2011);
● the χ2/df ratio (Schumacker & Lomax, 2004), with a value less than 5 indicating
acceptable fit;
● RMSEA (Browne & Cudeck, 1993), with a value less than .08 indicating reasonable fit;
● CFI (Browne & Cudeck, 1993; Hu & Bentler, 1999; Kline, 2011), with a value higher
than .90 or .95 indicating acceptable fit;
● SRMR, with a value of .08 or less indicating an acceptable model (Hu & Bentler, 1999).
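
For reference, χ2/df, RMSEA, and CFI can be computed directly from the model and baseline
(independence-model) chi-squares using the standard maximum likelihood formulas. A minimal
sketch; the degrees of freedom and baseline values passed in at the bottom are
hypothetical, so the printed output illustrates the formulas rather than reproducing the
paper's indices:

```python
import math

def fit_indices(chi2: float, df: int, n: int, chi2_base: float, df_base: int):
    """chi2/df, RMSEA, and CFI from model and baseline chi-square statistics."""
    chi2_df = chi2 / df
    # RMSEA = sqrt(max(chi2 - df, 0) / (df * (n - 1)))
    rmsea = math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))
    # CFI = 1 - max(chi2 - df, 0) / max(chi2_base - df_base, chi2 - df, 0)
    cfi = 1.0 - max(chi2 - df, 0.0) / max(chi2_base - df_base, chi2 - df, 0.0)
    return chi2_df, rmsea, cfi

# chi2 and n come from the Results section; df and the baseline values are hypothetical
print(fit_indices(chi2=1501.506, df=702, n=356, chi2_base=7000.0, df_base=741))
```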

Results
The model chi-square = 1501.506 (p < .001) indicated a poor fit. However,
certain problems exist when relying solely on chi-square statistics because the chi-square
test has been indicated as being sensitive to sample size. Owing to the large sample size
of this study (n = 356), we further discuss other indices to evaluate model fit. In the
present study, χ2/df = 2.14, indicating that the hypothesized model may have an accep-
table fit (Schumacker & Lomax, 2004). The other indices for model-fit evaluation were
RMSEA = 0.073, CFI = 0.778, and SRMR = 0.095. Although the hypothesized model did not
have sufficiently good fit to all indices for model-fit evaluation, we kept the hypothesized
model to test the relationships among these variables (including OLPS, OLRS, OCSS, and
online learning performance).
To test the hypotheses, direct effects among online learning perceptions, online
learning readiness, online course satisfaction, and online learning performance were
examined at the alpha level of .05 by reviewing the β weights. Figure 2 presents the
standardized path coefficients of the final structural model.

The direct effects among online learning perceptions, online learning readiness,
online course satisfaction, and online learning performance
Our analysis showed that online learning perceptions significantly and positively affected
the dimensions of computer/Internet self-efficacy (β = .633, SE = .070, p < .001), self-
directed learning (β = .426, SE = .072, p < .001), learner control (β = .585, SE = .072, p <
.001), motivation for learning (β = .786, SE = .047, p < .001), and online communication self-
efficacy (β = .519, SE = .066, p < .001), thereby supporting H1. In other words, in this study,
college students’ online learning perceptions significantly and positively affected their
online learning readiness. Students with more positive online learning perceptions
(e.g., perceived ease of loading in online courses, perceived accessibility of online
learning resources) felt more confident and were readier to participate in online courses.

Figure 2. The final structural model with standardized path coefficients.

However, we found that the path from online learning perceptions to online learning
performance and course satisfaction was not significant, which failed to support H2 and
H3. In other words, in this study, college students’ online learning perceptions did not
significantly directly affect their online learning performance and course satisfaction.
Regardless of whether students had more positive online learning perceptions,
their online learning performance (online discussion score, exam score, and group project
score) and overall online course satisfaction were not influenced by these perceptions.

The direct effects among online learning readiness, online course satisfaction, and
online learning performance
For online learning readiness, the dimension of computer/Internet self-efficacy had
positive effects on online discussion score (β = .524, SE = .119, p < .001) and course
satisfaction (β = .318, SE = .130, p < .05) but not on the exam score and group project
score. The dimension of motivation for learning had positive effects on the online
discussion score (β = .318, SE = .154, p < .05) but not on the exam score, the group
project score, or course satisfaction. The effects of the self-directed learning, learner
control, and online communication self-efficacy dimensions on online learning perfor-
mance and course satisfaction were insignificant. Therefore, H4 and H5 were partly
supported because only the dimensions of computer/Internet self-efficacy and motiva-
tion for learning were reported as significant predictors. In other words, in this study,
college students who self-reported their computer/Internet self-efficacy (e.g., confidence
in how to manage software for online learning, confidence in using the Internet to find or
gather information for online learning) more positively as higher had higher online
discussion score and course satisfaction. In addition, college students who self-reported
their motivation for learning (e.g., sharing ideas with others in the discussion board) as
higher had higher online discussion score.

The mediating effects among online learning perceptions, online learning
readiness, online course satisfaction, and online learning performance
In addition to the analysis of direct effects, the mediating effects were examined using the
Sobel test (Baron & Kenny, 1986; Krull & MacKinnon, 2001). The results showed that online
learning perceptions positively and significantly affected online discussion score through
computer/Internet self-efficacy (β = .744, SE = .230, p < .01), indicating partial support for
H6. Moreover, the dimension of computer/Internet self-efficacy significantly mediated the
effect of online learning perceptions on course satisfaction (β = .344, SE = .159, p < .05),
which partially supported H7. In other words, in this study, when college students’ online
learning perceptions were higher and positive, they felt more confident in their compu-
ter/Internet self-efficacy, which further influenced their online discussion score and their
online course satisfaction. Thus, computer/Internet self-efficacy serves as an essential
mediator in the relationship among online learning perceptions, online discussion per-
formance, and course satisfaction.
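
The Sobel test referenced above builds z = ab / sqrt(b^2 * SEa^2 + a^2 * SEb^2) from
path a (predictor to mediator) and path b (mediator to outcome) and their standard errors.
A minimal sketch using the direct-path estimates reported earlier in this section; because
the paper's mediated coefficients come from the full structural model, this by-hand z is
illustrative and will not match them exactly:

```python
import math
from scipy.stats import norm

def sobel_test(a: float, se_a: float, b: float, se_b: float):
    """Sobel z for the indirect effect a*b, with a two-tailed p value."""
    se_ab = math.sqrt(b**2 * se_a**2 + a**2 * se_b**2)
    z = (a * b) / se_ab
    p = 2.0 * (1.0 - norm.cdf(abs(z)))
    return a * b, z, p

# a: perceptions -> computer/Internet self-efficacy (beta = .633, SE = .070)
# b: computer/Internet self-efficacy -> online discussion score (beta = .524, SE = .119)
indirect, z, p = sobel_test(a=0.633, se_a=0.070, b=0.524, se_b=0.119)
print(f"indirect = {indirect:.3f}, z = {z:.2f}, p = {p:.4f}")
```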
Finally, together, the independent variables explained 26.9% of the variance in online
discussion score, 4.1% in exam score, 14.3% in group project score, and 11% in course satisfaction.
To conclude, regarding direct effects, H1 was supported, while H4 and H5 were
partially supported. Regarding mediating effects, H6 and H7 were partially supported
through the results for H4 and H5 that reported the significant effects of computer/
Internet self-efficacy and motivation for learning on online discussion score and of
computer/Internet self-efficacy on course satisfaction.

Discussion and conclusion


As part of this research, we explored the effect levels among the latent variables of online
learning perceptions, online learning readiness (including computer/Internet self-efficacy,
self-directed learning, learner control, motivation for learning, and online communication
self-efficacy), online learning performance, and online course satisfaction. In the following
section, we offer responses to the research questions and discuss three noteworthy
findings in relation to the literature.
First, we found that online learning perceptions have a significant effect on online
learning readiness. The more positive a student’s perceptions are, the readier he/she is for
online learning. This finding is in accordance with the results of previous studies identify-
ing the influencing factors of readiness. Smith et al. (2003) proposed that online learning
readiness would be influenced by comfort with online learning and by the self-
management of learning. Our study adds to the empirical evidence indicating that
students’ positive perceptions contribute to their higher readiness for online learning.
However, the evidence showed that online learning perceptions failed to generate
a significant direct effect on learners’ online learning performance or their course satisfac-
tion. This finding is contrary to those of previous studies (e.g., Bernard et al., 2004; Morris,
2011). One possible reason might be that our learners’ online learning perceptions
represent a more general perception rather than that of any specific online learning
activity or requirement (such as an online discussion, an exam, and a group project) or
course satisfaction. In other words, although students may have positive perceptions of
online learning in general (i.e., accessibility, interactivity, adaptability, knowledge
acquisition, and ease of loading), those positive perceptions did not transfer to their
performance in specific learning activities or to overall course satisfaction.
Second, this study sought to investigate whether students’ online learning readi-
ness exerted any observable effect on their online learning performance. In this
study, we found that computer/Internet self-efficacy has a positive effect on online
discussion score. Notably, the findings of the current study are different from those
of prior studies (e.g., Bernard et al., 2004; Hung, 2012) indicating that computer/
Internet self-efficacy is not a good indicator to predict students’ learning perfor-
mance. One possible reason is that prior studies used only one final course grade to
represent student achievement but did not focus on the respective performance in
individual activities, unlike our approach in this course. The online discussion in this
study required students to post every week, including responses to the discussion
issues posted by the instructors and their peers. Therefore, students with more positive
computer/Internet self-efficacy, who are familiar with posting messages on the discussion
board and interacting with peers and instructors, would thus perform better in online
discussion.
In addition, students’ computer/Internet self-efficacy also has a direct effect on
course satisfaction. This finding is consistent with that of Bolliger and Halupa (2012).
Our findings suggest that students who feel more confident in using computers or the
Internet report higher course satisfaction. In the context of our online course, because
almost all learning activities must be completed using some type of computer/Internet
tool, performing well in the course might be easier for students with high confidence
in using these tools.
In addition to technology-related self-efficacy, students’ motivation for learning also
has a direct effect on the online discussion score. It is possible for all learning activities of
this course to be completely conducted online. Instructors do not need to inform students
about posting messages on the discussion board or the deadlines for handing in assign-
ments. Students have to post messages on the discussion board every week to interact
with peers and instructors. In other words, if students have a higher motivation for
learning and enjoy sharing ideas with others online, they participate more actively in
the discussion. Thus, their online discussion score is much better.
Finally, the results of our mediation analysis indicate that computer/Internet self-
efficacy mediated online learning perceptions in terms of the online discussion score
and course satisfaction. This possible causation indicates that students’ computer/
Internet self-efficacy for online learning readiness could mediate the effects of students’
online learning perceptions on both their online discussion score and their course
satisfaction. In other words, students’ online learning perceptions influenced their com-
puter/Internet self-efficacy for online learning readiness, which in turn influenced their
online discussion score and course satisfaction.
In sum, this study found that online learning perceptions are powerful predictors of
online learning readiness. Additionally, this study rediscovered the role of online compu-
ter/Internet self-efficacy in online learning readiness as a good example of the intersection
among online learning perceptions, online discussion performance, and course satisfac-
tion. In other words, students’ online discussion performance and course satisfaction are
indirectly influenced by their online learning perceptions through their computer/
Internet self-efficacy.

Limitations, recommendations, and implications


Research limitations and recommendations for future research
This study has some limitations and thus can provide some recommendations for
future research. First, the findings of this study should not be generalized to other
online learning settings because it was conducted with a relatively small group of
undergraduate students, and the responses to the survey cannot necessarily be
considered the opinions of all the students. Second, although the constructs of the
OLRS and the OLPS in this study provided a reasonable structural model for compre-
hensively understanding the relationships among perceptions, readiness, performance,
and course satisfaction in online learning environments, these scales still have room
to improve. In future studies, the instruments should be revised, and a proper
measurement model should be proposed. Third, in addition to the data collected
from self-reported questionnaires, it would be greatly preferable in the future to
obtain various data sources that provide direct, rich information, such as actual
participation records in an online course, to reduce sample bias and possible mea-
surement errors; obtaining qualitative data, such as interview data, to further confirm
the quantitative data would also be helpful. Moreover, caution must be exercised in
generalizing the results of this study to other contexts such as blended learning
environments or other subject domains of online learning.

Pedagogical implications
This study aimed to investigate the relationships among college students’ online learning
perceptions, online learning readiness, course satisfaction and online learning perfor-
mance. In addition to the theoretical contributions, this study provides some pedagogical
implications for online course instructors and educational system developers.
For example, in the first week of the online course, in addition to introducing
course formats, assignments, and requirements, online course instructors may help
students perceive online learning more positively (e.g., by promoting the features of
online learning such as emphasizing students’ ability to decide on the best time and
location to learn). Instructors may also invite those who have previously enrolled in
this online course to share their experiences to help students improve their online
learning readiness (i.e., computer/Internet self-efficacy, self-directed learning, learner
control, motivation for learning, online communication self-efficacy). Moreover,
because online discussion is an important activity in online courses, instructors not
only need to respond to the messages students post and give feedback but also
should further encourage and guide students, especially those with lower compu-
ter/Internet self-efficacy, to participate more extensively in the online course, to
search the Internet to obtain more learning resources, to use the course manage-
ment system, and to seek assistance when facing problems online. With such
guidance, students will perform better in their online discussion activities and
experience greater course satisfaction as well. Lastly, instructors
should help students remain motivated since motivation is one of the important
factors influencing online discussion performance and course satisfaction. For
instance, instructors can increase students’ intrinsic motivation by promoting the
features of online learning (e.g., it provides more channels for students to interact
with instructors and peers and allows them to broaden their horizons and retrieve
more learning resources) and by encouraging students to actively participate in the
online course. Students’ extrinsic motivation may
become stronger if extra credit is given for more postings on the discussion board or
their active participation is recognized.
For educational system developers, it is crucial to consider the system design to
support students’ online learning. Since the current study found that computer/Internet
self-efficacy and motivation for learning have direct effects on online discussion score and
course satisfaction, educational system developers should focus on designing simple
system interfaces and easy-to-use system functions, especially for discussion boards.
Such simplicity would help students feel more confident and perhaps feel less pressure
to participate in the online course, which would increase their willingness to express their
thoughts, interact with instructors and peers, and obtain assistance or supportive inter-
vention when they face problems or feel discouraged during the process of online
learning. From another perspective, if the design of the educational system enables
instructors to personalize the functions according to their course design—for example,
the ability to send students frequent reminders about the deadlines, requirements, and
tests (through email or instant messages over cell phones) from the system—students
may be especially encouraged to participate in the online course.
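As an illustration only, the following sketch outlines such a reminder feature. The event list and the send_notification() hook are hypothetical stand-ins for whatever data model and messaging channel (email or instant message) a real course management system exposes.

    from datetime import datetime, timedelta

    # Hypothetical course events; a real system would read these from the
    # course management database.
    DEADLINES = [
        {"title": "Week 3 discussion post", "due": datetime(2020, 3, 20, 23, 59)},
        {"title": "Midterm online quiz", "due": datetime(2020, 4, 10, 23, 59)},
    ]

    def send_notification(address: str, text: str) -> None:
        # Placeholder hook: a real implementation would deliver this through
        # the system's email or instant-messaging channel.
        print(f"to {address}: {text}")

    def remind_upcoming(students, now=None, window=timedelta(days=3)):
        """Notify every enrolled student of deadlines falling inside the window."""
        now = now or datetime.now()
        for event in DEADLINES:
            if now <= event["due"] <= now + window:
                for address in students:
                    send_notification(
                        address,
                        f'Reminder: "{event["title"]}" is due {event["due"]:%Y-%m-%d %H:%M}.',
                    )

    remind_upcoming(["student@example.edu"], now=datetime(2020, 3, 18))

A production version would also need scheduling (e.g., a daily job) and per-student notification preferences, which are omitted here for brevity.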

Acknowledgments
This research was partially supported by Grant 105-2511-S-009-008-MY3 from the Ministry of
Science and Technology, Taiwan, Republic of China. The authors are indebted to the anonymous
reviewers and the editor for their insightful comments in revising this manuscript.

Disclosure statement
No potential conflict of interest was reported by the authors.

Notes on contributors
Huei-Chuan Wei is a post-doctoral fellow at the Institute of Education, National Chiao Tung
University, Taiwan, R.O.C. Her research interests include open and distance learning, interaction
and learning performance, instructional design, digital literacy, and research ethics.
Chien Chou is a Chair Professor at the Institute of Education, National Chiao Tung University, Taiwan,
R.O.C. Her research interests include e-learning, information literacy and ethics, and research ethics.

ORCID
Huei-Chuan Wei http://orcid.org/0000-0001-8587-6439
Chien Chou http://orcid.org/0000-0002-3258-1036

References
Al-Samarraie, H., Teng, B. K., Alzahrani, A. I., & Alalwan, N. (2018). E-learning continuance satisfaction
in higher education: A unified perspective from instructors and students. Studies in Higher
Education, 43(11), 2003–2019. https://doi.org/10.1080/03075079.2017.1298088
Alzahrani, M. G., & O’Toole, J. M. (2017). The impact of Internet experience and attitude on student
preference for blended learning. Journal of Curriculum and Teaching, 6(1), 65–78. https://doi.org/
10.5430/jct.v6n1p65
Bagozzi, R. P., & Yi, Y. (1988). On the evaluation of structural equation models. Journal of the Academy
of Marketing Science, 16(1), 74–94. https://doi.org/10.1007/bf02723327
Baron, R. M., & Kenny, D. A. (1986). The moderator–mediator variable distinction in social psycho-
logical research: Conceptual, strategic, and statistical considerations. Journal of Personality and
Social Psychology, 51(6), 1173–1182. https://doi.org/10.1037/0022-3514.51.6.1173
Bates, A. W. (2019). Teaching in a digital age: Guidelines for designing teaching and learning (2nd ed.).
Tony Bates Associates Ltd. https://opentextbc.ca/teachinginadigitalage/
Bernard, R. M., Brauer, A., Abrami, P. C., & Surkes, M. (2004). The development of a questionnaire for
predicting online learning achievement. Distance Education, 25(1), 31–47. https://doi.org/10.
1080/0158791042000212440
Bertea, P. (2009, April 9–10). Measuring students’ attitude towards e-learning: A case study.
Proceedings of the 5th International Scientific Conference on eLearning and Software for
Education. https://adlunap.ro/else_publications/papers/2009/979.1.Bertea.pdf
Bolliger, D. U., & Halupa, C. (2012). Student perceptions of satisfaction and anxiety in an online
doctoral program. Distance Education, 33(1), 81–98. https://doi.org/10.1080/01587919.2012.
667961
Bolliger, D. U., & Halupa, C. (2018). Online student perceptions of engagement, transactional
distance, and outcomes. Distance Education, 39(3), 299–316. https://doi.org/10.1080/01587919.
2018.1476845
Browne, M. W., & Cudeck, R. (1993). Alternative ways of assessing model fit. In K. A. Bollen & J. S. Long
(Eds.), Testing structural equation models (pp. 130–159). Sage.
Chow, W. S., & Shi, S. (2014). Investigating students’ satisfaction and continuance intention toward
e-learning: An extension of the expectation-confirmation model. Procedia - Social and Behavioral
Sciences, 141, 1145–1149. https://doi.org/10.1016/j.sbspro.2014.05.193
Cigdem, H., & Yildirim, O. (2014). Effects of students’ characteristics on online learning readiness:
A vocational college example. Turkish Online Journal of Distance Education, 15(3), 80–93. https://
doi.org/10.17718/tojde.69439
Deci, E. L., & Ryan, R. M. (1985). Intrinsic motivation and self-determination in human behavior.
Plenum. https://doi.org/10.1007/978-1-4899-2271-7
De Paepe, L., Zhu, C., & Depryck, K. (2018). Learner characteristics, learner achievement and time
investment in online courses for Dutch L2 in adult education. Turkish Online Journal of
Educational Technology, 17(1), 101–112. http://www.tojet.net/articles/v17i1/17110.pdf
Dray, B. J., Lowenthal, P. R., Miszkiewicz, M. J., Ruiz-Primo, M. A., & Marczynski, K. (2011). Developing
an instrument to assess student readiness for online learning: A validation study. Distance
Education, 32(1), 29–47. https://doi.org/10.1080/01587919.2011.565496
Duggan, A., Hess, B., Morgan, D., Kim, S., & Wilson, K. (2001). Measuring students’ attitudes toward
educational use of the Internet. Journal of Educational Computing Research, 25(3), 267–281.
https://doi.org/10.2190/gtfb-4d6u-ycax-uv91
Dziuban, C., Moskal, P., Thompson, J., Kramer, L., DeCantis, G., & Hermsdorfer, A. (2015). Student
satisfaction with online learning: Is it a psychological contract? Online Learning, 19(2). https://doi.
org/10.24059/olj.v19i2.496
Eichelberger, A., & Ngo, H. T. P. (2018). College students’ perception of an online course in special
education. International Journal for Educational Media and Technology, 12(2), 11–19. http://jaems.
jp/contents/icomej/vol12-2/02_Eichelberger_Ngo.pdf
Hair, J. F., Black, W. C., Babin, B. J., & Anderson, R. E. (2010). Multivariate data analysis (7th ed.).
Pearson.

Hao, Y. (2016). Middle school students’ flipped learning readiness in foreign language classrooms:
Exploring its relationship with personal characteristics and individual circumstances. Computers
in Human Behavior, 59, 295–303. https://doi.org/10.1016/j.chb.2016.01.031
Horzum, M. B., Kaymak, Z. D., & Gungoren, O. C. (2015). Structural equation modeling towards online
learning readiness, academic motivations, and perceived learning. Educational Sciences: Theory &
Practice, 15(3), 759–770. https://doi.org/10.12738/estp.2015.3.2410
Hou, Y. C. (2017). Excellence, research and quality system in Taiwan higher education and challenges
for internationalization. http://www.acup.cat/sites/default/files/angelathe-higher-education-and-
research-system-taiwan.pdf
Hu, L. T., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis:
Conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary
Journal, 6(1), 1–55. https://doi.org/10.1080/10705519909540118
Hung, M. L. (2012). A study of the relationship between college-level online learners’ readiness and
learning performance [Unpublished doctoral dissertation]. National Chiao Tung University.
Hung, M. L. (2016). Teacher readiness for online learning: Scale development and teacher
perceptions. Computers & Education, 94, 120–133. https://doi.org/10.1016/j.compedu.2015.11.012
Hung, M. L., Chou, C., Chen, C. H., & Own, Z. Y. (2010). Learner readiness for online learning: Scale
development and student perceptions. Computers & Education, 55(3), 1080–1090. https://doi.org/
10.1016/j.compedu.2010.05.004
Jöreskog, K. G., & Sörbom, D. (1997). LISREL 8: User’s reference guide (2nd ed.). Scientific
Software International.
Joyce, M., & Kirakowski, J. (2015). Measuring attitudes towards the Internet: The general Internet
attitude scale. International Journal of Human–Computer Interaction, 31(8), 506–517. https://doi.
org/10.1080/10447318.2015.1064657
Kaiser, H. F., & Rice, J. (1974). Little jiffy, mark IV. Educational and Psychological Measurement, 34(1),
111–117. https://doi.org/10.1177/001316447403400115
Ke, F., & Kwak, D. (2013). Online learning across ethnicity and age: A study on learning interaction
participation, perception, and learning satisfaction. Computers & Education, 61, 43–51. https://doi.
org/10.1016/j.compedu.2012.09.003
Keramati, A., Afshari-Mofrad, M., & Kamrani, A. (2011). The role of readiness factors in E-learning
outcomes: An empirical study. Computers & Education, 57(3), 1919–1929. https://doi.org/10.1016/
j.compedu.2011.04.005
Kline, R. B. (2011). Principles and practice of structural equation modeling (3rd ed.). The Guilford
Press.
Krull, J. L., & Mackinnon, D. P. (2001). Multilevel modeling of individual and group level mediated
effects. Multivariate Behavioral Research, 36(2), 249–277. https://doi.org/10.1207/
S15327906MBR3602_06
Kuo, Y. C. (2014). Accelerated online learning: Perceptions of interaction and learning outcomes
among African American students. American Journal of Distance Education, 28(4), 241–252.
https://doi.org/10.1080/08923647.2014.959334
Kuo, Y. C., Walker, A., Belland, B. & Schroder, K. (2013). A predictive study of student satisfaction in
online education programs. The International Review of Research in Open and Distributed Learning,
14(1), 16–39. https://doi.org/10.19173/irrodl.v14i1.1338
Lee, S. J., Srinivasan, S., Trail, T., Lewis, D., & Lopez, S. (2011). Examining the relationship among
student perception of support, course satisfaction, and learning outcomes in online learning. The
Internet and Higher Education, 14(3), 158–163. https://doi.org/10.1016/j.iheduc.2011.04.001
Li, N., Marsh, V., & Rienties, B. (2016). Modeling and managing learner satisfaction: Use of learner
feedback to enhance blended and online learning experience. Decision Sciences Journal of
Innovative Education, 14(2), 216–242. https://doi.org/10.1111/dsji.12096
Liaw, S. S., & Huang, H. M. (2013). Perceived satisfaction, perceived usefulness and interactive
learning environments as predictors to self-regulation in e-learning environments. Computers &
Education, 60(1), 14–24. https://doi.org/10.1016/j.compedu.2012.07.015
Liaw, S. S., Huang, H. M., & Chen, G. D. (2007). Surveying instructor and learner attitudes toward
e-learning. Computers & Education, 49(4), 1066–1080. https://doi.org/10.1016/j.compedu.2006.01.001

Lu, J., Yu, C. S., & Liu, C. (2003). Learning style, learning patterns, and learning performance in a
WebCT-based MIS course. Information & Management, 40(6), 497–507. https://doi.org/10.1016/
S0378-7206(02)00064-2
McFarland, D., & Hamilton, D. (2005). Factors affecting student performance and satisfaction: Online
versus traditional course delivery. Journal of Computer Information Systems, 46(2), 25–32. https://
doi.org/10.1080/08874417.2006.11645880
McVay, M. (2000). Developing a web-based distance student orientation to enhance student success in
an online bachelor’s degree completion program [Unpublished practicum report]. Nova
Southeastern University, FL, United States.
Morris, T. A. (2011). Exploring community college student perceptions of online learning.
International Journal of Instructional Technology and Distance Learning, 8(6), 31–44.
Muthén, L. K., & Muthén, B. O. (2010). Mplus user’s guide (6th ed.). Muthén & Muthén. https://
www.statmodel.com/download/usersguide/Mplus%20Users%20Guide%20v6.pdf
Paechter, M., & Maier, B. (2010). Online or face-to-face? Students’ experiences and preferences in
e-learning. The Internet and Higher Education, 13(4), 292–297. https://doi.org/10.1016/j.iheduc.
2010.09.004
Paechter, M., Maier, B., & Macher, D. (2010). Students’ expectations of, and experiences in e-learning:
Their relation to learning achievements and course satisfaction. Computers & Education, 54(1),
222–229. https://doi.org/10.1016/j.compedu.2009.08.005
Picciano, A. G. (2002). Beyond student perceptions: Issues of interaction, presence, and performance
in an online course. Journal of Asynchronous Learning Networks, 6(1), 21–40. https://doi.org/10.
24059/olj.v6i1.1870
Sahin, I., & Shelley, M. (2008). Considering students’ perceptions: The distance education student
satisfaction model. Journal of Educational Technology & Society, 11(3), 216–223. http://www.jstor.
org/stable/jeductechsoci.11.3.216
Schumacker, R. E., & Lomax, R. G. (2004). A beginner’s guide to structural equation modeling (2nd ed.).
Erlbaum.
Shelton, B. E., Hung, J. L., & Lowenthal, P. R. (2017). Predicting student success by modeling student
interaction in asynchronous online courses. Distance Education, 38(1), 59–69. https://doi.org/10.
1080/01587919.2017.1299562
Smith, P. J., Murphy, K. L., & Mahoney, S. E. (2003). Towards identifying factors underlying readiness
for online learning: An exploratory study. Distance Education, 24(1), 57–67. https://doi.org/10.
1080/01587910303043
Stokes, S. P. (2003, November 5–7). Temperament, learning styles, and demographic predictors of
college student satisfaction in a digital learning environment. In J. Petry, L. Allen, & E. Welch (Eds.),
Proceedings of the Twenty-Sixth Annual Meeting of the Mid-South Educational Research Association.
Mid-South Educational Research Association. https://files.eric.ed.gov/fulltext/ED482454.pdf
Sun, P. C., Tsai, R. J., Finger, G., Chen, Y. Y., & Yeh, D. (2008). What drives a successful e-Learning? An
empirical investigation of the critical factors influencing learner satisfaction. Computers &
Education, 50(4), 1183–1202. https://doi.org/10.1016/j.compedu.2006.11.007
Tabachnick, B. G., & Fidell, L. S. (2019). Using multivariate statistics (7th ed.). Pearson.
Warner, D., Christie, G., & Choy, S. (1998). The readiness of VET clients for flexible delivery
including on-line learning. Australian National Training Authority.
Wei, H. C., & Chou, C. (2019, April 5–9). Relationships among college learners’ online learning percep-
tions, behaviors, and achievements via the Self-Determination Theory approach [Paper presenta-
tion]. 2019 AERA Annual Meeting, Toronto, Canada.
Wei, H. C., Peng, H., & Chou, C. (2015). Can more interactivity improve learning achievement in an
online course? Effects of college students’ perception and actual use of a course-management
system on their learning achievement. Computers & Education, 83, 10–21. https://doi.org/10.1016/
j.compedu.2014.12.013
Weidlich, J., & Bastiaens, T. J. (2018). Technology matters – The impact of transactional distance on
satisfaction in online distance learning. The International Review of Research in Open and
Distributed Learning, 19(3). https://doi.org/10.19173/irrodl.v19i3.3417

Yang, D., Baldwin, S., & Snelson, C. (2017). Persistence factors revealed: Students’ reflections on
completing a fully online program. Distance Education, 38(1), 23–36. https://doi.org/10.1080/
01587919.2017.1299561
Yang, Y., & Lin, N. C. (2010). Internet perceptions, online participation and language learning in
Moodle forums: A case study on nursing students in Taiwan. Procedia - Social and Behavioral
Sciences, 2(2), 2647–2651. https://doi.org/10.1016/j.sbspro.2010.03.388
Yilmaz, R. (2017). Exploring the role of e-learning readiness on student satisfaction and motivation in
flipped classroom. Computers in Human Behavior, 70, 251–260. https://doi.org/10.1016/j.chb.
2016.12.085
Yu, T., & Richardson, J. C. (2015). An exploratory factor analysis and reliability analysis of the student
online learning readiness (SOLR) instrument. Online Learning, 19(5), 120–141. https://doi.org/10.
24059/olj.v19i5.593

Appendix A

OLRS dimensions and items (Hung et al., 2010)


Item no. Dimension/items
Computer/Internet self-efficacy (CIS)
CIS1 I feel confident in performing the basic functions of Microsoft Office programs (MS Word, MS Excel, and MS PowerPoint).
CIS2 I feel confident in my knowledge and skills of how to manage software for online learning.
CIS3 I feel confident in using the Internet (Google, Yahoo) to find or gather information for online learning.
Self-directed learning (SDL)
SDL1 I carry out my own study plan.
SDL2 I seek assistance when facing learning problems.
SDL3 I manage time well.
SDL4 I set up my learning goals.
SDL5 I have higher expectations for my learning performance.
Learner control (LC)
LC1 I can direct my own learning progress.
LC2 I am not distracted by other online activities when learning online (instant messages, Internet surfing).
LC3 I repeated the online instructional materials on the basis of my needs.
Motivation for learning (ML)
ML1 I am open to new ideas.
ML2 I have motivation to learn.
ML3 I improve from my mistakes.
ML4 I like to share my ideas with others.
Online communication self-efficacy (OCS)
OCS1 I feel confident in using online tools (email, discussion) to effectively communicate with others.
OCS2 I feel confident in expressing myself (emotions and humor) through text.
OCS3 I feel confident in posting questions in online discussions.
