Computers & Education 57 (2011) 1445–1457
Students’ views of collaboration and online participation in Knowledge Forum
Carol K.K. Chan*, Yuen-Yan Chan
Faculty of Education, University of Hong Kong, Pokfulam, Hong Kong, SAR, China
Article info
Abstract
Article history:
Received 11 May 2010
Received in revised form
28 August 2010
Accepted 7 September 2010
This study examined students’ views of collaboration and learning, and investigated how these predict
students’ online participation in a computer-supported learning environment. The participants were 521
secondary school students in Hong Kong, who took part in online collaborative inquiry conducted using
Knowledge Forum™. We developed a questionnaire to assess the students’ views of their collaboration
aligned with the knowledge-building perspective. We also administered the Learning Process Questionnaire to examine their preferred approaches to learning. The students’ online participation in
Knowledge Forum was examined using the Analytic Toolkit software. Analyses indicated that students
who viewed their collaboration as more aligned with collaborative knowledge building were more likely
to employ a deep approach to learning. A structural equation model indicated that the students’ views of
collaboration exerted a direct effect on online participation in Knowledge Forum and mediated the
effects of deep approaches on forum participation. Implications of examining students’ views of
collaboration for productive online participation are discussed.
© 2010 Elsevier Ltd. All rights reserved.
Keywords:
Knowledge building
Student beliefs
Computer-supported collaborative learning
Online participation
1. Introduction
Significant advances have been made in designing computer-supported collaborative learning (CSCL) environments and examining their
effects on student cognition, learning, and understanding (Jacobson & Kozma, 2000; Linn, Davis, & Bell, 2004; Stahl, Koschmann, & Suthers,
2006). Despite this progress, the educational benefits of computer-supported learning are still questionable (Dillon, 2000; Roschelle & Pea,
1999; Voogt & Knezek, 2008). Although psychological research has shown the importance of learner characteristics, less attention has been
paid to them in the context of web-based learning. There is now increased recognition among researchers that technology-based environments need to be informed by theories of learning (Chan & van Aalst, 2008; Lai, 2008). Simply putting students together in a CSCL
environment does not necessarily result in collaboration and learning (Kreijns, Kirschner, & Jochems, 2003); online forum participation is
often found to be scattered and fragmented (Hewitt, 2005; Lipponen, Rahikainen, Lallimo, & Hakkarainen, 2003).
A substantial body of psychological research has demonstrated that students’ learning outcomes are influenced by what they believe
about their learning and how they approach it. For example, studies in approaches to learning (Biggs, 1999), conceptions of learning (Marton
& Booth, 1997), beliefs about learning (Law, Chan, & Sachs, 2008) and epistemological beliefs (Mason & Scirica, 2006) all show how these
constructs predict learning, cognition and academic performance. On the other hand, despite the widespread use of CSCL environments, less
attention has been given to learners’ beliefs and approaches to collaboration, and how they might affect student participation in computer-supported collaborative environments.
There is now increased recognition that students’ conceptions of and attitudes towards web-based learning are important prerequisites to
effective web-based instruction (Tsai, 2009). While the literature includes an increasing number of studies on student preferences and attitudes towards web-based learning (e.g., Peng, Tsai, & Wu, 2006; Tsai, Lin, & Tsai, 2001; Yang & Tsai, 2008), fewer have specifically attempted to
link students’ beliefs about and approaches to learning and collaboration with online participation. The key argument of this paper is that it is
important to consider students’ beliefs and approaches to collaboration and learning for examining online participation. For example, if
students view learning as an individual matter, they would be less likely to engage in collaboration. Similarly, if students believe that collaborative writing in computer forums merely involves the sharing of information, they would engage in superficial moves rather than deep
* Corresponding author. Tel.: +852 2859 1906; fax: +852 2858 5649.
E-mail addresses: ckkchan@hku.hk (C.K.K. Chan), yychan8@hku.hk (Y.-Y. Chan).
0360-1315/$ – see front matter © 2010 Elsevier Ltd. All rights reserved.
doi:10.1016/j.compedu.2010.09.003
inquiry for knowledge co-construction. Given the widespread use of computer-supported collaborative environments, it is timely to understand how students view collaboration when working on computer forums. Accordingly, this study examines students’ beliefs and approaches
to collaboration and learning, and investigates whether these views predict student participation in Knowledge Forum™.
2. Theoretical background
2.1. Collaborative knowledge building
Computer-supported collaborative learning (CSCL) has emerged as a major research field over the last decade (Stahl et al., 2006). Of
particular interest is the “collaboration” element of CSCL, which distinguishes it from general e-learning practices (Stahl et al., 2006). In
CSCL, participants do not simply react in isolation to digitized materials; rather, they communicate, interact, and co-construct knowledge
within the CSCL environment. In this study, we examine student collaboration from the theoretical perspective of collaborative knowledge
building, a forerunner of CSCL (Scardamalia & Bereiter, 1994; 2006).
The conceptual framework of knowledge building focuses on collective cognitive responsibility and engaging students in a community to
create new knowledge guided by online discourses mediated by the computer forum (Bereiter, 2002; Scardamalia & Bereiter, 2006). This
theoretical perspective has evolved alongside the development of Computer-Supported Intentional Learning Environments (CSILE) and,
subsequently, Knowledge Forum (KF). CSILE and Knowledge Forum are networked learning environments designed using socio-cognitive
and socio-technological dynamics to support knowledge advances among participants. A forerunner of CSCL, CSILE first appeared as
a prototype introduced in a university course in 1983, predating the advent of the World Wide Web. The non-Web-based CSILE was re-engineered into Knowledge Forum in 1995, alongside the rapid evolution of the World Wide Web (Scardamalia & Bereiter, 2006).
Knowledge Forum, with both a web-based and a graphical interface, is an asynchronous discourse medium in which students co-construct knowledge mediated by their discourse. Students post ideas and comments through classroom- and computer-based discourse to
generate questions, co-construct explanations, and revise their ideas for advancing community knowledge.
Scardamalia (2002) has elaborated a set of twelve principles or determinants to help characterize the complex socio-cognitive and technological dynamics involved in the knowledge building educational model – improvable ideas; community knowledge; rise above; diversity of ideas;
democratizing knowledge; epistemic agency; knowledge-building discourse; concurrent assessment; symmetrical advances; constructive uses
of information; authentic problem; and pervasive knowledge building. The principles, which have attracted considerable research attention, are
intended to help researchers and teachers identify and examine knowledge building and to support the design of classroom work. While the set of
principles may seem to represent different socio-cognitive dynamics, Scardamalia argues that it constitutes a cohesive system of intertwining
principles, each providing a specific lens for examining knowledge building. She notes that “[the] 12 principles form a complex interactive system
of forces that drive the process” (Zhang, Scardamalia, Lamon, Messina, & Reeve, 2007, p.119), and “the interconnectedness of these ideas mean that
implementing one tends to unlock the others” (Scardamalia, 2002, p. 77). The principles depict collaboration as a notion that goes beyond the
division of labor and that involves students focusing on idea improvement and collective cognitive responsibility.
Over the years, empirical research has demonstrated the positive effects of the knowledge-building approach on literacy, depth of
inquiry, collaboration, and knowledge-creation processes, both among Western students (Scardamalia & Bereiter, 2006; Zhang, Scardamalia,
Reeve, & Messina, 2009) and students in other cultural contexts such as Chinese classrooms (van Aalst & Chan, 2007; Lee, Chan, & van Aalst,
2006). Currently, researchers conducting knowledge building studies in Hong Kong have developed a network of teachers implementing the
knowledge-building model and pedagogy in classrooms (Chan, in press).
While there has been good progress in developing knowledge-building theory, there are still many variations in how students engage in
knowledge-building collaboration and online discourse. In examining classroom design for collaborative inquiry, researchers have noted the
importance of examining students’ beliefs when developing appropriate classroom norms (Bielaczyc, 2006). Although there is much
research interest in knowledge building, relatively little is known about how students themselves view their collaborative efforts as
being aligned with knowledge-building and how such views would predict their participation in Knowledge Forum. Prior research has used
a qualitative approach to examine how students assess their own collaborative knowledge building (van Aalst & Chan, 2007; Lee et al.,
2006). This study employs a quantitative approach to examine a large sample of students and their understanding of knowledge building.
We designed a questionnaire to capture knowledge building, adapted from the twelve principles to investigate students’ views on how they
collaborate, and to predict the influence of their views on their online participation.
2.2. Approach to learning
Another key theme of this study relates to students’ approach to learning, that is, their intentions, motives, and strategies as they go
about learning (Biggs, 1987; 1993; Marton & Booth, 1997). As this involves both motives-intentions (why) and use of strategies (how), we
consider that this construct encompasses both beliefs and approaches to learning. Three decades of research, encompassing studies conducted in different continents, have shown that students vary in their approaches to learning and that these differences influence their
academic achievement (Biggs, 1999; Marton & Säljö, 1976; Watkins, 1998).
Researchers from both qualitative and quantitative traditions have distinguished between deep and surface approaches to learning. For
a surface approach, the motive (intention) is basically extrinsic – students believe learning is about meeting
requirements and so they complete their task with minimum effort. Students adopting a surface strategy usually focus on surface features,
and tend not to see either connections among elements, or the broader meaning and implications of what is learned. In contrast, a deep
motive (intention) is related to the belief that learning is to understand with intrinsic motivation and commitment. Students who adopt
a deep strategy focus on relating ideas and linking new information to their previous knowledge and personal experience; they employ
meaning-making strategies to construct conceptual knowledge for deep understanding.
Student approaches to learning have become a major strand of research over the past few decades (Biggs, 1999; Entwistle & Ramsden,
1983; Marton & Booth, 1997). Substantial research evidence has accumulated that demonstrates the relevance of this construct for students
in both Western and Asian countries (Watkins, 1998). Many tertiary programs and research studies have used these questionnaires to assess
the quality of students’ learning (Biggs, 1999; Prosser & Trigwell, 1999). Recently, the Learning Process Questionnaire (Biggs, 1987) has been
refined and now identifies eight dimensions of student learning: intrinsic interest, commitment to work, relating ideas, and understanding
characterize a deep approach; and fear of failure, aiming for qualifications, minimizing the scope of study, and memorization characterize
a surface approach (Kember, Biggs, & Leung, 2004). The refinement improves the scale’s psychometric properties, and facilitates its use to
examine teaching and learning across different curricula and educational environments.
Despite the influence and impact of research on student approaches to learning, most studies have been conducted in regular classrooms.
Thus far, very few studies have examined students’ approaches to learning in computer-supported learning environments (for an exception, see
Ellis, Goodyear, Calvo, & Prosser, 2008). Learning approaches have been examined in relation to self-regulated learning (Cantwell & Moore, 1996),
intellectual style (Zhang & Sternberg, 2006) and epistemological beliefs (Cano, 2005), yet few studies have examined approaches to learning in
relation to approaches to collaboration. Given recent changes in learning paradigms and the proliferation of web-based learning around the
world, it is important to examine how students’ approaches to learning, which are often individually based, relate to their approaches to
collaboration within a group or community, particularly when such collaboration is mediated by technology in an online learning environment.
2.3. Online participation in computer-supported collaborative learning
Although there is much enthusiasm for CSCL, examining and fostering online participation among students remains a key problem.
Research has shown that most online discussion threads are short and fragmented (Hewitt, 2005; Lipponen et al., 2003), and even
sophisticated knowledge-building environments have shown highly variable levels of participation. Stahl et al. (2006) point out that CSCL
environments are often used to exchange personal opinions and surface knowledge, rather than for collaborative knowledge building. These
findings on CSCL participation spark our interest in examining how student beliefs and approaches might influence their level of participation and engagement in CSCL.
We first review CSCL studies that examine the factors influencing student engagement and online participation. Researchers have
investigated various learner characteristics that influence forum participation, including gender, ability, social-cultural background, and
individual popularity (Prinsen, Volman, & Terwel, 2007). Studies of social network analysis have also investigated how gender and ability
influence online participation (Hakkarainen & Palonen, 2003). Many of these studies have focused on what Biggs (1999) calls presage
(background) factors such as gender and ability rather than cognitive factors that can be changed through instruction.
This study examines both background factors and student cognitive factors that influence online participation. Specifically, although an
increasing number of studies address student preferences and perceived learning outcomes in web-based learning, few directly examine
students’ beliefs and approaches in relation to actual measures and patterns in online participation. Some exceptions include studies that
have investigated how students’ self-efficacy influences online search strategies (Tu, Shih, & Tsai, 2008) and how students’ epistemic beliefs
are related to web-based behavior (Mason & Boldrin, 2010). The study reported here furthers this direction by investigating how students’
views of learning and collaboration predict their online forum participation.
A related theme pertains to the notion of online learner participation: Hrastinski (2009) argues that online participation is important for
understanding online learning, and expresses concerns about measuring online participation by merely counting the number of messages
posted on discussion boards. There are now major strands of research on the detailed qualitative coding of online contributions. To
address our questions with a large sample, we also employed quantitative indices, but we broadened the notion of online participation to
go beyond the total number of notes posted on the forum. We employed Analytic Toolkit (ATK), the analysis tool developed alongside
Knowledge Forum (Burtis, 1998) at the University of Toronto, to measure online participation.
Analytic Toolkit uses server log information to generate different basic knowledge-building indices. These indices have been employed to
facilitate formative assessment of student participation on Knowledge Forum so that teachers can help students to improve their
engagement. In addition, researchers use the quantitative indices generated from Analytic Toolkit to provide a basic measure of knowledge
building to complement their qualitative analyses (Lai & Law, 2006; Lee et al., 2006; Niu & van Aalst, 2009). As different Analytic Toolkit
indices point to related aspects of participation, researchers have grouped these, using factor analysis, to provide more reliable measures
and for further analyses. These combined indices are correlated with the quality of questions and explanations in Knowledge Forum
writing, thus demonstrating good construct validity (van Aalst & Chan, 2007; Lee et al., 2006). The current study sought to conceptualize
online participation beyond the posting of notes to illustrate the notion that online participation involves students constructing meaning,
engaging in dialogue, and developing community awareness.
2.4. Research goals and questions
This study integrates the three areas of collaborative knowledge building, approaches to learning, and CSCL participation to examine an
integrated conceptual model – students’ approaches to learning will influence how they engage in collaboration, which will, in turn,
influence their online participation. The present study investigates students’ beliefs and approaches to collaboration and learning in the
context of their working on Knowledge Forum. We use the terms “beliefs” and “approaches” to depict what students believe about the
nature of their collaboration and how they approach that collaboration. Approaches to learning include both beliefs involving motive-intentions of why they study and strategies depicting how they study in certain ways.
A questionnaire was first developed to examine students’ views of collaboration that align with the knowledge-building perspective. For
construct validation, individual differences in collaboration, learning approaches, and forum participation were investigated. Finally, we
investigated the relationships among these variables to test a causal model of how approaches to learning influence collaboration, which in
turn predicts forum participation. The research questions include:
(1) What are students’ views about their collaboration, and can they be identified from a knowledge-building perspective?
(2) What are the gender, grade, and ability differences in students’ views about collaboration, approaches to learning, and engagement in
online participation?
(3) How do students’ views about learning and collaboration predict their online participation in Knowledge Forum?
3. Method
3.1. Participants
The sample includes 521 secondary school students in Forms One to Six (ages 12–17) from eight secondary schools in Hong Kong. These
participants were involved in a research project on computer-supported knowledge building. The sample includes 322 male and 199 female
students, with 216 from junior high (Grades 7–9, ages 12–14) and 305 from senior high schools (Grades 10–12, ages 15–17). Students in
Hong Kong are streamed into different bands according to their academic achievements; there were 267 students from high-band schools
and 254 students from low-band schools.
3.2. The research context
This study took place in the context of a University-School Partnership project on developing knowledge-building pedagogy for
elementary and secondary teachers in Hong Kong. The first author has had years of experience in conducting knowledge-building
classroom research in Asian classrooms; this project extended knowledge-building to groups of teachers in different schools (Chan,
in press). The context of the project included university researchers/mentors providing professional development to teachers. There
were regular workshops throughout the year to help teachers better understand knowledge-building epistemology and pedagogy;
groups of project teachers meeting to plan their curricula collectively; and classroom visits with university researchers and
teachers.
Regarding knowledge building pedagogy, in a typical knowledge-building classroom, students usually start by identifying areas of
inquiry and putting forth their ideas and questions; ‘making ideas public’ for collective improvement is emphasized (Scardamalia & Bereiter,
2006). In Asian classrooms, it is particularly important for students to experience working together as a community. In this project,
classroom and online discourse were integrated, with students contributing notes to Knowledge Forum as they engaged in collaborative
inquiry – posing questions, putting forth ideas and theories, building on others’ ideas, and co-constructing explanations to advance their
collective knowledge.
Data were obtained from twelve teachers teaching in eighteen classrooms. The selection criteria were that these teachers had to have
actively participated in the teacher professional development project, and to have demonstrated good attempts at incorporating knowledge
building pedagogy into their classrooms. For example, they were included for having completed at least an 8–12-week curriculum unit on
knowledge building; those project teachers who had not spent reasonable time on knowledge-building were not included. It should be
noted that the knowledge building model involves complex dynamics with a continuum of classroom practice. Our research context,
consisting of different classrooms of teachers working on knowledge building, may be helpful for understanding variation in students’
beliefs about learning and collaboration. As noted above, in terms of duration, we included those teachers who had completed a curriculum
unit of about eight to twelve weeks. Although it is common for knowledge-building teachers to develop their practices over the whole year,
and even over multiple years (Zhang et al., 2009), knowledge-building teachers typically plan and implement curriculum units of specified
duration. For the purpose of research analyses, these durations have been reported in research studies on knowledge building in both
Canadian (van Aalst, 2009; Caswell & Bielaczyc, 2002) and Asian classrooms (van Aalst & Chan, 2007; So, Seah, & Toh-Heng, 2010).
3.3. Measures
Data were collected from two questionnaires examining students’ views of collaboration and online learning, and their preferred
approaches to learning. After examining the questionnaire data, we excluded items on online learning that showed variable responses, and
focused on the questionnaire items on knowledge-building and approaches to learning. We also employed students’ usage statistics on
Knowledge Forum derived from Analytic Toolkit to examine their online forum participation.
3.3.1. Questionnaire on collaboration
The questionnaire, comprising 12 items, written in Chinese, examined students’ views of collaboration aligned with the notion of
knowledge building (Scardamalia & Bereiter, 2006). Students were asked to use a 5-point Likert scale to rate the questionnaire items that
reflected their experience of collaboration while working on knowledge building. In assessing these items, the students could refer to both
face-to-face and online collaboration.
Scardamalia has developed a set of 12 principles, and we adapted these principles as indicators of knowledge building. Theoretically,
as discussed in the Introduction, although these principles may represent different socio-cognitive dynamics, Scardamalia (2002) has
emphasized that these 12 principles work together as a complex system, each principle providing a different lens. Researchers and teachers
are well aware that the principles often overlap, and Scardamalia has suggested that one principle may unlock others. Over the years, varied
efforts have been made to group this large set of principles, but few research analyses have produced reliable clusters or subsystems of principles. It is plausible that the principles work together as a highly cohesive system.
Our goal, therefore, was not to focus on measuring or classifying the principles, but to tap into some overall construct of knowledge
building. Rather than using one item to measure one principle, we aimed to see how different items adapted from the principles may
combine to reflect the notion of collaboration aligned with knowledge building. We adapted from the entire set of principles so as
not to miss particular aspects of knowledge building.
On practical and empirical grounds, when designing the questionnaire, we asked teachers to check the items for face validity and
ease of administration, and simplified the wording of the items to make them comprehensible to high-school students. Although it may
be preferable to have more items, this shorter version was adopted so that the questionnaire would not be too onerous for students. To
reiterate, our focus was on examining a construct of collaboration aligned with the notion of knowledge building, focusing on idea
improvement and collective knowledge growth.
3.3.2. Questionnaire on approach to learning
Approaches to learning have been studied for over two decades. This study employed the Revised Learning Process Questionnaire
(Kember et al., 2004) to assess students’ approaches to learning. The questionnaire consists of 22 items that identify deep and surface
approaches to learning, each of which can be further classified into four dimensions. Some examples of deep approach questionnaire items
include “I feel that nearly any topic can be highly interesting once I get into it” and “I like constructing theories to fit odd things together”;
surface approach questionnaire items include “I see no point in learning material which is not likely to be in the examination” and “I find the
best way to pass examinations is to try to remember answers to likely questions.” The Chinese version of the questionnaire has been shown
to have good psychometric properties.
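Scoring such a Likert-type questionnaire amounts to averaging each respondent’s 5-point ratings over the items belonging to each subscale. The sketch below illustrates this computation in Python; the item-to-subscale mapping is a placeholder of our own, not the actual assignment of the 22 items in Kember et al. (2004).

```python
import numpy as np

# Placeholder column indices for illustration only; the real 22-item
# assignment of the Revised Learning Process Questionnaire is not shown here.
DEEP_ITEMS = [0, 2, 4, 6]
SURFACE_ITEMS = [1, 3, 5, 7]

def score_lpq(responses: np.ndarray) -> dict:
    """Mean Likert rating per approach for each respondent.

    responses: (n_students, n_items) matrix of 1-5 ratings.
    Returns per-student mean scores for the deep and surface subscales.
    """
    return {
        "deep": responses[:, DEEP_ITEMS].mean(axis=1),
        "surface": responses[:, SURFACE_ITEMS].mean(axis=1),
    }
```

Under this scheme, a student rating every deep item 5 and every surface item 1 would receive a deep score of 5.0 and a surface score of 1.0.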
3.3.3. Online participation in Knowledge Forum
To measure students’ online forum participation, Analytic Toolkit was used to retrieve and analyze summary statistics on individual
students’ activity in Knowledge Forum. Analytic Toolkit Version 4.6 provides up to 27 analyses to show how students interact with each
other in the Knowledge Forum database. We selected several of the most frequently employed indices from previous studies, including
those that have been grouped into overall indices with good construct validity with quality of forum writing (e.g., van Aalst & Chan, 2007;
Lee et al., 2006; Niu & van Aalst, 2009). The indices are as follows:
(i) Number of notes written: This is included because it is the most commonly used index for measuring online participation.
(ii) Scaffolds: This index refers to the number of scaffolds (thinking prompts) used. Knowledge Forum includes scaffolds such as “I need to
understand”, “a better theory”, and “putting our knowledge together”. Scaffolds help students to frame ideas and to signpost their ideas
to others for interaction and dialogue.
(iii) Revision: Students’ attempts to revise their notes are recorded. From a knowledge-building perspective, revision shows a deeper
approach to working with ideas. Instead of employing a linear approach, ideas are revisited and revised based on the contributions of
the community.
(iv) Number of notes read: The number of notes read has been considered important for assessing community awareness; one cannot
engage in dialogue without knowing what others have written (Zhang et al., 2009).
(v) Number of build-on notes: This index is different from the number of posted notes, and refers to responses to previous notes. This index
provides more information about interaction among participants.
(vi) Keywords: Students can include “keywords” when they write notes on Knowledge Forum. Other participants can use these keywords
to search for related notes on similar topics. The use of keywords reflects domain knowledge and community awareness as students try
to make their work more accessible to other members.
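Computationally, the six indices above amount to per-student tallies over forum activity logs. The sketch below illustrates the idea with a made-up event format; Analytic Toolkit’s actual server-log schema is not reproduced here, and the student names and action labels are hypothetical.

```python
from collections import Counter, defaultdict

# Hypothetical (student, action) event records; "action" is one of:
# "write", "scaffold", "revise", "read", "build_on", "keyword".
EVENTS = [
    ("amy", "write"), ("amy", "read"), ("amy", "read"),
    ("amy", "build_on"), ("ben", "write"), ("ben", "revise"),
]

def participation_indices(events):
    """Tally each participation index per student from a flat event log."""
    tallies = defaultdict(Counter)
    for student, action in events:
        tallies[student][action] += 1
    return tallies
```

For the sample log above, `participation_indices(EVENTS)["amy"]["read"]` is 2, and an index a student never triggered simply tallies to 0.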
4. Results
4.1. Approach, procedure and criteria
The various constructs were first examined using exploratory factor analyses, confirmatory factor analyses and scale reliabilities, after
which individual differences were tested to provide aspects of construct validity of these measures. Finally, we conducted SEM to test an
overall model including both measurement and structural relationships among the variables – the model posits that students’ preferred learning approaches influence their collaboration, which in turn predicts their online participation.
Exploratory factor analyses (EFA), confirmatory factor analyses (CFA) and Cronbach alphas were conducted to examine the fit, validity,
and reliability of the measurement models. Researchers have discussed a variety of fit indices; the criteria we employed are reported below. For
CFA of knowledge-building, learning approaches and online participation, we followed Kember et al. (2004) to allow for comparison
between our measurements and the original scale of learning approach, and used the 2-index strategy (Hu & Bentler, 1999) to evaluate
Comparative Fit Index (CFI) and Standardized Root Mean Square Residual (SRMR). This strategy has been shown to control for both Type I
and Type II errors. Researchers have indicated that CFI > .9 (McDonald & Ho, 2002, p. 72) and SRMR < .08 (Hu & Bentler, 1999, p. 27) indicate
an acceptable fit. In addition, we have also included several commonly-used goodness of fit indices.
As SEM of the overall hypothesized model is critical, we included additional indicators with these criteria: (1) Bentler Comparative Fit
Index (CFI), with values greater than .9 indicating an acceptable fit (McDonald & Ho, 2002), (2) Jöreskog-Sörbom Goodness of Fit Index (GFI), with
values greater than .9 indicating an acceptable fit (McDonald & Ho, 2002), (3) Tucker-Lewis Index (TLI), with values greater than .9 indicating
acceptable fit (McDonald & Ho, 2002), (4) Root Mean Square Residual (RMR), with values lower than .05 indicating an acceptable fit
(McDonald & Ho, 2002), (5) Root Mean Square Error of Approximation (RMSEA), with values of .06 or less indicating a good fit (Hu & Bentler,
1999), and (6) Standardized Root Mean Square Residual (SRMR), with values of .08 or less as acceptable (Hu & Bentler, 1999). While there are
concerns with chi-square values due to the sample size, we have included them here for information: (7) chi-square (χ²) statistics and degrees of
freedom (df), with a χ²/df ratio of less than 3 indicating a good fit (Chen, 2008; Kline, 1998). This combination of reported
fit indices was chosen so as to include both absolute and incremental fit measures (Hair, Black, Babin, Anderson, & Tatham, 2006; Tabachnik
& Fidell, 2007). Both CFA and SEM were conducted using the software program AMOS 16. Missing data have been replaced by the mean
scores of the corresponding variables.
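The cut-off rules listed above can be collected into a small helper. The sketch below is illustrative only: the function name and dictionary keys are our own, and the values shown are the fit indices reported later for the structural model, not output read directly from AMOS.

```python
def acceptable_fit(indices):
    """Check a dict of SEM fit indices against the cut-offs cited above
    (Kline, 1998; McDonald & Ho, 2002; Hu & Bentler, 1999)."""
    return {
        "chi2/df": indices["chi2/df"] < 3,    # Kline (1998)
        "CFI":     indices["CFI"] > .9,       # McDonald & Ho (2002)
        "GFI":     indices["GFI"] > .9,       # McDonald & Ho (2002)
        "TLI":     indices["TLI"] > .9,       # McDonald & Ho (2002)
        "RMR":     indices["RMR"] < .05,      # McDonald & Ho (2002)
        "RMSEA":   indices["RMSEA"] <= .06,   # Hu & Bentler (1999)
        "SRMR":    indices["SRMR"] <= .08,    # Hu & Bentler (1999)
    }

# The hypothesized structural model reported in Section 4.4.2 passes
# every criterion:
checks = acceptable_fit({"chi2/df": 1.769, "CFI": .950, "GFI": .938,
                         "TLI": .946, "RMR": .045, "RMSEA": .038,
                         "SRMR": .042})
```

Applying the same helper to the less satisfactory comparison model (χ²/df = 3.809, CFI = .819, and so on) fails several of these checks, which is the basis for preferring the hypothesized model.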
4.2. Factor analyses and scale reliability
4.2.1. Collaborative knowledge building
Exploratory factor analysis of the 12 questionnaire items indicated one factor accounting for 37.49% of the variance. This factor, called "Collaborative Knowledge Building", reflects students' views about their collaboration, adapted from a set of knowledge-building principles. As the study's goal was to test the theoretically guided one-factor model, CFA was employed to obtain more information, yielding the following goodness-of-fit indices: CFI = .931, GFI = .951, RMR = .047 and SRMR = .047, suggesting an acceptable model fit. Cronbach's alpha for the 12 collaborative knowledge building items is .85, indicating good scale reliability (Table 1).
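Cronbach's alpha can be computed directly from an item-score matrix. The sketch below uses NumPy with made-up scores purely to show the formula; it does not use the study's data.

```python
import numpy as np

def cronbach_alpha(items):
    """items: respondents x items array of questionnaire scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items (12 for CKB)
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the scale total
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Perfectly consistent (identical) items give the maximum alpha of 1.0:
alpha = cronbach_alpha([[1, 1], [2, 2], [3, 3]])
```

Higher inter-item covariance inflates the variance of the scale total relative to the sum of item variances, which is what pushes alpha towards 1.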
4.2.2. Approach to learning
Exploratory factor analysis of the 22 items indicates that the questionnaire items load onto the two expected factors of Deep and Surface approaches (Table 2). These two factors explain 19.04% and 12.54% of the total variance, respectively. Cronbach's alphas for the Deep and Surface scales are .83 and .70, respectively. We further employed CFA, following the approach used by Kember et al. (2004), to test a more refined model. The Deep Approach model tested consists of four factors: (a) intrinsic interest, (b) commitment to work, (c) relating ideas, and (d) understanding; the four factors of the Surface Approach are (a) fear of failure, (b) aim for qualification, (c) minimizing scope of study, and (d) memorization. We obtained the following indices: CFI = .956, GFI = .953, RMR = .049 and SRMR = .040, comparable to those obtained by Kember et al. (2004), CFI = .968 and SRMR = .056. Cronbach's alphas for the four deep-approach subscales range from .47 to .66, and those for the four surface-approach subscales from .45 to .64. Although the reliabilities of some subscales are relatively low, all are comparable to the data obtained by Kember et al. (2004) and other researchers. Phan (2007, p. 726), for example, reports similar results and notes that subscale reliabilities are influenced by the number of items and the multi-dimensionality of the model.
4.2.3. Online forum participation
Analytic Toolkit was used to generate basic knowledge-building indices for all 521 students before transferring the data for analyses
using SPSS. To control for varied forum participation in different classrooms, the scores were converted into standardized scores for
analyses. Subsequent analyses showed similar patterns using raw scores and standardized scores.
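The within-classroom standardization described above can be expressed as a groupby transform. The column names and counts below are invented for illustration; they are not the study's Analytic Toolkit data.

```python
import pandas as pd

# Hypothetical raw ATK counts for students in two classrooms
df = pd.DataFrame({
    "classroom": ["A", "A", "A", "B", "B", "B"],
    "notes_written": [10, 20, 30, 100, 200, 300],
})

# Convert each student's count to a z-score within his or her own
# classroom, so that classes with very different overall activity
# levels become comparable
df["notes_written_z"] = df.groupby("classroom")["notes_written"].transform(
    lambda s: (s - s.mean()) / s.std(ddof=1)
)
# Both classrooms now yield the same standardized pattern: -1, 0, 1
```

Because the transform is computed per class, a moderately active student in a quiet class and a moderately active student in a busy class receive comparable scores, which is the point of the conversion.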
Previous studies using factor analyses of Analytic Toolkit (ATK) knowledge-building indices have shown that these indices load on one common factor (van Aalst & Chan, 2007; Niu & van Aalst, 2009), while other studies have shown a two-factor solution (Lee et al., 2006). On theoretical grounds, we used CFA to test whether online participation can be examined in terms of (a) meaning construction and (b) community awareness (Table 3). Testing confirmed our expectation: the notes-written, scaffold and revision forum indices loaded on one factor we called "Meaning Construction"; the other three forum indices, notes read, build-on notes and use of keywords, loaded onto another factor we called "Community Awareness". CFA yielded goodness-of-fit indices of CFI = .967, GFI = .950, RMR = .044 and SRMR = .026. Cronbach's alphas for Meaning Construction and Community Awareness are .83 and .72, respectively.
4.3. Differences in collaboration, approach to learning, and online participation
Analyses were conducted to examine whether there were differences by grade, gender and achievement in collaborative knowledge building, approaches to learning and online participation. The aim was to examine whether these identified constructs showed reasonable patterns with regard to students' background characteristics. Students were grouped according to gender (male and female), grade (junior secondary and senior secondary), and achievement level (low-band and high-band).
4.3.1. Group differences in collaborative knowledge building
Table 4 shows the mean scores of Collaborative Knowledge Building across groups. A three-way MANOVA (gender × grade × achievement) on students' scores for Collaborative Knowledge Building was conducted. Analyses show significant differences between achievement levels in Collaborative Knowledge Building (F(6, 499) = 11.79, p < .001, η² = .023), with high-achieving students (M = 3.59) obtaining higher scores than low-achieving students (M = 3.37). A significant main effect for gender is also shown (F(6, 499) = 4.36, p < .05, η² = .009), with female students obtaining higher scores (M = 3.50) than male students (M = 3.48). There are no differences by grade level, and interaction effects are not significant. Table 5 provides a summary of differences in these variables as a function of grade, gender and achievement.
4.3.2. Group differences in approach to learning
Table 4 also shows the mean scores for Deep Approach and Surface Approach across groups. A three-way MANOVA (gender × grade × achievement) indicates a marginally significant main effect of achievement level on Deep Approach (F(6, 499) = 3.62, p < .06, η² = .007), favoring students with high achievement levels (M = 3.12) over those with low achievement levels (M = 2.95). A significant main effect of gender on Surface Approach (F(6, 499) = 5.49, p < .05, η² = .011) indicates that male students employed a surface approach more (M = 2.95) than did female students (M = 2.89). There are no significant grade differences or interaction effects (see Table 5).
Table 1
Factor loadings and Cronbach's alpha for collaborative knowledge building.

Collaborative knowledge building (Cronbach's alpha = .85)                              Factor loading
CKB1  We work on improving our ideas continually during the process of inquiry.            .55
CKB2  Our views and knowledge broaden through working with others.                         .63
CKB3  Ideas from different members are synthesized into new knowledge.                     .59
CKB4  Class members pose different ideas with diverse perspectives.                        .49
CKB5  Ideas from different class members are equally valuable.                             .47
CKB6  Goal setting and planning for our progress is important.                             .53
CKB7  Ideas are exchanged to improve our knowledge.                                        .60
CKB8  We reflect on and assess the progress of our understanding continually.              .48
CKB9  Different groups can benefit each other and make progress together.                  .57
CKB10 Different sources of reference information are examined for building knowledge.      .62
CKB11 The knowledge we work on is relevant to real-life problems.                          .59
CKB12 Our ideas and knowledge are relevant within and outside the school context.          .57
Table 2
Factor loadings and Cronbach's alphas for approaches to learning.

                                                                                          Factor loading
Deep Approach (Items = 11): Cronbach's alpha = .83
  Intrinsic Interest (Items = 3): alpha = .65
    D1  I find that at times studying makes me feel really happy and satisfied.                .75
    D2  I feel that nearly any topic can be highly interesting once I get into it.             .77
    D3  I work hard at my studies because I find the material interesting.                     .78
  Commitment to Work (Items = 4): alpha = .66
    D4  I spend a lot of my free time finding out more about interesting topics which
        have been discussed in different classes.                                              .74
    D5  I come to most classes with questions in mind that I want answering.                   .66
    D6  I find I am continually going over my school work in my mind at times like when
        I am on the bus, walking or lying in bed, and so on.                                   .70
    D7  I like to do enough work on a topic so that I can form my own conclusions before
        I am satisfied.                                                                        .77
  Relating Ideas (Items = 2): alpha = .59
    D8  I try to relate what I have learned in one subject to what I learn in other subjects.  .84
    D9  I like constructing theories to fit things together.                                   .84
  Understanding (Items = 2): alpha = .47
    D10 I try to relate new material, as I am reading it, to what I already know on that
        topic.                                                                                 .80
    D11 When I read a textbook, I try to understand what the author means.                     .81
Surface Approach (Items = 11): Cronbach's alpha = .70
  Fear of Failure (Items = 2): alpha = .64
    S1  I am discouraged by a poor mark on a test and worry about how I will do on the
        next test.                                                                             .85
    S2  Even when I have studied hard for a test, I worry that I may not be able to do
        well in it.                                                                            .86
  Aim for Qualification (Items = 2): alpha = .54
    S3  Whether I like it or not, I can see that doing well in school is a good way to get
        a well-paid job.                                                                       .85
    S4  I intend to get my A Levels because I feel that I will then be able to get a
        better job.                                                                            .80
  Minimizing Scope of Study (Items = 4): alpha = .63
    S5  I see no point in learning material which is not likely to be in the examination.      .68
    S6  As long as I feel I am doing enough to pass, I devote as little time to studying
        as I can. There are many more interesting things to do.                                .67
    S7  I generally restrict my study to what is specifically set as I think it is
        unnecessary to do anything extra.                                                      .69
    S8  I find it is not helpful to study topics in depth. You don't really need to know
        much in order to get by in most topics.                                                .71
  Memorization (Items = 3): alpha = .45
    S9  I learn some things by rote, going over and over them until I know them by heart.      .70
    S10 I find the best ways to pass examinations is to try to remember answers to likely
        questions.                                                                             .72
    S11 I find I can get by in most assessments by memorizing key sections rather than
        trying to understand them.                                                             .65
4.3.3. Group differences in online participation
Fig. 1 shows the mean frequency distribution of forum participation indices across groups. As discussed above, the Analytic Toolkit (ATK) knowledge-building indices are grouped into two overall indices: Meaning Construction and Community Awareness. A three-way MANOVA (gender × grade × achievement) on Meaning Construction shows a significant main effect of achievement, with high-achieving students obtaining higher scores than low-achieving students (F(6, 496) = 5.96, p < .05, η² = .014); a significant grade effect (F(6, 496) = 11.54, p < .001, η² = .026), with higher scores for senior students than for junior students; and a significant main effect of gender (F(6, 496) = 5.32, p < .05, η² = .012), with male students obtaining higher scores than female students. In addition, a gender-by-grade effect (F(6, 499) = 6.16, p < .05, η² = .014) indicates that male students obtained higher scores than female students among junior but not senior students.
A three-way MANOVA (gender × grade × achievement) on Community Awareness indicates significant effects of achievement, with high-achieving students obtaining higher scores than low-achieving students (F(6, 496) = 4.27, p < .05, η² = .010). A significant main effect of gender (F(6, 496) = 4.58, p < .05, η² = .010), with male students obtaining higher scores than female students, and a similar grade-by-gender interaction effect were also obtained.
A three-way MANOVA (grade × gender × achievement) was then conducted on the individual online forum participation indices. Table 6 summarizes the differences for the forum participation indices as a function of grade, gender, and achievement.
Taken together, these results indicate that high-achieving students reported more sophisticated beliefs and approaches and, like the older students, had higher forum participation. Female students reported higher scores on beliefs and approaches, but gender effects on online forum participation varied across grades for different indices.
Table 3
Factor loadings and Cronbach's alphas of Analytic Toolkit indices for online participation.

Online Forum Participation         Factor loading   Cronbach's alpha
Factor 1: Meaning construction                            .83
  Number of notes written               .92
  Number of scaffolds                   .85
  Number of revisions                   .87
Factor 2: Community awareness                             .72
  Number of notes read                  .73
  Number of build-ons                   .89
  Number of keywords                    .88
Table 4
Mean scores (SD in parentheses) of collaborative knowledge building, deep approach and surface approach for grade, gender and achievement groups.

                                 High-Achieving (n = 267)     Low-Achieving (n = 254)
                                 Male         Female          Male         Female
Younger students (Ages 12–14)
  Knowledge building             3.59 (.59)   –               3.29 (.55)   3.45 (.56)
  Deep approach                  3.19 (.65)   –               2.94 (.55)   2.97 (.62)
  Surface approach               2.99 (.61)   –               2.96 (.48)   2.79 (.58)
Older students (Ages 15–17)
  Knowledge building             3.60 (.65)   3.60 (.49)      3.10 (.65)   3.49 (.55)
  Deep approach                  3.14 (.65)   2.87 (.54)      2.88 (.60)   2.96 (.53)
  Surface approach               2.86 (.58)   2.74 (.56)      3.07 (.44)   2.88 (.49)
4.4. Relations among collaboration, learning approach and online forum participation
4.4.1. Correlation analyses
We first examined correlations among the different measures: Deep and Surface Approaches, Collaborative Knowledge Building, and Online Participation (i.e., Meaning Construction and Community Awareness) (Table 7). The results indicate that Collaborative Knowledge Building correlates significantly with Deep Approach (r = .65, p < .01), Online Meaning Construction (r = .24, p < .01) and Online Community Awareness (r = .19, p < .01). Deep Approach also correlates significantly with Online Community Awareness (r = .18, p < .01) and Online Meaning Construction (r = .16, p < .01). Correlations between Surface Approach and the two indices of online participation are not significant.
4.4.2. Structural equation modeling
SEM was conducted to provide a coherent picture and to test a model based on the hypothesis that students' preferred learning approaches would influence their collaborative knowledge building, which would in turn exert effects on their online participation. The model tested included four latent variables (Deep Approach, Surface Approach, Collaborative Knowledge Building and Online Forum Participation), with indicators and aggregate scores derived from the measurement models reported earlier. The hypothesized SEM model, with standardized path solutions, is depicted in Fig. 2. Significant paths indicated that Deep Approach predicts Collaborative Knowledge Building, which, in turn, exerts a significant effect on Online Participation. Testing the model yields the following indices (χ² = 376.71, df = 213, χ²/df = 1.769, p < .001; CFI = .950; GFI = .938; TLI = .946; RMR = .045; RMSEA = .038; SRMR = .042), indicating a good fit. Table 8 summarizes the indices of the structural model, comparing them to common acceptable criteria. We also tested a comparison model that included a direct path from Deep Approach to Online Forum Participation; SEM analysis indicated that its goodness-of-fit indices were less satisfactory (χ² = 811.4, df = 213, χ²/df = 3.809, p < .001; CFI = .819; GFI = .879; TLI = .806; RMR = .153; RMSEA = .073; SRMR = .151). These results support the hypothesized relationship: Deep Approach predicts Collaborative Knowledge Building, which in turn predicts Online Forum Participation. Collaborative Knowledge Building exerts a direct effect on Online Participation and mediates the effects of Deep Approach on Online Participation.
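The mediation pattern reported here (a deep approach influencing participation only through collaborative knowledge building) can be illustrated with ordinary least squares on synthetic data. The effect sizes below are invented, and the sketch is not a substitute for the latent-variable AMOS analysis; it only shows how a direct path can vanish once the mediator is controlled for.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Simulate the hypothesized chain with made-up coefficients:
# Deep Approach -> Collaborative Knowledge Building -> Online Participation
deep = rng.normal(size=n)
ckb = 0.7 * deep + rng.normal(scale=0.5, size=n)
participation = 0.6 * ckb + rng.normal(scale=0.5, size=n)

def slopes(X, y):
    """OLS slope estimates, with an intercept column prepended."""
    X1 = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X1, y, rcond=None)[0][1:]

# Total effect of deep approach, ignoring the mediator (about .7 * .6):
total_effect = slopes(deep[:, None], participation)[0]
# Direct effect after controlling for the mediator (near zero):
direct_effect = slopes(np.column_stack([deep, ckb]), participation)[0]
```

Once the mediator enters the regression, the deep-approach slope collapses towards zero while the mediator carries the effect, mirroring the full mediation found in the SEM.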
5. Discussion
The study reported in this paper investigates students’ views of their collaboration and learning in relation to their online participation in
Knowledge Forum. Although online discussion was commonplace, there were widespread problems involving fragmented discussion
among students. Many students may not have understood that the goal of online discussion is knowledge building; instead, they may have
regarded it as a conversational exchange or a sharing of information. Our key results indicate that those students who viewed their
collaboration as more aligned with knowledge building were more engaged in online participation in Knowledge Forum.
5.1. Students’ views on collaboration
Despite the increased level of interest in students' conceptions of and attitudes towards web-based learning, relatively little is known about students' views of their collaboration when they work in an online forum. The questionnaire developed in this study, adapted from a set of knowledge-building principles (Scardamalia, 2002), tapped into students' views of how they collaborate when working on a knowledge-building curriculum. Based on the current research literature on knowledge building, we hypothesized a one-factor model. Confirmatory factor analysis and good scale reliability suggest that students have some coherent ideas about collaboration that are aligned
Table 5
Differences across groups (F-values) for collaborative knowledge building, deep approach and surface approach as a function of gender, grade and achievement.

                        Collaborative Knowledge Building   Deep Approach   Surface Approach
Gender                  4.36b                              .27             5.49b
Grade                   .04                                .14             .01
Achievement             11.79a                             3.62            1.22
Gender by Grade         .87                                .07             .01
Grade by Achievement    .84                                .00             1.3
Gender by Achievement   2.49                               1.03            .087

Notes: a significant at the .01 level (2-tailed); b significant at the .05 level (2-tailed).
[Figure: three bar charts of the six forum participation indices (Write, Scaffold, Revision, Read, Build-On, Keyword); the y-axis shows the mean number of times.]
Fig. 1. Mean frequency distribution of forum participation indices across groups: by age (top), by achievement (middle), and by gender (bottom).
with knowledge building. Students who scored higher on this scale view collaboration more as working on idea improvement and knowledge advances in the community.
Our findings also indicate that students who prefer a deep approach to learning are more inclined to engage in knowledge-building collaboration. This result is consistent with findings that students who adopt a deep approach ascribe a higher value to collaboration (Goodyear, Jones, Asensio, Hodgson, & Steeples, 2005). No correlations were found between a surface approach and knowledge-building collaboration: students who prefer to expend minimal effort do not see themselves as being involved in knowledge-building collaboration.
Furthermore, analyses of individual differences indicate that higher-achieving students obtain higher scores on deep approach and
collaborative knowledge building. This provides further evidence of construct validity, and is consistent with the literature showing that
more mature students have more sophisticated beliefs and learning strategies.
While approaches to learning have been examined widely in regular classrooms, this study is one of the few to have investigated the
relationship between learning approaches and collaboration among students engaged in computer-supported collaborative learning. The
findings show the congruence between individually-based deep approaches and community-based knowledge-building – students who
Table 6
Differences across groups (F-values) for online participation indices as a function of gender, grade, and achievement.

                        Write     Scaffold   Revise   Read    Build-on   Keyword   Meaning Construction   Community Awareness
Gender                  3.16      9.08b      2.15     5.09c   3.31       1.32      5.32c                  4.58c
Grade                   .52       34.7a      8.3b     19.5a   .49        8.26a     11.54b                 1.41
Achievement             1.36      31.6       .12      .19     3.63c      17.36c    5.96c                  4.27c
Gender by grade         4.87c,d   5.93c,d    3.7      1.98    3.88c,d    1.65      6.16b,d                3.66c,d
Grade by achievement    1.72      1.56       1.02     1.91    .30        11.9d,e   .84                    .66
Gender by achievement   2.67      3.24       2.43     2.14    1.94       .77       3.5                    2.35

Notes: a significant at the .001 level (2-tailed); b significant at the .01 level (2-tailed); c significant at the .05 level (2-tailed); d significant differences obtained between male and female students in lower forms but not higher forms; e significant differences obtained between high- and low-achievers in lower forms but not higher forms.
employ a deep approach also favor working collaboratively with others. Students’ commitment to adopting a deep approach to learning is
not just a matter of how they work individually, but can also be related to their shared goals for collaborative knowledge building mediated
by technology. For example, a deep strategy for relating ideas is important for idea improvement, which requires dialogic interaction rather
than simply posting information in an online discourse.
5.2. Learning, collaboration, and online participation
Psychological studies have examined the relations of beliefs and approaches with academic learning outcomes; computer-based studies have investigated students' conceptions of and attitudes towards web-based learning. Our results indicate that preferences for
a deep approach predict collaborative knowledge building, which in turn exerts an effect on online participation in Knowledge Forum. This
study is one of the few to show that online participation in discussion forums is linked directly to students’ beliefs of their collaboration. The
results further support the idea that effective web-based instruction is related to student conceptions. These findings are consistent with the
idea that online forum participation involves metacognitive and epistemological dimensions (Tsai, 2009).
These findings provide further evidence supporting the knowledge-building framework for productive discourse (Scardamalia &
Bereiter, 2006). While the majority of prior studies have used qualitative analysis and classroom research, this study has employed self-reported data from a large sample of students on their beliefs and approaches to collaboration. The results are consistent with prior
qualitative analyses showing that students with higher forum participation exhibit such knowledge-building characteristics as epistemic
agency and improvable ideas. For example, van Aalst and Chan (2007) show that students who employ collective agency via rise-above
assessment are more engaged in forum participation with more advances in domain knowledge.
This study has also examined the relation between approach to learning, collaborative knowledge building and online participation. SEM
analyses indicate that deep approaches predict collaborative knowledge building, but have no direct effects on online participation. In other
words, to help students with online participation, it is useful for them to have preferences for deep approaches. However, these conditions
alone are not sufficient, and it is important that students are supported in developing more sophisticated approaches to collaboration. The
key implication is that, if students are to develop more productive online participation patterns, they need to understand and engage in deep
collaboration. The scale adapted from the knowledge-building perspective may provide some pointers as to how teachers can assess student
progress, and how students can use these items to assess their own progress in collaboration.
This study has also developed some ways to assess online participation using a broader range of indices. Concerns have been raised about
merely using the number of notes posted on the Web as the indicator of online participation (Hrastinski, 2008). Online learning can be
examined as online participation through students actively constructing meaning and contributing to others’ understanding in the
community. The Analytic Toolkit indices employed in this study had been validated previously in studies on knowledge building in
classrooms, using small samples (van Aalst & Chan, 2007); this study has applied these indices to a large group of students. CFA and SEM
analyses show that these indices can be grouped meaningfully. Our indices go beyond posting and receiving materials on the Web; they
reflect a deeper notion of online participation involving meaning construction, learner dialogue and community awareness.
Admittedly, qualitative analyses may provide a more complete picture; however, quantitative indices are useful for providing an overview for comparison (Guzdial & Turns, 2000) and for conducting formative assessment (Chan & van Aalst, 2004). While the Analytic Toolkit indices are designed for use with Knowledge Forum, they need not be considered specific to this computer platform and can be adapted for use with other platforms. The key implication is that these results suggest possibilities for examining online participation in more varied ways.
5.3. Limitations and further research
As the results reported here on collaboration and learning are based on students’ self-reported data, there is a possibility of reporting
biases such as social desirability; hence, they should be interpreted with caution. Some concerns may be raised regarding the relatively low
reliability of some subscales of approach to learning; we have therefore included information from previous studies reporting similar
Table 7
Zero-order correlations among collaborative knowledge building, deep approach, surface approach and online participation (n = 521).

                                   Surface    Collaborative        Online meaning   Online community
                                   approach   knowledge building   construction     awareness
Deep approach                      .10b       .65a                 .16a             .18a
Surface approach                              .06                  -.03             .01
Collaborative knowledge building                                   .24a             .19a
Online meaning construction                                                         .86a

Notes: a significant at the .01 level (2-tailed); b significant at the .05 level (2-tailed).
Fig. 2. A structural equation model of approaches to learning, collaborative knowledge building and online forum participation.
patterns. There may be questions about the selection and grouping of online forum participation indices; we indicate that those indices have
been employed in previous studies. It is important to note that, while we employed quantitative analyses of forum participation, content
analyses of students’ writing on Knowledge Forum are important for examining the quality of students’ knowledge building. Future research
is needed to investigate if students’ beliefs about learning and collaboration are correlated with quality of student discourse and writing
based on content analyses.
The questionnaire examining collaborative knowledge building consists of twelve items adapted from the principles, and, taken together,
these items may tap into some underlying notion of knowledge building. We recognize that some items with wording simplified for school
students may gloss over the more intricate dimensions of knowledge building. Although the questionnaire items do not capture the full
richness of knowledge building as in qualitative analyses, they provide a usable set of items for examining collaboration that illustrates
aspects of knowledge building. We also hope that they can be used more widely in classrooms to help teachers to understand more about
their students' beliefs and approaches. The measurement and structural models provide some preliminary support, and the quantitative
findings complement current research on qualitative analyses. Further construct validation studies and qualitative analyses need to be
undertaken to improve the scale and to tap the complex dimensions of knowledge building.
The questionnaires were collected from students working on knowledge building in a university-school partnership project; the classrooms may have varied considerably in how well the teachers practiced knowledge building. Clearly, teachers' curriculum designs, pedagogical approaches and epistemological beliefs influence students' beliefs about and approaches to collaboration. Similarly, studies of university students' web-based learning beliefs and preferences may be influenced by a variety of subject areas and by teachers' pedagogical beliefs and approaches. We acknowledge the teachers' influence but, as the teacher factor is not a major theme of our study, it has not been included as a variable in our analyses. Nevertheless, in order to tease out the possible impact of teacher differences, we conducted additional analyses, using Hierarchical Linear Modeling (HLM), to examine student beliefs and teacher effects as predictors. We classified
Table 8
Goodness-of-fit indices for the SEM model and comparison with suggested criteria.

Fit index   Value    Suggested value for model fit
χ²/df       1.769    < 3a
CFI         .950     > .9b
GFI         .938     > .9b
TLI         .946     > .9b
RMR         .045     < .05b
RMSEA       .038     < .06c
SRMR        .042     < .08 as acceptable; < .05 as good fitc

References: a Kline, 1998; b McDonald & Ho, 2002; c Hu & Bentler, 1999.
teachers into groups with different degrees of experience in conducting knowledge building. Our analyses indicate that students' beliefs about collaborative knowledge building were a significant predictor of online forum participation; adding teacher experience as a variable did not improve the prediction. The measure of teacher experience may have been crude, and deeper analyses may yet show teacher effects. Nevertheless, these analyses suggest that, even when there are differences in teacher experience and practice, student beliefs are a predictor of online participation that needs to be examined. Future research is needed to examine further the relation between teacher epistemology and student beliefs and their effects on online forum participation.
This study employed the terms “beliefs” and “approaches” in the general sense of what students believe about why and how they learn;
and what students believe about the nature of and how they approach collaboration when working on computer-supported knowledge
building. We have not investigated deep epistemological aspects of learning, such as whether knowledge is certain or uncertain, nor have we
examined whether students view collaboration as knowledge sharing, knowledge construction or knowledge creation (van Aalst, 2009).
These are important conceptual and epistemological dimensions that would benefit from examination in future studies, and which may
provide further insights into students’ understanding of collaboration in online environments.
6. Conclusions
With the widespread introduction of computer-supported environments, it is timely and important to understand students’ views of
how they collaborate in light of new ways of knowledge interaction mediated by technology. The study reported in this paper makes
contributions in various areas.
First, the questionnaire developed for the study helps to depict a deep form of collaboration aligned with collective knowledge building.
The questionnaire items can be used as pointers in assessing student collaboration, and also as indicators to help students work towards
more sophisticated practice. While the focus of this study is on collaborative knowledge building, its broader implication is to encourage
researchers and educators to examine how students view the ways they are involved in collaboration in CSCL learning environments.
Second, our study is one of the few to have established a direct link between student beliefs and approaches to learning on the one hand and online forum participation on the other. Using a large sample, we provide preliminary evidence indicating how students’ preferred approaches to learning (presage) influence their collaborative engagement (process), which in turn shapes their participation patterns (product) in online forums. The implication is that researchers need to consider student cognitive factors in designing and evaluating computer-based instruction, and that educators need to focus on students’ beliefs in helping them to move towards more productive online participation.
Third, we have developed ways of examining online participation using more varied quantitative indices. Further examination of these indices may enrich our understanding of online participation as online learning that involves learners’ meaning construction and community connectedness.
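As an illustration only, indices of this kind can be derived from forum log data. The sketch below is hypothetical: the study used the Analytic Toolkit for Knowledge Forum, and the event format and action names here are invented stand-ins for measures such as notes written, notes read, and build-ons created.

```python
from collections import Counter

def participation_indices(events):
    """Tally simple per-student participation indices from forum log events.

    Each event is a (student, action) pair; the action labels are
    hypothetical stand-ins for Analytic Toolkit measures (e.g. notes
    written, notes read, build-ons created).
    """
    indices = {}
    for student, action in events:
        indices.setdefault(student, Counter())[action] += 1
    # Convert Counters to plain dicts for easier inspection.
    return {student: dict(counts) for student, counts in indices.items()}

# Hypothetical log of forum activity.
log = [("s1", "write"), ("s1", "read"), ("s2", "read"),
       ("s1", "build_on"), ("s2", "read")]
print(participation_indices(log))
# → {'s1': {'write': 1, 'read': 1, 'build_on': 1}, 's2': {'read': 2}}
```

Such raw counts are only a starting point; as the study suggests, richer indices (e.g. proportions of notes read, or linkage between notes) are needed to capture meaning construction and community connectedness.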
This study has shown the importance of considering students’ views of collaboration and learning when designing computer-supported learning. Although the study employs constructs from psychological research, it also demonstrates the value of examining students’ beliefs and approaches in a computer-supported environment. In CSCL environments, students can work in a multimedia-rich community knowledge space to refine and create ideas, and to contribute theories, explanations, and working models for knowledge creation (Scardamalia & Bereiter, 2006). Investigating students’ beliefs about collaboration in CSCL environments may raise new questions and shed light on how theories examining online learning as socio-metacognitive and epistemological processes can be advanced.
References
van Aalst, J. (2009). Distinguishing knowledge-sharing, knowledge-construction, and knowledge-creation discourse. International Journal of Computer-Supported Collaborative
Learning, 4(3), 259–287.
van Aalst, J., & Chan, C. K. K. (2007). Student-directed assessment of knowledge building using electronic portfolios. Journal of the Learning Sciences, 16(2), 175–220.
Bereiter, C. (2002). Education and mind in the knowledge age. Mahwah, NJ: Lawrence Erlbaum Associates.
Bielaczyc, K. (2006). Designing social infrastructure: critical issues in creating learning environments with technology. Journal of the Learning Sciences, 15(3), 301–329.
Biggs, J. (1987). Student approaches to learning and studying. Melbourne: Australian Council for Educational Research.
Biggs, J. (1993). What do inventories of learning really measure? A theoretical review and clarification. British Journal of Educational Psychology, 63(1), 1–19.
Biggs, J. (1999). Teaching for quality learning at university: What the student does. Buckingham, UK: Open University Press.
Burtis, J. (1998). Analytic Toolkit for Knowledge Forum. Toronto, Canada: Ontario Institute for Studies in Education, University of Toronto.
Cano, F. (2005). Epistemological beliefs and approaches to learning: their change through secondary school and their influence on academic performance. British Journal of
Educational Psychology, 75(2), 203–221.
Cantwell, R. H., & Moore, P. J. (1996). The development of measures of individual differences in self-regulatory control and their relationship to academic performance.
Contemporary Educational Psychology, 21(4), 500–517.
Caswell, B., & Bielaczyc, K. (2002). Knowledge Forum: altering the relationship between students and scientific knowledge. Education, Communication & Information, 1(3),
281–305.
Chan, C. K. K. (in press). CSCL theory-research-practice synergy: Implementing knowledge building in Hong Kong classrooms. International Journal of Computer-Supported
Collaborative Learning.
Chan, C. K. K., & van Aalst, J. (2004). Learning, assessment, and collaboration in computer-supported collaborative learning. In J. W. Strijbos, P. A. Kirschner, & R. L. Martens (Eds.), What we know about CSCL and implementing it in higher education (pp. 87–112). Boston, MA: Kluwer Academic Publishers.
Chan, C. K. K., & van Aalst, J. (2008). Collaborative inquiry and knowledge building in networked multimedia environments. In J. Voogt, & G. Knezek (Eds.), International handbook of information technology in primary and secondary education (pp. 299–316). New York: Springer.
Chen, Y. L. (2008). Modeling the determinants of internet use. Computers & Education, 51(2), 545–558.
Dillon, A. (2000). Designing a better learning environment with the web: problems and prospects. Cyberpsychology and Behavior, 3(1), 97–101.
Ellis, R. A., Goodyear, P., Calvo, R. A., & Prosser, M. (2008). Engineering students’ conceptions of and approaches to learning through discussion in face-to-face and online contexts. Learning and Instruction, 18(3), 267–282.
Entwistle, N., & Ramsden, P. (1983). Understanding student learning. London: Croom Helm.
Goodyear, P., Jones, C., Asensio, M., Hodgson, V., & Steeples, C. (2005). Networked learning in higher education: student expectation and experiences. Higher Education, 50(3),
473–508.
Guzdial, M., & Turns, J. (2000). Computer-supported collaborative learning in engineering: the challenge of scaling up assessment. In M. J. Jacobson, & R. B. Kozma (Eds.),
Innovations in science and mathematics education: Advanced designs for technologies of learning (pp. 227–257). Mahwah, NJ: Lawrence Erlbaum Associates.
Hair, J. F., Black, W. C., Babin, B. J., Anderson, R. E., & Tatham, R. L. (2006). Multivariate data analysis (6th ed.). Upper Saddle River, NJ: Pearson Education.
Hakkarainen, K., & Palonen, T. (2003). Patterns of female and male students’ participation in peer interaction in computer-supported learning. Computers & Education, 40(4), 327–342.
Hewitt, J. (2005). Toward an understanding of how threads die in asynchronous computer conferences. Journal of the Learning Sciences, 14(4), 567–589.
Hrastinski, S. (2008). What is online learner participation? A literature review. Computers & Education, 51(4), 1755–1765.
Hrastinski, S. (2009). A theory of online learning as online participation. Computers & Education, 52(1), 78–82.
Hu, L. T., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling, 6(1), 1–55.
Jacobson, M. J., & Kozma, R. B. (Eds.). (2000). Innovations in science and mathematics education: Advanced designs for technologies of learning. Mahwah, NJ: Lawrence Erlbaum
Associates.
Kember, D., Biggs, J., & Leung, D. Y. P. (2004). Examining the multidimensionality of approaches to learning through the development of a revised version of the Learning
Process Questionnaire. British Journal of Educational Psychology, 74(2), 261–280.
Kline, R. B. (1998). Principles and practice of structural equation modeling. New York: Guilford Press.
Kreijns, K., Kirschner, P. A., & Jochems, W. (2003). Identifying the pitfalls for social interaction in computer-supported collaborative learning environments: a review of the
research. Computers in Human Behavior, 19(3), 335–353.
Lai, K. W. (2008). ICT supporting the learning process: the premise, reality, and promise. In J. Voogt, & G. Knezek (Eds.), International handbook of information technology in
primary and secondary education (pp. 215–230). New York: Springer.
Lai, M., & Law, N. (2006). Peer scaffolding of knowledge building through collaborative groups with differential learning experiences. Journal of Educational Computing
Research, 35(2), 123–144.
Law, Y. K., Chan, C. K. K., & Sachs, J. (2008). Beliefs about learning, self-regulated strategies and text comprehension among Chinese children. British Journal of Educational Psychology, 78(1), 51–73.
Lee, E. Y. C., Chan, C. K. K., & van Aalst, J. (2006). Students assessing their own collaborative knowledge building. International Journal of Computer-Supported Collaborative
Learning, 1(2), 277–307.
Linn, M. C., Davis, E. A., & Bell, P. (2004). Internet environments for science education. Mahwah, NJ: Lawrence Erlbaum Associates.
Lipponen, L., Rahikainen, M., Lallimo, J., & Hakkarainen, K. (2003). Patterns of participation and discourse in elementary students’ computer-supported collaborative learning.
Learning and Instruction, 13(5), 487–509.
Marton, F., & Booth, S. (1997). Learning and awareness. Mahwah, NJ: Lawrence Erlbaum Associates.
Marton, F., & Säljö, R. (1976). On qualitative differences in learning: I. Outcome and process. British Journal of Educational Psychology, 46(1), 4–11.
McDonald, R. P., & Ho, M. H. R. (2002). Principles and practice in reporting structural equation analyses. Psychological Methods, 7(1), 64–82.
Mason, L., & Boldrin, A. (2010). Epistemic metacognition in the context of information searching on the web. In M. S. Khine (Ed.), Knowing, knowledge and beliefs: Epistemological studies across diverse cultures (pp. 377–404). Dordrecht: Springer.
Mason, L., & Scirica, F. (2006). Prediction of students’ argumentation skills about controversial topics by epistemological understanding. Learning and Instruction, 16(5), 492–
509.
Niu, H., & van Aalst, J. (2009). Participation in knowledge-building discourse: an analysis of online discussions in mainstream and honours social studies courses. Canadian Journal of Learning and Technology, 35. Available at http://www.cjlt.ca/index.php/cjlt/article/view/515.
Peng, H., Tsai, C. C., & Wu, Y. T. (2006). University students’ self-efficacy and their attitudes towards the Internet: the role of students’ perceptions of the Internet. Educational
Studies, 32(1), 73–86.
Phan, H. P. (2007). The revised learning process questionnaire: a validation of a Western model of students’ study approaches to the South Pacific context using confirmatory factor analysis. British Journal of Educational Psychology, 77(3), 719–739.
Prinsen, F. R., Volman, M. L. L., & Terwel, J. (2007). The influence of learner characteristics on degree and type of participation in a CSCL environment. British Journal of
Educational Technology, 38(6), 1037–1055.
Prosser, M., & Trigwell, K. (1999). Understanding learning and teaching: The experience in higher education. Philadelphia, PA: Society for Research into Higher Education & Open
University Press.
Roschelle, J., & Pea, R. (1999). Trajectories from today’s WWW to a powerful educational infrastructure. Educational Researcher, 5, 22–43.
Scardamalia, M. (2002). Collective cognitive responsibility for the advancement of knowledge. In B. Smith (Ed.), Liberal education in a knowledge society (pp. 67–98). Chicago,
IL: Open Court.
Scardamalia, M., & Bereiter, C. (1994). Computer support for knowledge-building communities. The Journal of the Learning Sciences, 3(3), 265–283.
Scardamalia, M., & Bereiter, C. (2006). Knowledge building: theory, pedagogy, and technology. In R. K. Sawyer (Ed.), The Cambridge handbook of the learning sciences (pp. 97–
119). New York: Cambridge University Press.
So, H. J., Seah, L. H., & Toh-Heng, H. L. (2010). Designing collaborative knowledge building environments accessible to all learners: impacts and design challenges. Computers &
Education, 54(2), 479–490.
Stahl, G., Koschmann, T. D., & Suthers, D. (2006). Computer-supported collaborative learning: an historical perspective. In R. K. Sawyer (Ed.), Cambridge handbook of the
learning sciences (pp. 409–426). New York: Cambridge University Press.
Tabachnik, B. G., & Fidell, L. S. (2007). Using multivariate statistics (5th ed.). Boston, MA: Pearson Education.
Tsai, C. C. (2009). Conceptions of learning versus conceptions of web-based learning: the differences revealed by college students. Computers & Education, 53(4), 1092–1103.
Tsai, C. C., Lin, S. J., & Tsai, M. J. (2001). Developing an internet attitude scale for high school students. Computers & Education, 37(1), 41–51.
Tu, Y. W., Shih, M., & Tsai, C. C. (2008). Eighth graders’ web searching strategies and outcomes: the role of task types, web experiences and epistemological beliefs. Computers & Education, 51(3), 1142–1153.
Voogt, J., & Knezek, G. (2008). IT in primary and secondary education: Emerging issues. In J. Voogt, & G. Knezek (Eds.), International handbook of information technology in primary and secondary education. New York: Springer.
Watkins, D. A. (1998). Assessing approaches to learning: a cross-cultural perspective. In B. Dart, & G. Bouton-Lewis (Eds.), Teaching and learning in higher education (pp. 124–
144). Melbourne: Australian Council for Educational Research.
Yang, F. Y., & Tsai, C. C. (2008). Investigating university student preferences and beliefs about learning in the web-based context. Computers & Education, 50(4), 1284–1303.
Zhang, J., Scardamalia, M., Lamon, M., Messina, R., & Reeve, R. (2007). Socio-cognitive dynamics of knowledge building in the work of 9- and 10-year-olds. Educational
Technology Research & Development, 55(2), 117–145.
Zhang, J., Scardamalia, M., Reeve, R., & Messina, R. (2009). Designs for collective cognitive responsibility in knowledge-building communities. The Journal of the Learning
Sciences, 18(1), 7–44.
Zhang, L. F., & Sternberg, R. J. (2006). The nature of intellectual styles. Mahwah, NJ: Lawrence Erlbaum Associates.