IUI 2016 • Personalization
March 7–10, 2016, Sonoma, CA, USA
An Intelligent Interface for Learning Content: Combining an
Open Learner Model and Social Comparison to Support
Self-Regulated Learning and Engagement
Julio Guerra
Instituto de Informática, Universidad Austral de
Chile, Valdivia, Chile
School of Information Sciences, University of
Pittsburgh, Pittsburgh, PA, USA
jdg60@pitt.edu
Sibel Somyurek
Department of Computer Education and
Instructional Technologies, Gazi University
Ankara, Turkey
ssomyurek@gazi.edu.tr
Roya Hosseini
Intelligent Systems Program, University of
Pittsburgh
Pittsburgh, PA, USA
roh38@pitt.edu
Peter Brusilovsky
School of Information Sciences, University of
Pittsburgh
Pittsburgh, PA, USA
peterb@pitt.edu
ABSTRACT
We present the Mastery Grids system, an intelligent interface for online learning content that combines open learner modeling (OLM) and social comparison features. We grounded the design of Mastery Grids in self-regulated learning and learning motivation theories, as well as in our past work on social comparison, OLM, and adaptive navigation support. The force behind the interface is the combination of adaptive navigation functionality with the mastery-oriented aspects of OLM and the performance-oriented aspects of social comparison. We examined different configurations of Mastery Grids in two classroom studies and report the results of analyses of log data and survey responses. The results show how Mastery Grids interacts with different factors, such as gender and achievement-goal orientation, and, ultimately, its impact on student engagement, performance, and motivation.

Author Keywords
Open Learner Model; Achievement-Goal Orientation; Self-Regulated Learning; Social Comparison

ACM Classification Keywords
H.5.m. Information Interfaces and Presentation (e.g. HCI): Miscellaneous; K.3.1. Computers and Education: Computer Uses in Education

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from Permissions@acm.org.
IUI'16, March 07 - 10, 2016, Sonoma, CA, USA
Copyright is held by the owner/author(s). Publication rights licensed to ACM.
ACM 978-1-4503-4137-0/16/03 ...$15.00
DOI: http://dx.doi.org/10.1145/2856767.2856784

INTRODUCTION
In many courses offered by modern universities, mastery of a subject comes only after considerable practice. Subjects like college-level mathematics, physics, chemistry, or computer science expect students to work with simulations, examine worked-out examples, and practice their skills by solving a large number of problems. Only a limited amount of this practice (typically problems) is usually assigned to students as part of course requirements; the rest is recommended for students to complete as practice. This division is well-justified, since different students need considerably different amounts of practice. Historically, practice examples and problems were provided by textbook authors at the end of every chapter, and teachers recommended them as practice assignments when the chapter was assigned for reading. However, the increased use of computers as learning tools has gradually led to the creation of large numbers of computer-based practice systems oriented toward student self-study. For example, in the area of learning programming, we can distinguish several types of practice content that are now supported by many practice-oriented systems: worked-out examples that allow students to examine meaningful problem-solving approaches [6]; program animations that visually demonstrate program behavior [15]; program tracing exercises that engage students in understanding language semantics [5]; and program construction exercises that allow students to practice their composition skills [26].

Years of studies have demonstrated the strong educational benefits of each of these kinds of practice content. Moreover, the recent advancement of Web technologies has led to the steady transition of many practice-oriented systems from stand-alone to Web-based, with solid volumes of practice content accessible online. Surprisingly, these shifts have not led to a similar revolution in the use of this practice content. As observed by a recent report [3], overall student exposure to this content is low.

In this work, we present the Mastery Grids system, an intelligent interface for online learning that attempts to resolve this unfortunate situation and considerably increase student work with several types of practice content, from examples to problems. The Mastery Grids interface combines several
design decisions to address several known obstacles to work
with practice content: all types of content are accessible from
the same integrated interface, avoiding the need to log into
different single-content systems; the student's changing state of knowledge is visualized in the form of an open learner model (OLM) to support self-regulated learning and guide students
to the most appropriate practice topics and content; social
comparison in the form of peer and class progress is offered
to provide additional motivation and navigation support. We
grounded the design of Mastery Grids in self-regulated learning and learning motivation theories, as well as in our past
work in social comparison, OLM, and adaptive navigation
support. The force behind the interface is the combination of adaptive navigation functionality with the mastery-oriented aspects of OLM and the performance-oriented aspects of social comparison. To assess the value of our approach, as well
as specific design decisions, we examined different configurations of Mastery Grids in two classroom studies and report the
results of analysis of the log data and survey responses. The
results show the overall impact of our technology on student
engagement, performance, and motivation, as well as insights
in the ways in which Mastery Grids interacts with different
factors, like gender and achievement-goal orientation.
Self-Regulated Learning and Achievement-Goal Orientation
Self-regulated learning (SRL) defines the learner as an active participant who monitors and applies strategies to control her own learning process cognitively, metacognitively,
and emotionally [29]. A key aspect in SRL is the interdependence between motivation and self-regulating processes [29].
For example, self-efficacy, an element of self-regulation, is
viewed as the force behind learning motivation [2] and has
a strong influence on goal-setting [23]. Moreover, a self-regulated learner, "who proactively seek[s] out information
when needed and take[s] the necessary steps to master it”
[28], is closely related to the Mastery-Approach orientation defined by the Achievement-Goal Orientation framework [11]. This framework argues that students' motivation when facing a learning situation translates into different types of goals, and it defines a 2x2 classification of goal orientations: Mastery-Approach oriented students pursue learning, while Performance-Approach oriented students pursue achievement and are usually more sensitive to comparison and scores. Mastery-Avoidance students strive to avoid learning less than they could, and Performance-Avoidance students strive to avoid performing worse than others or receiving low scores. Relationships between SRL and achievement goals are well studied. For example, students who present high levels of mastery
goal orientation and who are intrinsically motivated to learn
also present higher levels of SRL elements, including higher
self-efficacy and the use of self-regulatory strategies [27].
BACKGROUND
Open Learner Model and Open Social Student Modeling
Open learner models (OLMs), which are also referred to as
open student models (OSMs), are an important kind of educational interface that has emerged in the area of personalized learning systems. While in traditional personalized systems, student models that represent the current state of student knowledge are hidden from students and are used solely
to personalize the educational process, more recent open student model interfaces introduced the ability to view, explore,
and even modify the state of a student’s own knowledge [8].
Typically, the state of knowledge is displayed using some
form of visualization, such as a set of skillometers [19] or
a knowledge map [16]. By visualizing, and sometimes interacting with, her own learning or achievement representation,
the learner is given a powerful feedback tool that is known to
impact her metacognition and self-regulated learning strategies [9].
Researchers have also studied how different goal orientations can be fostered, finding that mastery-oriented environmental factors (such as learning autonomy) can promote the adoption of mastery goals [10], while performance-oriented elements, like score rankings, can account for the
adoption of performance goals, such as competition to score
higher than other students [20]. While mastery and performance orientations seem to represent opposite values, they can coexist: a student can present high levels of performance- and mastery-oriented goals at the same time [1]. Our system follows these ideas by integrating mastery orientation, provided by the OLM, which allows the learner to monitor her own learning progress, with performance orientation, provided by social comparison features.
Recent projects have attempted to enhance the value of OLMs in several ways. The most popular among these efforts is integrating an OLM with interfaces for accessing learning content [18,
25, 22]. In this way, an OLM provides personalized navigation support that helps students to select the most appropriate
content for their skill levels. Studies show that this kind of
personalized support could increase the efficiency of student
work and also increase their motivation to work with learning
content [18, 25].
MASTERY GRIDS: AN INTELLIGENT INTERFACE FOR
PRACTICE-ORIENTED LEARNING CONTENT
The Mastery Grids system is an attempt to design an intelligent interface for accessing learning content that provides
support for SRL and allows learners to monitor their course
progress. At its core, it follows our earlier work that integrated
content navigation with OLM-based knowledge progress visualization [13]. To complement the benefits of OLM, Mastery Grids (MG) also engages the power of the Open Social
Student Model by incorporating visualization based on the
models of other students. The MG interface presented below
adds several features to its first version presented in [17].
Other researchers have attempted to enhance the engagement
power of OLMs by using ideas of social comparison; namely,
by allowing students to explore each others’ models [7] or to
view a cumulative model of the class [13, 4], thus triggering
social comparison effects. This approach, which has been referred to as Open Social Student Modeling, has demonstrated a considerable positive effect on engagement and efficiency: it causes students to complete more practice content and allows them to move quickly to the next content items [13, 4].
The interface of the Mastery Grids system (Figure 1) is
composed of several grids showing different aggregations of
progress information. In all grids, columns represent topics
and rows represent different types of content (such as problems, examples, or animations). The first grid shows an extended OLM that visualizes the learner’s own progress over
several kinds of practice content. Each cell shows knowledge progress, with one kind of content for one topic using
a different density of green color. The third grid (number 2
in Figure 1) represents the average progress of the reference
group using a varying density of blue color. A combo box in
the menu bar allows the student to use the whole class, or just
the top students, as a reference group (number 7 in Figure 1).
Different colors were used to represent the group average and
the learner (blue and green, respectively) in order to implement the differential comparison in the second grid (number
3 in Figure 1). When the group has higher progress than the learner on a specific kind of content in a specific topic, the corresponding cell in the second grid becomes blue; otherwise, it becomes green. The intensity of the color represents the magnitude of the difference, and gray means there is no difference.
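As an illustration, the cell coloring logic just described can be sketched in a few lines. This is a minimal sketch under assumed conventions (progress values in [0, 1]; the function name and return format are invented for illustration), not the system's actual implementation:

```python
def diff_cell(learner: float, group: float, eps: float = 1e-9):
    """Map a learner/group progress pair (each in [0, 1]) to a
    comparison-grid cell: a hue plus an intensity that scales
    with the size of the difference."""
    delta = group - learner
    if abs(delta) < eps:
        return ("gray", 0.0)                    # no difference
    hue = "blue" if delta > 0 else "green"      # group ahead vs. learner ahead
    return (hue, abs(delta))                    # stronger color for larger gaps

# The learner trails the group on this topic, so the cell shades blue.
print(diff_cell(0.25, 0.75))  # → ('blue', 0.5)
```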
In the bottom part (number 4 in Figure 1), a progress grid
for each of the students in the group is shown (with the top-progressing students shown first). The list does not show the
names of the students. To be consistent with the colors used
in the first grid, each peer grid is represented in shades of blue
and the learner is represented in shades of green, which also
facilitates locating the learner in the list. Figure 1 shows the
learner in the 3rd position of the list. To speed up interface loading, the ranked list of peers is only shown after clicking the button "Load the rest of the learners," which is located below the 3rd grid and does not appear in Figure 1.
Figure 1. The full Mastery Grids interface. A menu bar contains controls to change the view of the group or the details shown. Circled numbers have been added in the image to support explanations.
By clicking on any topic cell, the user can access the practice content of this topic, shown as activity cells organized
in rows by content type (number 5 in Figure 1). Clicking an activity cell loads the content in an overlaid
window. Mastery Grids integrates different types of online
“smart” content from different content providers. “Smart”
content [3] interactively engages students, provides mechanisms to store and retrieve student activity data, and ultimately, incorporates feedback mechanisms. The captured activity data are processed by MG’s back-end services to compute knowledge progress. For example, each programming
problem included in the Mastery Grids system shows the correct answer to the student and logs the student response in
a User Model server, which can provide summaries of this
data back to Mastery Grids. The version presented in this
paper includes three types of smart content for a Java programming course: programming problems [14], interactive
examples [6], and program animations [24].
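As a concrete (and deliberately simplified) illustration of the kind of summary the User Model server might return, per-cell progress can be computed as the fraction of distinct catalog items the learner has completed. The item names and the set-based format below are invented for illustration; the system's actual progress computation is more involved:

```python
def progress(completed: set, catalog: set) -> float:
    """Fraction of a topic's items (of one content type) completed."""
    if not catalog:
        return 0.0
    # Ignore logged items that are no longer in the catalog.
    return len(completed & catalog) / len(catalog)

# Hypothetical problem ids for one topic.
catalog = {"jWhile1", "jWhile2", "jWhile3", "jWhile4"}
print(progress({"jWhile1", "jWhile3"}, catalog))  # → 0.5
```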
To allow for the exploration of a broader design space, different interface components can be loaded in different combinations. A selector widget in the menu bar allows students to select among different progress visualizations for the different content types (number 8 in Figure 1). The user can choose a full view in which each grid has separate rows for each content type (as shown in Figure 1), can display averages by type of content (for example, showing only progress in examples), or can choose an overall view in which the first three grids are collapsed into one grid with one row for the learner's progress, one row for the comparison, and one row for the group's progress, as shown in Figure 3. The overall view mode is set as the default view. In addition, all comparison features can be completely hidden by clicking the button "Individual" (number 6 in Figure 1), which leaves only the personal part of the interface visible, as shown in Figure 2.

The Mastery Grids interface can be configured to hide or show the menu controls (numbers 6, 7, and 8 in Figure 1), as well as to enable or disable the OSSM features for a specific group or for individual users. For example, this allows us to show social comparison features only to some students, or to enable all features for the instructor. Figures 2 and 3 show different configurations of Mastery Grids with and without social comparison features, respectively.
CLASSROOM STUDIES
In this work, we report the analyses and results of two semester-long classroom studies in an introductory Object-Oriented Java programming class during 2014-2015. Both classes were taught by the same instructors and had the same setup.
Figure 2. Mastery Grids with minimal features, or individual view (all social comparison features have been disabled).

Table 1. Number of students participating in the studies. Number of female and male students in parentheses.

Social | Resources On | Resources Off | Total
OSM    | 23 (12, 10)  | 20 (8, 12)    | 43
OSSM   | 22 (11, 11)  | 24 (7, 15)    | 46
Total  | 45           | 44            | 89
or C (keep the individual view); resources-on or R (add the resource selector feature to the menu); social-on or S (add the social comparison features); and social-resources or SR (add both the social comparison features and the resource selector). At the beginning of Part 2, an email was sent to all students with a reminder of the study, a reminder of the instructions to access the system, and a brief explanation of the new features introduced (for the control group, the email contained only the reminders). A week before the midterm, the Achievement-Goal questionnaire was administered again. At the end of the term and of Part 2, a post-test, the achievement-goal questionnaire, and a usability/usefulness survey of the Mastery Grids system were administered. Consent forms were also collected in order to use the data from the questionnaires. The three moments at which the achievement-goal factors were measured are referred to as the initial, middle, and final measures.
Figure 3. Mastery Grids with social comparison features and collapsed
grids, averaging the progress across types of resources.
The studies were designed to explore the engagement effects
of the system features, especially the social comparison elements presented in Mastery Grids, and to examine the mediating impact of the learner’s motivational characteristics.
To support this goal, different features of the system were
enabled or disabled for different student groups, as explained in detail in the next section.
Activity on the system occurring during Part 1 of the study
is referred to as activity before conditions were introduced or
simply, Part 1. Similarly, activity during Part 2 is referred to
as activity after conditions were introduced or Part 2. Recall
that during Part 1, all students had access to the same basic
features of the system (Figure 2).
To characterize learners' motivation, we chose the Achievement-Goal Orientation Framework [11] because of its strong relations to self-regulated learning, as described in the Background section. To measure goal orientation, we used the Achievement-Goal orientation questionnaire [11], which contains 12 questions on a 7-point Likert scale, with 3 questions for each of the achievement-goal factors: Mastery-Approach ("My goal is to learn as much as possible"), Mastery-Avoidance ("My aim is to avoid learning less than I possibly could"), Performance-Approach ("My goal is to perform better than the other students"), and Performance-Avoidance ("I strive to avoid performing worse than others").
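Assuming the common convention of averaging the 3 items that load on each factor (the item-to-factor mapping below is hypothetical; the instrument defines the actual mapping), scoring the questionnaire can be sketched as:

```python
# Hypothetical mapping from factor to the 3 item numbers that load on it.
FACTORS = {
    "mastery_approach":      [1, 5, 9],
    "mastery_avoidance":     [2, 6, 10],
    "performance_approach":  [3, 7, 11],
    "performance_avoidance": [4, 8, 12],
}

def goal_scores(responses: dict) -> dict:
    """Average the 1-7 Likert ratings of the items in each factor."""
    return {factor: sum(responses[i] for i in items) / len(items)
            for factor, items in FACTORS.items()}

ratings = dict(zip(range(1, 13), [7, 4, 5, 3, 6, 4, 6, 2, 5, 5, 4, 3]))
print(goal_scores(ratings)["mastery_approach"])  # → 6.0
```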
The system usage data include logs of sessions (distinct log-ins into the system), problems attempted and solved, examples displayed, animations displayed, and other interface interactions, such as changes to the resource selector or group selector menu and clicks on the button "load others" (see the description of Mastery Grids and Figure 1). Time spent on activities was also recorded. Engagement is measured in absolute scores of system activity and relative scores of activity or time by session in the system.
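The absolute and per-session engagement scores can be sketched from such logs as follows; the tuple-based log format is a simplification invented for illustration:

```python
from collections import defaultdict

def engagement(events):
    """events: iterable of (student, session_id, kind, seconds) tuples.
    Returns absolute totals plus relative (per-session) scores."""
    acc = defaultdict(lambda: {"sessions": set(), "acts": 0, "time": 0})
    for student, session, kind, seconds in events:
        s = acc[student]
        s["sessions"].add(session)
        s["acts"] += 1
        s["time"] += seconds
    return {st: {"sessions": len(s["sessions"]),
                 "activity": s["acts"],
                 "time": s["time"],
                 "activity_per_session": s["acts"] / len(s["sessions"]),
                 "time_per_session": s["time"] / len(s["sessions"])}
            for st, s in acc.items()}

log = [("u1", 1, "problem", 120), ("u1", 1, "example", 60),
       ("u1", 2, "problem", 90)]
print(engagement(log)["u1"]["activity_per_session"])  # → 1.5
```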
In total, 89 of the 114 enrolled students logged into the system at least once (38 female, 48 male; 3 students did not provide gender information) and were used for the analyses in this paper. The number of students in each of the groups
can be seen in Table 1. Female and male student counts are
shown in parentheses.
Classroom setup and data collection
Each classroom study was divided into 2 parts. Part 1 started
at the beginning of the term, when some details of the study
and the Mastery Grids system were introduced to the students
with a live demonstration. The Mastery Grids system was
initially set up with minimal features, as shown in Figure 2.
Instructors offered around 1% extra credit for participating in the study, which included solving at least 10 problems in the system. A pretest was administered in conjunction with an initial
measure of the Achievement-Goal questionnaire.
In the next sections, we first present analyses of the differences between the OSM and OSSM interfaces; i.e., the OSM group is the combination of the C and R groups, and the OSSM group, with social features enabled, is the combination of the S and SR groups. Then, we briefly analyze the effect of the resource selector feature on system usage.
Part 2 started in the fifth week, 3 weeks before the midterm.
At this time, students were randomly assigned to 4 groups, according to the presence of social features (social factor) and/or the presence of the resource selector component (resource factor). Pretest scores were also used to balance the assignment to groups. The four groups are named as follows: control
ASSESSING THE OVERALL VALUE OF THE INTERFACE
To explore the value of both the OSM and OSSM, we compared the usage statistics of our classroom studies using the
Mastery Grids system to a previous classroom study offered
in Spring 2008, in which students used a traditional non-adaptive portal to access comparable volumes of the same practice content. The Portal presented the content in a simple two-level hierarchy of links: the first level offered a list of topics, and the second level listed links to parameterized problems and interactive examples. Table 2 provides a summary of the usage statistics of the Portal and the Mastery Grids system across all groups (MG), the OSM group, and the OSSM group. Usage statistics are computed based on the data of active users who had at least 5 problem attempts in the Portal, in the whole system (MG), or in a specific condition (OSM or OSSM). Because the OSSM group did not have social features during Part 1, we consider as logged-in OSSM users those who logged in during Part 2 (34 out of 46) and as active OSSM users those who had at least 5 problem attempts in Part 2 (i.e., who had a reasonable chance to experience the "true" OSSM condition). This explains why the active users for the OSM and OSSM groups do not add up to the total active users in the MG column: 10 students in the OSSM group had enough activity to be considered active MG users but did not have enough activity in Part 2 to be considered active OSSM students. In other words, only the 30 OSSM students who attempted 5 or more problems during Part 2 are considered active OSSM students. The fourth, sixth, and eighth columns of the table report the observed significance levels (p-values) of the statistical tests carried out to compare each usage parameter between the Portal and MG, between the Portal and the OSM group, and between the Portal and the OSSM group, respectively.¹

Table 2. The Mean±SD of system usage statistics: comparison between a portal of course materials (Portal) and the Mastery Grids system across all groups (MG), the OSM group, and the OSSM group.

Parameters               | Portal        | MG           | p-value  | OSM          | p-value  | OSSM          | p-value
Logged-in students       | 17            | 89           |          | 43           |          | 34            |
Active students          | 14 (82%)      | 80 (90%)     |          | 40 (93%)     |          | 30 (88%)      |
Sessions                 | 2.71±1.49     | 7.54±6.05    | .000 *** | 7.45±5.76    | .000 *** | 9.37±6.49     | .000 ***
Distinct topics          | 9.21±4.85     | 9.4±5.6      | .97      | 9.3±5.71     | .89      | 11.47±4.75    | .192
Problem attempts         | 72.36±67.25   | 78.88±62.18  | .556     | 76.45±54.33  | .534     | 100.83±69.51  | .07 .
Distinct problems        | 32.79±21.67   | 43.74±28.26  | .21      | 43.6±28.25   | .15      | 53.2±25.86    | .01 *
Distinct problems solved | 28.43±19.2    | 41.92±28.13  | .126     | 41.88±28.7   | .152     | 50.77±25.57   | .003 **
Success rate             | .707±.147     | .648±.144    | .182     | .639±.147    | .153     | .627±.127     | .094 .
Repeats per problem      | 2.52±1.81     | 1.8±0.77     | .318     | 1.83±0.84    | .402     | 1.85±0.74     | .579
Success per problem      | 1.8±1.36      | 1.11±0.35    | .1       | 1.11±0.33    | .139     | 1.12±0.39     | .098 .
Failure per problem      | 0.72±0.61     | 0.69±0.56    | .534     | 0.72±0.67    | .418     | 0.73±0.45     | .338
Examples viewed          | 13.27±9       | 32.52±26.23  | .019 *   | 31.02±26.67  | .035 *   | 41.57±24.67   | .000 ***

Significance levels: ***: <.001; **: <.01; *: <.05; .: <.1
According to Table 2, the Mastery Grids system performed significantly better than the Portal in engaging students to work with the non-mandatory course materials. The OSM/OSSM interface made the Mastery Grids system arguably more addictive than the basic Portal: the average numbers of sessions and examples viewed were significantly higher in all conditions of the Mastery Grids system (MG, OSM, and OSSM). Progress tracking also allowed students to better distribute their efforts: on average, when using Mastery Grids, students explored and solved more distinct problems. This difference becomes significant for the OSSM group, which accessed about 1.6 times more distinct problems than in the Portal. This indicates that the navigation support available in Mastery Grids decreases students' tendency to stay with the same content (for example, repeating problems they have already mastered); as a result, students moved on to new problems more quickly. This correlates (though not significantly) with a slightly lower success rate in the Mastery Grids system. Our data show that in the absence of mastery indicators and navigation support offering guidance across course topics, students tended to over-stay within topics, repeating the same problems even after solving them correctly, which resulted in a larger fraction of successful attempts on the same problems.

These observations indicate that the Mastery Grids system is more beneficial than a traditional portal in terms of student engagement and effort allocation. In the next sections, we look beyond the engagement level and explore different aspects of the tool, from its impact on student performance to how it affects student navigation patterns, as well as how it interacts with motivational factors.

ASSESSING THE VALUE OF SOCIAL COMPARISON

Effects of OSSM on engagement
To analyze the value of the social comparison component within the overall context of the Mastery Grids system, we compared engagement differences between the OSM (C + R) and OSSM (S + SR) groups by examining usage statistics and patterns both before and after conditions were introduced (Part 1 and Part 2). Table 3 shows usage statistics during Part 1 and Part 2 within the OSM and OSSM groups. The first row shows the number of students who logged into the system in the corresponding group and part. The other rows show the average usage parameters per student.

There are no significant differences in usage statistics between the social and non-social groups before the OSSM conditions were introduced (Part 1 columns in Table 3), which suggests that the groups were balanced. Looking at the OSSM (Social) columns in both Part 1 and Part 2, we note a trend of increasing activity and time on the system, while the OSM group tended to decrease its activity from Part 1 to Part 2.

¹ The One-Way ANOVA or the Kruskal-Wallis H test was used, depending on whether the assumptions of the parametric test were met.
Table 3. Mean and Standard Error (in parentheses) of student usage statistics in Part 1 and Part 2 of the studies for the OSM and OSSM groups.

                      |        Part 1             |        Part 2
                      | OSM         | OSSM        | OSM         | OSSM
Students              | 43          | 46          | 35          | 34
Sessions              | 3.26 (.49)  | 3.5 (.49)   | 4.77 (.58)  | 4.5 (.68)
Problem attempts      | 40.7 (6.2)  | 34.9 (6.0)  | 37.3 (5.2)  | 48.4 (7.6)
Problems solved       | 22.7 (2.99) | 20 (3.2)    | 20.3 (3.2)  | 24.1 (3.5)
Success rate          | .66 (.03)   | .694 (.027) | .57 (.03)   | .58 (.03)
Examples viewed       | 16 (2.6)    | 16.9 (2.8)  | 16.6 (2.8)  | 18.4 (2.8)
Animations viewed     | 4.84 (.94)  | 4.3 (.99)   | 5.14 (1.28) | 7.62 (1.41)
Time on the system    | 4905 (1043) | 4803 (1136) | 6416 (1016) | 5955 (984)
Time on problems      | 1386 (245)  | 1606 (568)  | 2317 (406)  | 2204 (396)
Time on examples      | 1342 (356)  | 1239 (278)  | 1605 (353)  | 1162 (231)
Time on animations    | 568 (234)   | 375 (108)   | 452 (117)   | 595 (146)
Clicks group selector | -           | -           | -           | 1.15 (.47)
Clicks load others    | -           | -           | -           | 1.06 (.30)
Time by session       | 1633 (331)  | 1328 (230)  | 1247 (154)  | 1501 (244)
Activity by session   | 21.38 (3.49)| 17.05 (2.29)| 12.87 (1.75)| 23.58 (4.96)
Figure 4. Interaction between Time (Part 1, Part 2) and Social factor
(OSM/OSSM).
Note also how time measures increase in both groups, and
that this increase is smaller in the OSSM group. To see
the differences in the increase of activity or time, a repeated-measures ANOVA on the activity statistics measured in Part 1 and Part 2 was performed across the OSM and OSSM groups. In these analyses, we also include a measure of the total activity by session (at the bottom of Table 3), which sums the problem attempts, examples, and animated examples displayed by each student and divides this sum by the number of sessions the student has. Only students who had activity in both Part 1 and Part 2 were considered. Additionally, outliers were identified and removed for the statistics time by session (1 student presented an extremely long session) and activity by session (4 students showed 1 session with an extremely high number of activities in either Part 1 or Part 2).
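The criterion used to flag these outliers is not specified; one common choice, sketched here purely for illustration, flags values lying more than 1.5 IQR outside the quartiles:

```python
import statistics

def iqr_outliers(xs):
    """Return the values lying more than 1.5 IQR outside [Q1, Q3]."""
    q1, _, q3 = statistics.quantiles(xs, n=4)   # default exclusive method
    spread = 1.5 * (q3 - q1)
    lo, hi = q1 - spread, q3 + spread
    return [x for x in xs if x < lo or x > hi]

session_minutes = [12, 15, 14, 13, 16, 95]      # one extremely long session
print(iqr_outliers(session_minutes))  # → [95]
```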
Figure 5. Examples displayed by female and male students in the OSSM
during Part 1 and Part 2.
like gender or previous experience were not significant predictors and did not enter the step-wise regression model. This confirms the role of the social features in engaging students in more activities.
Effect on performance and instructional effectiveness
We evaluated the effect of the OSM and OSSM on student
performance using four measures: (1) normalized learning
gain, which is defined as the actual gain divided by the possible gain; (2) course grade; (3) success rate, which is the
percentage of correct attempts on problems; and (4) effectiveness scores. We conducted several analyses to compare
performance differences between the groups (OSM/OSSM).
We did not find any differences in the learning gain, success
rate, or course grade across the two groups. However, there
were some differences in the effectiveness score.
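Measures (1) and (3) are straightforward to compute; a minimal sketch (the max_score parameter stands in for the test's maximum possible score):

```python
def normalized_gain(pre: float, post: float, max_score: float) -> float:
    """Actual gain divided by the possible gain."""
    return (post - pre) / (max_score - pre)

def success_rate(correct: int, attempts: int) -> float:
    """Fraction of problem attempts that were correct."""
    return correct / attempts

print(normalized_gain(4, 7, 10))  # → 0.5
print(success_rate(29, 50))       # → 0.58
```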
We found a significant difference in the success rate on problems (p = .007) between Part 1 and Part 2, which was expected given that Part 2 involves generally more complex problems.
A significant interaction between the Time (Part 1, Part 2) and Social group (OSM/OSSM) factors was found on the amount of activity by session, F(1, 55) = 4.972, p = .03, partial η² = .083. As can be seen in Figure 4, students in the OSSM group increased their number of activities per session, while students in the OSM group decreased it. Another significant interaction was found for the Time, Gender, and Social group factors on the number of examples displayed, F(1, 45) = 6.467, p = .014, partial η² = .126. Female students in the OSSM group tended to increase the number of examples displayed from Part 1 to Part 2, while male students decreased it; both female and male students decreased the number of examples displayed in the OSM group (Figure 5).
Effectiveness score: To compare instructional effectiveness between the OSM and OSSM groups, we computed students' instructional effectiveness scores following the approach in [21]. First, we computed the z-scores of performance, p_z (correctly answered problems), and of time, t_z (time spent on problems). Then, the relative effectiveness of a student is computed as the distance between the point (p_z, t_z) and the line of zero effectiveness.
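Under one standard formulation of this score (the signed distance from (p_z, t_z) to the line p_z = t_z, i.e., E = (p_z − t_z)/sqrt(2); whether [21] uses exactly this sign convention, and whether sample or population standard deviations were used, are assumptions), the computation can be sketched as:

```python
import math

def zscores(xs):
    """z-scores using the sample standard deviation (an assumption)."""
    m = sum(xs) / len(xs)
    sd = math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))
    return [(x - m) / sd for x in xs]

def effectiveness(performance, time):
    """E = (p_z - t_z) / sqrt(2): positive means above-average
    performance for the time invested."""
    return [(p - t) / math.sqrt(2)
            for p, t in zip(zscores(performance), zscores(time))]

perf = [10, 20, 30, 40]          # correctly answered problems
secs = [1200, 1100, 1500, 1400]  # time spent on problems
print([round(e, 2) for e in effectiveness(perf, secs)])  # → [-0.43, 0.5, -0.5, 0.43]
```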
For this analysis, we used data from students who attempted at least 5 problems. Although we did not find a significant difference in the effectiveness scores between the OSM and OSSM groups, differences were found in the change of effectiveness scores between the groups from Part 1 to Part 2. A repeated-measures analysis of variance with both group (OSM/OSSM) and gender as factors showed that the main effect of time (Part 1, Part 2) is significant (F(1, 40) = 27.02, p < .001). The within-subject test indicates that the interaction of time and group is also significant (F(1, 40) = 4.72, p = .036): in Part 2, the effectiveness scores of the OSSM group (M = 0.18, SE = .426) were higher than those of the OSM
We compute the total number of activities by summing problem attempts, distinct examples, and animations displayed. A
step-wise regression performed on the total activity in Part 2
showed that total activity in Part 1 (β = 0.515, SE = .115)
and social group factor (β = 30.431, SE = 12.53) are both
significant predictors, p <.001 and p=.019, respectively. Being in the OSSM group means an increase of about 30 activities, as compared to being in the OSM group. Other factors
157
IUI 2016 • Personalization
March 7–10, 2016, Sonoma, CA, USA
group (M = −2.81, SE = .389). Also, the interaction of
gender and group was marginally significant (F (1, 40) = 3 :
59, p = .065), male students in the OSSM group had higher
efficiency scores (M = 0.12, SE = 0.40) than male students
in the OSM group (M = −0.48, SE = 0.37) during Part 2.
In general, we observed a tendency to decrease the effectiveness scores from Part 1 to Part 2, except for male students in
the OSSM group.
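A minimal sketch of the effectiveness computation described above, assuming (as in the efficiency measure of [21]) that the zero-effectiveness line is pz = tz, so the signed distance reduces to (pz − tz)/√2; the student data below are invented for illustration:

```python
import math
from statistics import mean, stdev

def z_scores(values):
    """Standardize a list of values (sample standard deviation)."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

def effectiveness(performance, time_spent):
    """Signed distance from (pz, tz) to the zero-effectiveness line pz = tz.

    Positive scores mean above-average performance for the time invested.
    """
    pz, tz = z_scores(performance), z_scores(time_spent)
    return [(p - t) / math.sqrt(2) for p, t in zip(pz, tz)]

# Hypothetical data: correctly answered problems and minutes spent, per student.
correct = [12, 30, 22, 8]
minutes = [95, 60, 80, 120]
for student, e in enumerate(effectiveness(correct, minutes)):
    print(student, round(e, 2))
```

The student who answered the most problems in the least time gets the highest score; by construction the scores are centered on zero.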
Effects on navigational patterns
So far, we have provided a high-level analysis of the intelligent interface in the Mastery Grids system, showed the value of OSSM for engagement, and explored its impact on student performance. In this section, we extend our previous analysis to the micro level by examining students' navigation patterns, with the goal of understanding how the sequence of students' clicks on Mastery Grids cells differs between the OSSM and OSM interfaces.

We started by classifying students' moves from the current cell to the next cell in the Mastery Grids system into four groups: (1) within-topic, a move to a cell in the same topic; (2) next-topic, a move from a cell in a topic to a cell in the next topic (according to the sequence of topics in the course); (3) jump-forward, a jump to a cell in a topic that is two or more steps ahead of the current topic; and (4) jump-backward, a jump to a cell in an earlier topic. A within-topic or next-topic move represents sequential navigation, while a jump-forward or jump-backward move represents a non-sequential navigation pattern. We then computed the ratio of non-sequential navigation for 58 students who had at least one attempt in both Part 1 and Part 2 of the study, and excluded 4 students whose non-sequential navigation scores were more than 2 × SD from the mean. To ensure that these ratios are stable, we considered only 39 students (OSM/OSSM: 20/19) with at least 5 problem attempts in Part 1 and at least 10 problem attempts in Part 2 (the smaller threshold in Part 1 is due to the shorter length of that part).

To explore the differences in student patterns between the OSM and OSSM groups, we conducted a repeated-measures ANOVA on the non-sequential ratios (Part 1 and Part 2) for the OSM/OSSM groups. There was no difference in the non-sequential patterns between the OSM and OSSM groups, F(1, 37) = 0.50, p > .05. Similarly, there was no difference in those patterns before versus after social features were enabled in the OSSM, that is, between Part 1 and Part 2 of the study, F(1, 37) = 1.61, p > .05. However, there was a marginally significant interaction between time (Part 1, Part 2) and group (OSM, OSSM), F(1, 37) = 3.01, p < .10. More specifically, in the OSM group, the ratio of non-sequential navigation was higher in Part 2 (M = 0.16, SD = 0.08) than in Part 1 (M = 0.11, SD = 0.08).

To further investigate the relationship between the non-sequential patterns and the amount of learning, we performed a step-wise regression to predict students' normalized learning gain from the following factors: total number of activities on problems and examples, pretest group (low or high by median split), gender, group (OSM/OSSM), and the total ratio of non-sequential navigation patterns over the whole semester. The results of the regression showed that only the total ratio of non-sequential navigation made a significant contribution to the learning gain (β = 0.66, SE = 0.28, p < .03). The non-sequential navigation patterns also explained a significant proportion of variance in the learning gain, R² = .20, F(2, 30) = 3.65, p < .04.

Putting these results together, the non-sequential patterns increased more in the OSM group than in the OSSM group. This could be due to the social nature of the OSSM, which makes students more conservative in their navigation: they tend to sequentially follow their peers rather than browsing the content space on their own, which is often a non-sequential process. More interestingly, there was a positive association between non-sequential navigation patterns and learning gain: those who had a higher proportion of non-sequential patterns gained more knowledge. Although the two groups (OSM and OSSM) did not differ in learning gain, this suggests that students in the social group might gain more knowledge if other adaptive features, such as individual or personalized guidance, were added to the social interface. Future studies should be conducted to investigate this hypothesis.

The role of the Achievement-Goal orientation
We now analyze how students' motivational profiles, as characterized by their Achievement-Goal orientation [11], affect overall engagement in the system and learning effectiveness. For each student, the responses to the Achievement-Goal questionnaire were grouped and averaged by factor. As a result, each student has four scores (Mastery-Approach, Mastery-Avoidance, Performance-Approach, and Performance-Avoidance) for each of the three moments in the term at which the questionnaire was administered: initial, middle, and final measures.

We first analyze differences in usage between low and high achievement-goal students by classifying students using the median of each of the achievement-goal factors. We used the middle Achievement-Goal measures (taken at the middle of the term) because they are more representative of the students' goal orientation while using the system. Only the 54 students who answered the questionnaire and gave consent are included in these analyses. We found a clear difference in usage only between low and high Mastery-Approach students; other classifications did not show significant differences. Students in the high Mastery-Approach group had a higher level of activity in all the measures during Part 1. In Part 2, this effect disappeared: a 2x2 ANOVA on usage statistics in Part 2 with the Mastery-Approach factor (low/high) and the group (OSM, OSSM) did not result in any significant main effect or interaction. This suggests that the motivational effect that social features have on the increase of activity observed in the OSSM group might be of another nature than Mastery orientation, and is perhaps related to Performance orientation or a change of Performance orientation during the term.
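A minimal sketch of the median-split grouping used in these analyses (the function name and scores are ours, not from the study; here, students exactly at the median fall in the low group):

```python
from statistics import median

def median_split(scores: dict[str, float]) -> dict[str, str]:
    """Label each student 'low' or 'high' relative to the group median."""
    cut = median(scores.values())
    # Convention: values at or below the median are 'low'.
    return {s: ("high" if v > cut else "low") for s, v in scores.items()}

# Hypothetical mid-term Mastery-Approach scores (averages on a 1-5 Likert scale).
mastery_approach = {"s1": 4.3, "s2": 2.7, "s3": 3.9, "s4": 3.1, "s5": 4.8}
print(median_split(mastery_approach))
```

How ties at the median are assigned is a design choice; either convention works as long as it is applied consistently across factors.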
Now, we look at the relations between Achievement-Goal orientation and the proportion of non-sequential navigation patterns (see the previous section for details about the non-sequential navigation measures). We consider the Achievement-Goal measures taken in the middle of the term. Only students who answered the questionnaire, gave consent, and made at least 5 problem attempts in Part 1 and at least 10 in Part 2 are considered, resulting in 29 students: 15 in the OSM group and 14 in the OSSM group. A significant negative correlation between the proportion of non-sequential patterns and the Mastery-Approach orientation score was found, ρ = −.378, p = .043, N = 29. This suggests that highly motivated students are more sequential in their navigation patterns. A significant negative correlation was also found between the change in the proportion of non-sequential patterns (Part 2 − Part 1) and Mastery-Approach level, ρ = −.429, p = .02, N = 29. When looking at the OSM and OSSM groups separately, the negative correlation between the change in non-sequentiality and the Mastery-Approach orientation is stronger in the OSSM group, ρ = −.62, p = .018, N = 14, and is not significant in the OSM group. These results suggest that more motivated students become more sequential in their navigation patterns after being exposed to social features.

Step-wise regression analyses showed that other factors, including goal orientation, gender, and social factors, were not significant predictors of the non-sequentiality proportion in Part 2. However, a repeated-measures ANOVA on the proportion of non-sequential patterns in Part 1 and Part 2 found a significant interaction of Time and Mastery-Approach group (low/high), F(1, 27) = 8.135, p = .008, ηp² = .232. Figure 7 shows the different patterns of change in the proportion of non-sequential navigation among low and high Mastery-Approach students.

Figure 7. Contrast of the change of the proportion of non-sequential patterns among students in the low and high groups of Mastery-Approach orientation.

We now analyze the change of Achievement-Goal orientation and its interaction with the Social factor. Since the literature on Achievement-Goal suggests that Mastery and Performance orientations can be influenced by environmental factors [10], we are interested in whether the use of a system with social comparison features, which are performance-oriented features, has any impact on the change of students' goal orientation.

We performed a repeated-measures ANOVA on the initial and final measures of Achievement-Goal orientation between the OSM and OSSM groups. A significant effect of Time (initial, final) exists for all four achievement-goal factors, which all consistently decreased during the term. A significant interaction was found between the Performance-Approach orientation and the Social factor, F(1, 50) = 7.506, p = .009, ηp² = .131. Students in the OSSM group showed a smaller decrease in Performance-Approach level than students in the OSM group (Figure 6). These results suggest either that students who did not decrease their Performance-Approach orientation became engaged by the social comparison features, or that the social comparison features positively influenced students' Performance orientation. Both explanations have support in the achievement-goal literature, and further research is needed to establish a causal relationship. It is interesting to highlight that the Social factor showed neither an interaction effect nor a main effect on the change of the other Achievement-Goal factors, such as Mastery-Approach orientation. Even if the social comparison features foster performance orientation, they do not negatively influence mastery orientation.

Figure 6. Different decrease of Performance-Approach orientation of students in the OSM and OSSM groups.

Effectiveness scores (see Section "Effect in performance and instructional effectiveness") measured in Part 1 and Part 2 did not show any significant correlations with the (middle) Achievement-Goal measures. A repeated-measures ANOVA on the effectiveness scores did not show any significant effect of, or interaction with, any Achievement-Goal group (low/high). Regression analyses on effectiveness scores also did not find any significant factors.

The value of peer-level comparisons
The Mastery Grids interface has a number of optional social features that were enabled for the students in the OSSM group: changing the comparison mode through the group selector menu (number 7 in Figure 1) and loading the progress of other students in the class (number 4 in Figure 1) by clicking on the 'load the rest of learners' button. We analyzed Mastery Grids logs to see how much students clicked on these features. Overall, of the 34 OSSM students who logged in to the system in Part 2, 18 clicked at least once on the group menu or the load-learners button, with a median of 2.5 clicks (Mean = 4, Max = 20). Although students used the social features comparatively few times, there was a statistically significant positive correlation between the number of clicks on these features and overall engagement (ρ = 0.41, p < .02): those who clicked on these features more also had more activities on problems and examples. Of course, this correlation does not imply causation. The results of the regression analysis also showed that the number of clicks on social features had a significant positive impact on the number of activities (β = 5.87, SE = 2.25, p < .02), explaining about 18% of the variance (R² = 0.18, F(1, 32) = 6.84, p < .02).
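The association above is a rank correlation; a textbook Spearman implementation (classic d² formula, no tie handling) illustrates the computation on invented data, not the study's:

```python
def rank(values):
    """Rank values from 1 (smallest); assumes no ties for simplicity."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    for r, i in enumerate(order, start=1):
        ranks[i] = float(r)
    return ranks

def spearman_rho(x, y):
    """Spearman rank correlation: rho = 1 - 6*sum(d^2) / (n*(n^2 - 1))."""
    n = len(x)
    d2 = sum((rx - ry) ** 2 for rx, ry in zip(rank(x), rank(y)))
    return 1 - 6 * d2 / (n * (n**2 - 1))

# Hypothetical per-student counts: clicks on social features vs. total activities.
social_clicks = [0, 1, 2, 4, 5, 8, 12, 20]
activities = [15, 22, 18, 40, 55, 38, 70, 90]
print(round(spearman_rho(social_clicks, activities), 2))
```

With tied values (common in click counts), ranks would need to be averaged over ties, as statistical packages do.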
THE IMPACT OF INTERFACE COMPLEXITY
To see the value of the resource selector component in the Mastery Grids system, we looked into the logs of the 45 students in the resources-on (R) and social-resources (SR) groups. In both groups, students were able to switch from the default overall view, which showed the average progress over all content in one row, to the full view, which showed the progress of different content types in separate rows. For example, in the social-resources (SR) group, the full and overall views were similar to those shown in Figures 1 and 3, respectively.

In general, only 11 (24%) students clicked on the resource selector menu, and they did not use it extensively. The average number of clicks on this menu was 1.54 (SD = 5.33, Max = 33). However, the number of clicks on the resource menu was positively correlated with the number of activities that students viewed, ρ = 0.29, p < .06: those who clicked more on the resource menu had more problem attempts or example views. This suggests that the presence of a feature alone does not increase students' tendency to use it, and that there is a clear trade-off between having a complex interface with different views (the full vs. overall view) and the amount of engagement. Some students might become reluctant to continue working with a complex interface; on the other hand, those who start using such an interface might find it helpful and become even more engaged.

SURVEY ANALYSIS
A usability and usefulness questionnaire was designed to survey student opinion about the system and was administered at the end of the term in both classroom studies. The questionnaire has two main sections. Section 1 has 11 questions that refer to the overall usability and usefulness of the OSM features of the system; all surveyed students answered this section. Section 2 contains 14 questions about the usefulness and usability of the OSSM features and was answered only by the OSSM group. All questions use a 5-point Likert scale (Strongly Disagree to Strongly Agree). For Section 1, the middle point of the scale (3) is labeled "No Opinion"; in Section 2, the middle point is labeled "Did not notice". The questions, with their mean and standard error in parentheses, can be seen in Table 4.

Table 4. Questions included in the Mastery Grids Survey. Some of the questions refer to images contained in the survey that are not reproduced here.

Section 1 questions | M (SE)
1. In general, it was useful to see my progress in Mastery Grids | 3.84 (0.11)
2. In general, I liked the interface of Mastery Grids | 3.49 (0.13)
3. Seeing my progress in the tool motivated me to work on quizzes and examples | 3.60 (0.13)
4. The interface helped me to understand how the class content is organized | 3.86 (0.12)
5. The interface helped me to identify my weak points | 3.43 (0.14)
6. The interface helped me to plan my class work | 2.89 (0.13)
7. It was clear how to access problems and examples | 3.87 (0.12)
8. It was useful to see my knowledge progress for each topic (figure 1, A) | 3.75 (0.13)
9. It was useful to see how I am doing with individual quizzes and examples (figure 1, B) | 3.73 (0.13)
10. The timeline (figure 1, C) was useful | 3.79 (0.11)
11. Using green colors in different intensity to show my progress was easy to understand | 3.84 (0.14)

Section 2 questions | M (SE)
1. It is important for me to see the progress of the rest of the class | 3.44 (0.19)
2. It was useful to see the progress of the whole class as it is represented in the Group row in Mastery Grids (A) | 3.50 (0.18)
3. It was useful to see the progress of the top students as it is represented in the Group row in Mastery Grids (A) | 3.30 (0.19)
4. The comparison between the group and myself (B) is easy to understand | 3.61 (0.18)
5. It was useful to see the comparison between the selected group and myself (B) | 3.33 (0.18)
6. It is important for me to see the progress of individual classmates (C) | 3.30 (0.20)
7. In general, it is useful for me to be able to compare my progress with the progress of others | 3.34 (0.22)
8. It is important for me to see my position in the class (D) | 2.18 (0.17)
9. Visualizing the progress of others using blue colors of different intensities was easy to understand | 3.44 (0.22)
10. Viewing my classmates' progress motivated me to work more in quizzes and examples | 3.53 (0.20)
11. I think it would be useful for me to know the names of individual classmates in (C) | 3.94 (0.17)
12. Viewing that others were more advanced than me made me want to quit using Mastery Grids | 2.71 (0.24)
13. If names are shown, I will not like to show my name in the list to others | 3.91 (0.18)
14. Sometimes I just checked quizzes and examples to catch up with the progress of others rather than to learn more | 2.79 (0.24)

Figure 8 shows the distribution of answers. For the analysis, and in the figure, questions are grouped into usefulness, usability, and motivation questions. In Section 1, questions 1, 4, 5, 6, 8, 9, and 10 are about the usefulness of the OSM system; questions 2, 7, and 11 are about usability; and question 3 is about motivation. In Section 2, questions 1, 6, 7, and 8 are general questions about the importance students give to being able to compare with or see others; questions 2, 3, and 5 are about the usefulness of OSSM features; questions 4 and 9 are about usability; and questions 10, 12, and 14 are about motivation with respect to OSSM features. Questions 12 and 14 have reversed scales, but are not shown reversed in Figure 8. Finally, questions 11 and 13 ask students about the idea of showing peer names. Some of the questions refer to figures showing the features that were included in the questionnaire and are not reproduced here.

Figure 8. Distribution of responses of Sections 1 and 2 of the Mastery Grids Survey.

Among surveyed students, 63 answered Section 1 and between 20 and 26 answered Section 2 (only within the OSSM group). In general, the evaluation of the OSM interface (Section 1) is positive in terms of both usability and usefulness. Students also agreed that Mastery Grids motivated them to work on problems and examples. The lowest score was given to question 6 ("The interface helped me to plan my class work"), which might have been interpreted as referring to the overall class work plan; we may need to re-phrase the question to stress the idea of planning the work within the system. We further averaged the usefulness, usability, and motivation questions and tested for differences between low and high Achievement-Goal orientation (using the middle measure of the achievement-goal survey). A Kruskal-Wallis non-parametric test found a significant difference in usefulness responses between low and high Mastery-Approach students, χ² = 5.699, p = .017. Other scores (such as usability and motivation) and other achievement-goal measures did not show significant differences. High Mastery-Approach students were more positive about the usefulness of the system. A similar analysis was performed for Section 2 within the OSSM group. Questions were classified and four scores computed: value (questions 1, 6, 7, and 8), usefulness (questions 2, 3, and 5), usability (questions 4 and 9), and motivation (questions 10, 12 reversed, and 14 reversed). A Kruskal-Wallis non-parametric test found a significant difference in the value score, χ² = 4.071, p = .044, between Mastery-Approach groups: high Mastery-Approach students value the interface more. A marginally significant difference (p = .071) was also found for the usability score, with the same trend: high Mastery-Approach students find the system more usable.

CONCLUSIONS
This paper presented the Mastery Grids system, an intelligent interface for accessing several kinds of practice content for an introductory programming course. To address several known obstacles to the broader use of valuable practice content, we combined an open learner model, adaptive navigation support, and social comparison technologies. Based on an analysis of the literature on learning motivation and self-regulated learning, we expected the OSM and social comparison features to contribute different motivational dimensions to the tool.

The analyses of data from several classroom studies confirmed the overall engagement impact of our interface and provided important insights about specific interface components, such as social comparison and content-level progress visualization. Most importantly, the study demonstrated a significant positive impact of the navigation-oriented social comparison interface on student engagement, efficiency, and effectiveness. A more detailed analysis also revealed interesting interactions of the social comparison interface with motivational factors. While all achievement-goal orientations tend to decrease during the term, students exposed to (and engaged by) the social features of the system do not decrease their performance orientation. Student feedback indicated that students find the OSM and OSSM features useful and usable, and that highly motivated students have more positive opinions of these features. These results provide sufficient grounds to recommend further exploration and practical use of social comparison technology as a component of interfaces for accessing practice content.

On the other hand, the study provided some evidence that social comparison could negatively affect the diversity of student navigation, causing students to behave more alike one another. To decrease this unifying effect of social comparison while retaining its engagement value, it might be advisable to combine social comparison with personalized recommendation technologies; our pilot study of this combination brought positive results [12]. In future work, we plan to further explore the value of OSM and OSSM in the context of practice content access, examining different kinds of OLM, different approaches to presenting social comparison, and different combinations with other navigation support technologies.

ACKNOWLEDGMENTS
This work is supported by the Advanced Distributed Learning (ADL) Initiative (contract W911QY-13-C-0032). The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies, either expressed or implied, of the Department of Defense.

REFERENCES
1. Ames, C. Classrooms: Goals, structures, and student motivation. Journal of Educational Psychology 84, 3 (1992), 261.
2. Bandura, A. Social Foundations of Thought and Action: A Social Cognitive Theory. Prentice-Hall, 1986.
3. Brusilovsky, P., Edwards, S., Kumar, A., Malmi, L., Benotti, L., Buck, D., Ihantola, P., Prince, R., Sirkiä, T., Sosnovsky, S., Urquiza, J., Vihavainen, A., and Wollowski, M. Increasing adoption of smart learning content for computer science education. In Proceedings of the Working Group Reports of the 2014 Innovation & Technology in Computer Science Education Conference, ACM (2014), 31–57.
4. Brusilovsky, P., Somyürek, S., Guerra, J., Hosseini, R., and Zadorozhny, V. The value of social: Comparing open student modeling and open social student modeling. In User Modeling, Adaptation and Personalization. Springer, 2015, 44–55.
5. Brusilovsky, P., and Sosnovsky, S. Individualized exercises for self-assessment of programming knowledge: An evaluation of QuizPACK. ACM Journal on Educational Resources in Computing 5, 3 (2005), Article No. 6.
6. Brusilovsky, P., and Yudelson, M. From WebEx to NavEx: Interactive access to annotated program examples. Proceedings of the IEEE 96, 6 (2008), 990–999.
7. Bull, S., and Kay, J. Student models that invite the learner in: The SMILI:() open learner modelling framework. International Journal of Artificial Intelligence in Education 17, 2 (2007), 89–120.
8. Bull, S., and Kay, J. Open learner models. In Advances in Intelligent Tutoring Systems. Springer, 2010, 301–322.
9. Bull, S., and Kay, J. Open learner models as drivers for metacognitive processes. In International Handbook of Metacognition and Learning Technologies. Springer, 2013, 349–365.
10. Ciani, K. D., Middleton, M. J., Summers, J. J., and Sheldon, K. M. Buffering against performance classroom goal structures: The importance of autonomy support and classroom community. Contemporary Educational Psychology 35, 1 (2010), 88–99.
11. Elliot, A. J., and Murayama, K. On the measurement of achievement goals: Critique, illustration, and application. Journal of Educational Psychology 100, 3 (2008), 613.
12. Hosseini, R., Hsiao, I.-H., Guerra, J., and Brusilovsky, P. What should I do next? Adaptive sequencing in the context of open social student modeling. In Design for Teaching and Learning in a Networked World. Springer, 2015, 155–168.
13. Hsiao, I.-H., Bakalov, F., Brusilovsky, P., and König-Ries, B. Progressor: Social navigation support through open social student modeling. New Review of Hypermedia and Multimedia 19, 2 (2013), 112–131.
14. Hsiao, I.-H., Sosnovsky, S., and Brusilovsky, P. Guiding students to the right questions: Adaptive navigation support in an e-learning system for Java programming. Journal of Computer Assisted Learning 26, 4 (2010), 270–283.
15. Levy, R. B.-B., Ben-Ari, M., and Uronen, P. A. The Jeliot 2000 program animation system. Computers and Education 40, 1 (2003), 1–15.
16. Lindstaedt, S. N., Beham, G., Kump, B., and Ley, T. Getting to know your user: Unobtrusive user model maintenance within work-integrated learning environments. In 4th European Conference on Technology Enhanced Learning (EC-TEL 2009), U. Cress, V. Dimitrova, and M. Specht, Eds., vol. 5794 of Lecture Notes in Computer Science, Springer-Verlag (2009), 73–87.
17. Loboda, T. D., Guerra, J., Hosseini, R., and Brusilovsky, P. Mastery Grids: An open source social educational progress visualization. In Open Learning and Teaching in Educational Communities. Springer, 2014, 235–248.
18. Long, Y., and Aleven, V. Active learners: Redesigning an intelligent tutoring system to support self-regulated learning. In 8th European Conference on Technology Enhanced Learning (EC-TEL 2013), D. Hernandez-Leo, T. Ley, R. Klamma, and A. Harrer, Eds., vol. 8095 of Lecture Notes in Computer Science (2013), 490–495.
19. Mitrovic, A., and Martin, B. Evaluating the effect of open student models on self-assessment. International Journal of Artificial Intelligence in Education 17, 2 (2007), 121–144.
20. O'Keefe, P. A., Ben-Eliyahu, A., and Linnenbrink-Garcia, L. Shaping achievement goal orientations in a mastery-structured environment and concomitant changes in related contingencies of self-worth. Motivation and Emotion 37, 1 (2013), 50–64.
21. Paas, F. G., and Van Merriënboer, J. J. The efficiency of instructional conditions: An approach to combine mental effort and performance measures. Human Factors: The Journal of the Human Factors and Ergonomics Society 35, 4 (1993), 737–743.
22. Papanikolaou, K. A., Grigoriadou, M., Kornilakis, H., and Magoulas, G. D. Personalising the interaction in a web-based educational hypermedia system: The case of INSPIRE. User Modeling and User-Adapted Interaction 13, 3 (2003), 213–267.
23. Schunk, D. H. Goal setting and self-efficacy during self-regulated learning. Educational Psychologist 25, 1 (1990), 71–86.
24. Sirkiä, T., and Sorva, J. How do students use program visualizations within an interactive ebook? In Proceedings of the Eleventh Annual International Conference on International Computing Education Research, ICER '15, ACM (New York, NY, USA, 2015), 179–188.
25. Sosnovsky, S., and Brusilovsky, P. Evaluation of topic-based adaptation and student modeling in QuizGuide. User Modeling and User-Adapted Interaction 25, 4 (2015), 371–424.
26. Spacco, J., Hovemeyer, D., Pugh, W., Emad, F., Hollingsworth, J. K., and Padua-Perez, N. Experiences with Marmoset: Designing and using an advanced submission and testing system for programming courses. In 11th Annual Conference on Innovation and Technology in Computer Science Education, ITiCSE 2006, M. Goldweber and P. Salomoni, Eds., ACM Press (2006), 13–17.
27. Wolters, C. A., Shirley, L. Y., and Pintrich, P. R. The relation between goal orientation and students' motivational beliefs and self-regulated learning. Learning and Individual Differences 8, 3 (1996), 211–238.
28. Zimmerman, B. J. Self-regulated learning and academic achievement: An overview. Educational Psychologist 25, 1 (1990), 3–17.
29. Zimmerman, B. J. Self-regulating academic learning and achievement: The emergence of a social cognitive perspective. Educational Psychology Review 2, 2 (1990), 173–201.