SSEM Validation
Psychology in the Schools, 00(0), 2013
© 2013 Wiley Periodicals, Inc.
This article proposes a model of student school engagement, comprising aspirations, belonging,
and productivity. From this model, items for the Student School Engagement Measure (SSEM)
were developed. The SSEM was validated with data from 396 eighth graders in an urban school
district. Utilizing structural equation modeling, the second-order empirical model of the SSEM
was found to fit the data well, to have good reliability for the three factors, and to be predictive
of district-identified risk factors and state standardized academic assessment results. These results
suggest that the Student School Engagement Model and the SSEM may be useful tools for
understanding which students might be at increased risk for school dropout and how to intervene
to support school completion. Recommendations for practitioners and future research are given.
School psychologists and educators are increasingly interested in understanding the processes
by which students become engaged in, or disengaged from, school. A growing body of research
demonstrates the tragic outcomes of disengagement, as well as the possibilities for reshaping this
malleable construct by building on student strengths and contextual protective factors (Archambault,
Janosz, Fallu, & Pagani, 2009; Furlong & Christenson, 2008; National Research Council, 2004). This
article contributes to the abilities of school psychologists to provide strengths-based, population-
wide, data-driven services by reviewing the current engagement models, proposing a new model of
engagement, and providing the findings from the validation of a screening instrument to measure
engagement of secondary school students.
This research was partially funded by a grant from the University of Denver. The data presented are based on
the analysis in the doctoral dissertation of Dr. Vazirabadi. Thanks to Jennifer Albanes, Kelli Pfaff, Susan McDonald,
Christina Jack, and Rachel Wonner for their assistance in this work.
Correspondence to: Cynthia E. Hazel, Child, Family, and School Psychology, Morgridge College of Education,
University of Denver, 1999 East Evans Street, Denver CO 80208. E-mail: chazel@du.edu
The underlying premises of all definitions of engagement are that it is plastic (and therefore can be
changed and increased) and that higher levels of engagement result in improved academic perfor-
mance and increased likelihood of school completion for the student.
However, many nongraduates begin the process of disengaging and ultimately dropping out of
school much earlier than ninth grade (Finn, 1989; National Research Council, 2004). Identifying
students with low engagement, prior to high school, may provide opportunities to alter trajectories
and increase the probability that these students will complete their schooling.
MEASUREMENT OF ENGAGEMENT
The evolving terminology and definitions of engagement present a number of challenges to
the measurement of the construct (Appleton et al., 2008), not least of which is the differentiation
between a student’s engagement and the variables in the environment that support or hinder a
student from engaging in his or her schooling. Sometimes indicators of engagement (such as
attendance, credits earned, or academic competence) and facilitators of engagement (contextual
factors, such as discipline policies, parental supervision, and peer attitudes) are confused with
engagement (Furlong & Christenson, 2008). Although indicators and facilitators can be expected to
correlate with engagement, they do not capture a student’s engagement with school. For instance,
two students could have low attendance rates; for one student, this could be due to low engagement,
and for the other student, it could be due to illness. Both students are at increased risk for not
completing high school with their cohort should their attendance patterns continue; however, the
appropriate interventions to increase their attendance would be different.
Two models of student school engagement have been empirically validated. These models,
referred to in this article as the School Engagement Model and the Student Engagement Model, will
be reviewed, and a third model, termed the Student School Engagement Model, will be proposed.
METHOD
Participants
The participants were 396 eighth graders at three middle schools (73% of all eighth-grade
students at these schools) within an urban district in the central mountain region of the United States.
All questionnaire data were screened to determine the presence of outliers and missing data. Three
cases (1.4%) were missing more than 10% of the data; thus, they were eliminated from the analysis.
An additional 4 students were eliminated because they provided inaccurate identification and could
not be matched with records provided by the school district. After elimination of these students, the
sample size was 388 students; of these, 98% had 100% complete protocols. According to the district
records, most (80%) of the students were identified as Hispanic, 56% were male, 69% qualified for
free or reduced-price lunches, 18% qualified for English-language services, 6% had been identified
as gifted and talented, and 8% qualified for special education services.
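As an illustration of the data-screening rules just described, the following is a minimal sketch in Python, assuming the survey responses and district records live in two hypothetical CSV files; the file and column names are placeholders rather than artifacts from the study.

```python
# Hypothetical illustration of the missing-data and ID-matching screens described above.
import pandas as pd

responses = pd.read_csv("ssem_responses.csv")    # hypothetical: one row per student, item_1 ... item_50
district = pd.read_csv("district_records.csv")   # hypothetical: district-provided student records

item_cols = [c for c in responses.columns if c.startswith("item_")]

# Drop cases missing more than 10% of the SSEM items.
missing_rate = responses[item_cols].isna().mean(axis=1)
responses = responses.loc[missing_rate <= 0.10]

# Drop cases whose self-reported ID cannot be matched to a district record.
responses = responses.loc[responses["student_id"].isin(district["student_id"])]

print(f"Analytic sample: {len(responses)} students")
```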
Table 1
Student School Engagement Measure Initial Items, Items Remaining in the Empirical Model, and Their Theorized and Empirical Factor Placements
Note. A = Aspirations; B = Belonging; P = Productivity. Items in bold were retained in the empirical validation; reverse-coded items are marked.
Student Outcome Measures. Existing district data were utilized to determine the extent to
which the SSEM domains were related to student outcome measures. The first outcome measure
was a categorized risk score, based on the eighth-grade behaviors of poor attendance (80% or lower),
discipline infractions (one suspension or more), failed language arts course, and failed math course.
A score of 0 indicated meeting none of the risk categories, whereas a 4 indicated the presence of
all four risk behaviors. The presence of these behaviors in eighth grade had been shown within the
district to indicate an increased risk of the student not graduating from high school in 4 years. The
second set of dependent variables was eighth-grade achievement on the state standardized academic
assessment of math, science, reading, and writing.
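For concreteness, the composite can be expressed as a simple sum of four binary indicators; the sketch below uses made-up data and hypothetical column names.

```python
# Hypothetical illustration of the district composite risk score (0 = no risk
# categories met, 4 = all four present), using made-up data.
import pandas as pd

records = pd.DataFrame({
    "attendance_rate":  [0.95, 0.78, 0.82],
    "suspensions":      [0, 1, 0],
    "failed_lang_arts": [False, True, False],
    "failed_math":      [False, True, True],
})

records["risk_score"] = (
    (records["attendance_rate"] <= 0.80).astype(int)   # poor attendance (80% or lower)
    + (records["suspensions"] >= 1).astype(int)        # one suspension or more
    + records["failed_lang_arts"].astype(int)          # failed language arts course
    + records["failed_math"].astype(int)               # failed math course
)
print(records["risk_score"].tolist())   # [0, 4, 1] for the made-up rows
```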
Table 2
Correlations, Descriptive Statistics, and Cronbach's Alpha for the SSEM Empirical Factors
Note. SSEM = Student School Engagement Measure. Factor scores range from 1 to 10; marked correlations are significant at p < .01.
RESULTS
The following section discusses the psychometric properties of the various structural models of
the SSEM and then provides a model of the empirically supported SSEM with the student outcome
data. See Table 1 for a list of all items in the protocol, those that remained in the empirical model,
and the theoretical and empirical domains of the items.
Reliability of Items
Four items (Items 1, 34, 35, and 46) were excluded because, prior to their deletion, the internal
consistency coefficients for their respective subdomains were less than .70. Two items (Items 23 and 26)
did not load significantly onto the Student School Engagement construct and were eliminated. This
left 44 reliable items. For models with domains (the first- and second-order models), items that
cross-loaded onto more than one domain were eliminated using modification indices; this led to the
elimination of Items 2, 4, 10, 24, 36, 40, and 50. For the first- and second-order models, 37 reliable
items remained.
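The internal consistency coefficient used for this screening is Cronbach's alpha; a small sketch follows, with the commented item grouping drawn from the empirical Belonging factor shown in Figure 2 but the column names treated as placeholders.

```python
# Sketch of Cronbach's alpha, the internal consistency coefficient used to screen
# subdomains (items were dropped when a subdomain's alpha fell below .70).
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    items = items.dropna()
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Hypothetical usage on one subdomain's item columns:
# belonging = responses[["item_5", "item_6", "item_25", "item_33", "item_42", "item_48"]]
# if cronbach_alpha(belonging) < 0.70:
#     print("Subdomain alpha below .70 -- flag its items for review")
```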
One factor comprised items reflecting students' identification with and pride in their school (i.e., "I feel
like a part of my school" and "I am proud to be a student at this school"); this factor was labeled
Belonging. See Table 3 for the retained items with their factor loadings.
Table 3
Factor Loadings of Empirically Validated Items
Item   Factor Loading
3      .52
5      .52
8      .54
9      .80
12     .50
13     .50
16     .47
17     .69
18     .71
19     .80
20     .54
21     .57
25     .78
27     .51
28     .63
29     .55
30     .76
31     .50
33     .52
42     .67
45     .46
48     .79
Comparison of Models
Table 4 lists the fit indices for the single-factor model, the theoretical model, and the empirically
derived model.
Single-Factor Model. First explored was a single-factor model; see Table 4 for the fit indices.
The comparative fit index (CFI) was below the acceptable criterion of .90 or greater, the normed
chi-square was above the acceptable range of 2 to 3, and the root mean square error of approximation
(RMSEA) was on the cusp of the acceptable range (below .08; Joreskog & Sorbom, 1984; Kline,
2005). Although all path coefficients were statistically significant and in the predicted direction, in
total, the single-factor solution indicated poorer fit than did the models next explored.
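These criteria can be summarized in a short check; the sketch below takes the thresholds from the text (normed chi-square of 2 to 3, CFI of .90 or greater, RMSEA below .08) and uses purely hypothetical input values, since the indices themselves come from the SEM software.

```python
# Sketch of the fit-criteria check; chi-square, df, CFI, and RMSEA are assumed to
# come from the SEM software (AMOS in this study).
def evaluate_fit(chi_square: float, df: int, cfi: float, rmsea: float) -> dict:
    normed = chi_square / df                            # normed chi-square
    return {
        "normed_chi_square": round(normed, 2),
        "normed_chi_square_ok": 2.0 <= normed <= 3.0,   # acceptable range per the text
        "cfi_ok": cfi >= 0.90,
        "rmsea_ok": rmsea < 0.08,
    }

# Hypothetical values for illustration only:
print(evaluate_fit(chi_square=1200.0, df=500, cfi=0.91, rmsea=0.06))
```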
First-Order Theoretical Model. The next model tested was the first-order theoretical model,
illustrated in Figure 1. As shown in Table 4, the normed chi-square was within the acceptable
range, the CFI was below the acceptable criterion, and the RMSEA was within the acceptable
range. However, the absolute value of the highest standardized residual was 5.47, which was above
the acceptable limit and higher than for the second-order empirical model. Overall, the fit indices
suggested that the model did not fit the data as well as the empirical model (described next; Joreskog
& Sorbom, 1984).
Table 4
Fit Indices for the Second-Order Empirical, First-Order Theoretical, and Single-Factor Models of the SSEM
Second-Order Empirical Model. The second-order empirical model with standardized coeffi-
cients is illustrated in Figure 2. As seen in Table 4, the model fit the data well. The normed chi-square
was within the acceptable range, the CFI was above the acceptable criterion, and the RMSEA was
within the acceptable range (Joreskog & Sorbom, 1984). All indicator variables loaded highly and
significantly onto their respective constructs. Further, all paths from the second-order construct to
the first-order construct were statistically significant. Four items that had been theorized to belong
in the Aspirations domain were found empirically to belong in the Productivity domain (listed in
Table 1).
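For readers who want to reproduce this structure, the sketch below writes the second-order model in lavaan-style syntax as it might be fit with the Python semopy package (the study itself used AMOS); the item groupings approximate those shown in Figures 2 and 3 and are illustrative, not a definitive specification.

```python
# Hedged sketch of the second-order structure: the three first-order factors are
# measured by their items, and Student School Engagement is the higher-order factor.
import semopy

MODEL_DESC = """
Productivity =~ i3 + i8 + i9 + i13 + i17 + i18 + i19 + i20 + i27 + i28 + i31 + i45
Aspirations  =~ i12 + i21 + i29 + i30
Belonging    =~ i5 + i6 + i25 + i33 + i42 + i48
Engagement   =~ Productivity + Aspirations + Belonging
"""

# Hypothetical usage, assuming `data` has one column per item (i3, i5, ...):
# model = semopy.Model(MODEL_DESC)
# model.fit(data)
# print(semopy.calc_stats(model))   # chi-square, CFI, RMSEA, and other fit indices
```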
Criterion-Related Validity
The purpose of the criterion-related analysis was to validate the second-order empirical model
of the SSEM (the best-fitting model) with the criterion of student outcome measures (the state-
measured academic achievement scores and a district-developed composite risk score). See Figure 3
for a diagram of the model and the strength and direction of relationships. The model fit the
data well. The CFI was above the acceptable criterion of .90. The RMSEA was acceptable at .06
(Joreskog & Sorbom, 1984). The normed chi-square was 2.44, within the acceptable range. All
indicator variables loaded highly and significantly to their respective constructs. Furthermore, the
path coefficients between the SSEM and the student outcomes were statistically significant and in
the predicted directions.
DISCUSSION
Engagement is a malleable construct that may impact students' school achievement. However,
research regarding what constitutes students’ engagement with school, the factors that contribute to
engagement, and how to measure engagement is still emerging. This study provided an alternative
conceptualization of engagement, the Student School Engagement Model, and a means to screen
students’ school engagement.
Once reliability of the SSEM items was established, three statistical models were compared.
The single-factor model did not fit the data well, suggesting that Student School Engagement is better
defined with factors in addition to the global construct. This finding was in keeping with predominant
conceptualizations of engagement (Appleton et al., 2008), as well as many other instruments that
have been developed to measure engagement (Fredricks et al., 2011).
FIGURE 1. Confirmatory factor analysis results for the first-order theoretical model.
FIGURE 2. Confirmatory factor analysis results for the second-order empirical model.
The second-order empirical model (comprising Student School Engagement as the second-order factor
and Aspirations, Belonging, and Productivity as the first-order factors) best fit the data.
This preliminary validation of the SSEM suggests that Student School Engagement is a multifaceted
construct measured through Aspirations, Belonging, and Productivity. This is a different model of
engagement than others, which have primarily divided engagement into some or all of the following
dimensions: affective (or psychological), behavioral, cognitive, and academic (Christenson,
Reschly, & Wylie, 2012; Jimerson et al., 2003; O'Farrell, Morrison, & Furlong, 2006). The reliability
of the SSEM factors ranged from .83 to .92. This is at the upper range of what has been reported
for the SEM and SEI (range of .77 to .92).
FIGURE 3. Structural model of the Student School Engagement Measure, district risk scores, and state standardized academic achievement assessment results.
When the second-order empirical model of Student School Engagement was analyzed for its
relationship to state-level academic achievement data and district-level risk data, SSEM scores were
shown to positively contribute to achievement on the state assessment of academic achievement
(.18) and protect against district-assessed risks (poor attendance, suspensions, and failure in math
or language arts; −.23). These results are in keeping with what has been found with other engage-
ment instruments. For example, correlations between factors on the SEI and middle school student
attendance and GPA ranged from not significant to .32 (Christenson, Reschly, & Wylie, 2012).
Correlations between the SEM and attendance and GPA ranged from not significant to .37 (Finlay,
2006).
The findings from this study suggest that the SSEM could be a useful predictive measure of
student risk behaviors and academic success. Because the SSEM comprises only 22 items, it would be feasible
for practitioners to utilize the SSEM as a screener for students who might be at increased risk of
school disengagement.
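A minimal sketch of what such screening might look like in practice is given below; the cutoff, item names, and data are entirely hypothetical, and any operational cutoff would need to be established locally.

```python
# Hypothetical screening sketch: flag students whose mean SSEM response falls
# below a locally chosen cutoff. The cutoff and data here are illustrative only.
import pandas as pd

def flag_low_engagement(item_scores: pd.DataFrame, cutoff: float = 2.5) -> pd.Series:
    """Return True for students whose mean item response is below `cutoff`."""
    return item_scores.mean(axis=1) < cutoff

scores = pd.DataFrame({"item_a": [4, 2, 5], "item_b": [3, 1, 4], "item_c": [4, 2, 5]})
print(flag_low_engagement(scores))   # only the second student (row 1) is flagged
```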
Limitations
Although the SSEM showed promising preliminary results, there are limitations that must be
considered. First, this sample comprised solely eighth-grade students. The transition from middle to
high school has been shown to be a critical juncture in supporting students so that they may remain
on track for successful high school completion (Bruce et al., 2011), so understanding engagement
of eighth graders is extremely useful. However, it will be important to use the SSEM with students
from other secondary grades and see whether the measure proves as valid and reliable for them.
Further, the student participants were drawn from three schools in the same urban district.
How students in this district might differ in their engagement from students in other districts is
unknown. Because 40% of the families in the studied district report that they speak Spanish in the
home, the protocol was designed so that each item was presented in both English and Spanish.
This allowed students to use either or both languages to understand the items. However, due to
this design, there was no way to assess which language was used by participants and therefore, no
way to know whether the cultural issues associated with linguistic diversity had an impact on the
reliability or validity of the instrument. The students of this sample were predominantly identified
as Hispanic (80%) and poor (69% qualified for free or reduced-price lunches). These students were
representative of the demographics of this district and similar to student populations in other urban
districts, but not representative of the national student population. Poor and minority students in
urban districts are at increased risk for not completing high school (Chapman et al., 2010; Balfanz
et al., 2010), so understanding engagement in this population is critical. However, it will be important
to collect data from students in other districts to understand whether the SSEM is of similar utility
with non-Hispanic students, middle class students, and students in rural and suburban schools.
Another limitation was sample size. According to Costello and Osborne (2005), the sample
size should be large enough to have a subject-to-item ratio of 10 to 1 for exploratory analysis. This
study’s ratio was 8 to 1. Ideally, data are split in half for exploratory and confirmatory analysis. Due
to a relatively small sample size, the sample was not split into two groups for analysis. Future studies
with independent data sets are needed to further validate the model and instrument.
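Such a split is straightforward to implement; a brief sketch follows, assuming scikit-learn and a hypothetical item-level data file.

```python
# Hypothetical sketch of splitting the sample so exploratory and confirmatory
# analyses use independent halves, as recommended above.
import pandas as pd
from sklearn.model_selection import train_test_split

responses = pd.read_csv("ssem_item_scores.csv")   # hypothetical item-level data
efa_half, cfa_half = train_test_split(responses, test_size=0.5, random_state=42)
# Run the exploratory factor analysis on efa_half, then fit the resulting model
# to cfa_half to check that the factor structure replicates.
```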
The SSEM was shown to be predictive of achievement on the state standardized assessment
and district-measured academic risk factors. This suggests that the SSEM is measuring a construct
of importance to students’ academic performance. However, many factors outside the construct
of engagement contribute to academic accomplishments. The SSEM is similar enough to other
measures of students’ engagement with school to suggest that it is measuring engagement, but the
convergent and discriminant validity of the SSEM needs to be empirically established.
REFERENCES
Allensworth, E. M., & Easton, J. Q. (2007). What matters for staying on-track and graduating in Chicago Public High Schools:
A close look at course grades, failures, and attendance in the freshman year. Chicago, IL: Consortium on Chicago School
Research at the University of Chicago.
Analysis of Moment Structures (Version 7) [Computer software]. Armonk, NY.
Appleton, J. J., Christenson, S. L., & Furlong, M. J. (2008). Student engagement with school: Critical conceptual and
methodological issues of the construct. Psychology in the Schools, 45, 369 – 386. doi: 10.1002/pits.20303
Appleton, J. J., Christenson, S. L., Kim, D., & Reschly, A. L. (2006). Measuring cognitive and psychological
engagement: Validation of the Student Engagement Instrument. Journal of School Psychology, 44, 427 – 445.
doi: 10.1016/j.jsp.2006.04.002
Archambault, I., Janosz, M., Fallu, J., & Pagani, L. S. (2009). Student engagement and its relationship with early high school
dropout. Journal of Adolescence, 32, 651 – 670. doi: 10.1016/j.adolescence.2008.06.007
Aud, S., Hussar, W., Kena, G., Bianco, K., Frohlich, L., Kemp, J., & Tahan, K. (2011). The condition of education 2011
(NCES 2011–033). Washington, DC: National Center for Education Statistics.
Balfanz, R., Bridgeland, J. M., Moore, L. A., & Fox, J. H. (2010). Building a grad nation: Progress and challenge in ending
the high school dropout epidemic. Washington, DC: Civic Enterprises, Everyone Graduates Center at Johns Hopkins
University, and America’s Promise Alliance.
Betts, J. E., Appleton, J. J., Reschly, A. L., Christenson, S. L., & Huebner, E. S. (2010). A study of the factorial invariance of
the Student Engagement Instrument (SEI): Results from middle and high school students. School Psychology Quarterly,
25, 84 – 93. doi: 10.1037/a0020259
Bruce, M., Bridgeland, J. M., Fox, J. H., & Balfanz, R. (2011). On track for success: The use of early warning indicator and
intervention systems to build a grad nation. Washington, DC: Civic Enterprises and Everyone Graduates Center at Johns
Hopkins University.
Buhs, E. S., Ladd, G. W., & Herald, S. L. (2006). Peer exclusion and victimization: Processes that mediate the relation between
peer group rejection and children’s classroom engagement and achievement? Journal of Educational Psychology, 98,
1 – 13. doi: 10.1037/0022-0663.98.1.1
Chapman, C., Laird, J., & KewalRamani, A. (2010). Trends in high school dropout and completion rates in the United States:
1972–2008. Washington, DC: National Center for Educational Statistics.
Christenson, S. L. (n.d.). Department of Educational Psychology: Sandra L. Christenson. Retrieved from http://www.
cehd.umn.edu/edpsych/people/Faculty/Christenson.html
Christenson, S. L., Reschly, A. L., & Wylie, C. (2012). Handbook of research on student engagement. New York, NY:
Springer.
Christenson, S. L., Reschly, A. L., Appleton, J. J., Berman-Young, S., Spanjers, D. M., & Varro, P. (2008). Best practices
in fostering student engagement. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology (Vol. 6, 5th ed.,
pp. 1099 – 1120). Bethesda, MD: National Association of School Psychologists.
Costello, A. B., & Osborne, J. W. (2005). Best practices in exploratory factor analysis: Four recommendations
for getting the most from your analysis. Practical Assessment, Research, and Evaluation, 10. Retrieved from
http://pareonline.net/getvn.asp?v=10&n=7
Dynarski, M., Clarke, L., Cobb, B., Finn, J., Rumberger, R., & Smink, J. (2008). Dropout prevention: A practice guide
(NCEE 2008–4025). Washington, DC: National Center for Education Evaluation and Regional Assistance.
Finlay, K. (2006). Quantifying school engagement: Research report. Denver, CO: National Center for School Engagement.
Finn, J. D. (1989). Withdrawing from school. Review of Educational Research, 59, 117 – 142. doi: 10.3102/
00346543059002117
Fredricks, J. A., Blumenfeld, P., Friedel, J., & Paris, A. (2003). School engagement. In K. A. Moore & L. H. Lippman (Eds.),
What do children need to flourish? Conceptualizing and measuring indicators of positive development (pp. 305 – 321).
New York, NY: Springer.
Fredricks, J. A., Blumenfeld, P., & Paris, A. (2004). School engagement: Potential of the concept, state of the evidence.
Review of Educational Research, 74, 59 – 109. doi: 10.3102/00346543074001059
Fredricks, J., McColskey, W., Meli, J., Montrosse, B., Mordica, J., & Mooney, K. (2011). Measuring student engagement
in upper elementary through high school: A description of 21 instruments. Greensboro, NC: Regional Educational
Laboratory Southeast.
Furlong, M. J., & Christenson, S. L. (2008). Engaging students at school and with learning: A relevant construct for all
students. Psychology in the Schools, 45, 365 – 368. doi: 10.1002/pits.20302
Furlong, M. J., Whipple, A. D., St. Jean, G., Simental, J., Soliz, A., & Punthuna, S. (2003). Multiple contexts of school
engagement: Moving toward a unifying framework for educational research and practice. California School Psychologist,
8, 99 – 113.
Gleason, P., & Dynarski, M. (2002). Do we know whom to serve? Issues in using risk factors to identify dropouts. Journal of
Education for Students Placed at Risk, 7, 25 – 41. doi: 10.1207/s15327671
Hazel, C. E., Wonner, R., & Jack, C. (2008, October). School engagement: What is it and how to get more of it. Paper
presented at the 33rd annual convention of the Colorado Society of School Psychologists, Avon, CO.
Jimerson, S., Campos, E., & Greif, J. (2003). Towards an understanding of definitions and measures of school engagement
and related terms. California School Psychologist, 8, 7 – 28.
Joreskog, K. G., & Sorbom, D. (1984). Lisrel 6 user’s guide. Mooresville, IN: Scientific Software International.
Kline, R. B. (2005). Principles and practice of structural equation modeling (2nd ed.). New York, NY: Guilford.
Kortering, L., & Braziel, P. (2008). Engaging youth in school and learning: The emerging key to school success and completion.
Psychology in the Schools, 45, 461 – 465. doi: 10.1002/pits.20309
Lewin, K. (1943/1997). Behavior and development as a function of the total situation. In D. Cartwright (Ed.), Field theory in
social science: Selected theoretical papers. Washington, DC: American Psychological Association.
National Research Council, Committee on Increasing High School Students’ Engagement and Motivation to Learn. (2004).
Engaging schools: Fostering high school students’ motivation to learn. Washington, DC: National Academies Press.
O’Farrell, S. L., Morrison, G. M., & Furlong, M. J. (2006). School engagement. In G. G. Bear & K. M. Minke (Eds.),
Children’s needs III: Development, prevention, and intervention (pp. 45 – 58). Bethesda, MD: National Association of
School Psychologists. Statistical Package for Social Sciences (Version 19.0) [Computer software]. Armonk, NY.
Statistical Package for Social Sciences (Version 19.0) [Computer software]. Armonk, NY.
Tabachnick, B. G., & Fidell, L. S. (2001). Using multivariate statistics (4th ed.). Boston, MA: Allyn & Bacon.
Watson, D. (1992). Correcting for acquiescent response bias in the absence of a balanced scale: An application to class
consciousness. Sociological Methods and Research, 23, 52 – 88. doi: 10.1177/0049124192021001003
Willis, G. (2005). Cognitive interviewing: A tool for improving questionnaire design. Thousand Oaks, CA: Sage.