March 2016
Widening gaps:
What NAPLAN tells us about student progress
Peter Goss and Julie Sonnemann
Widening gaps: what NAPLAN tells us about student progress
Grattan Institute Support
Founding Members
Grattan Institute Report No. 2016-3, March 2016
Program Support
Higher Education Program
This report was written by Dr Peter Goss, Grattan Institute School Education
Program Director, and Julie Sonnemann, School Education Fellow. Jordana
Hunter, School Education Fellow, Cameron Chisholm, Senior Associate, and
Lucy Nelson, Associate, provided extensive research assistance and made
substantial contributions to the report.
We would like to thank the members of Grattan Institute’s School Education
Program Reference Group for their helpful comments, as well as numerous
industry participants and officials for their input.
Affiliate Partners
Google
Origin Foundation
Medibank Private
Senior Affiliates
EY
PwC
The Scanlon Foundation
Wesfarmers
Affiliates
Ashurst
Corrs
Deloitte
GE ANZ
Urbis
Westpac
Grattan Institute 2016
The opinions in this report are those of the authors and do not necessarily
represent the views of Grattan Institute’s founding members, affiliates, individual
board members, reference group members or reviewers. Any remaining errors or
omissions are the responsibility of the authors.
Grattan Institute is an independent think-tank focused on Australian public
policy. Our work is independent, practical and rigorous. We aim to improve
policy outcomes by engaging with both decision-makers and the community.
For further information on the Institute’s programs, or to join our mailing list,
please go to: http://www.grattan.edu.au/
This report may be cited as:
Goss, P., Sonnemann, J., Chisholm, C., Nelson, L., 2016, Widening gaps: what NAPLAN tells us
about student progress, Grattan Institute
ISBN: 978-1-925015-82-9
All material published or otherwise created by Grattan Institute is licensed under a Creative
Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License
Widening gaps: what NAPLAN tells us about student progress
Overview
NAPLAN – Australia’s first national test of literacy and numeracy –
is a powerful tool. It allows policymakers to measure students’
achievement in core literacy and numeracy skills. It provides data
on the progress students make as they move through school.

But it is hard to compare different groups of students using the
NAPLAN scale. If students in remote areas score 40 NAPLAN
points below their inner-city peers, what does this mean? Are they
one year behind, or two? Does a 40-point gap even mean the
same thing in Year 7 as it does in Year 5 or 9? The way we
measure learning progress is vitally important. Without meaningful
comparisons, we can lose sight of how far behind some students
really are.

Our findings use a new time-based measure, ‘years of progress’,
which makes it easier to compare different groups of students.
Rather than say a group of Year 5 students scored 540 in
NAPLAN, we can say they achieved two years ahead of their
peers. This resembles the approach used in cycling road races,
where gaps between riders are measured in minutes and
seconds, not metres. Time gaps between riders are more
meaningful than distance if some are on the flat, while others are
grinding up a hill.

New analysis in this report shows that learning gaps widen
alarmingly as students move through school. By Year 9, the
spread of achievement spans eight years. NAPLAN’s minimum
standards are set too low to identify the stragglers: a Year 9
student meets the minimum standard even if they are reading
below the level of a typical Year 5 student.

Many of those falling behind have parents with low levels of
education. The gap between children of parents with low and high
education grows from 10 months in Year 3 to more than two years
by Year 9. Even if they were doing as well in Year 3,
disadvantaged students make one to two years less progress.
Bright kids in disadvantaged schools show the biggest losses.
Importantly, the learning gaps grow much larger after Year 3.
Disadvantaged students are falling further behind each year they
are at school, on our watch.

These gaps matter. Achievement in Year 9 is a strong predictor of
success in study and work later on. A good school education
helps a young person stand on their own two feet as an adult, and
the benefits ripple through future generations.
The new measure does not mean the NAPLAN scale has to
change: indeed, it relies on NAPLAN. But it does make the data
easier to interpret. It also allows policymakers to compare
students’ progress at different stages in their learning.
Policymakers can identify which groups of students are making
slow progress, and set system-wide priorities accordingly.
Policymakers should act on these findings. Student progress and
learning gaps should be put at the centre of education policy. In
light of the large spread in achievement, policymakers should give
schools better support to target teaching to each child’s needs.
And, given the very large gaps, policy leaders must work harder to
improve the progress of disadvantaged students so that every
child in every school can achieve their potential.
Key findings
Converting NAPLAN data into years of progress provides striking
insight into relative learning progress between Year 3 and Year 9.
While most findings are based on Victorian students, the patterns
of widening gaps have national relevance. 1

The spread of student achievement (Chapter 3)
The spread of student achievement more than doubles as
students move through school in Australia. The middle 60 per
cent of students in Year 3 are working within a two-and-a-half
year range. By Year 9, the spread for these students is five-and-a-half
years. The top ten per cent of students are about eight years
ahead of the bottom ten per cent.

NAPLAN national minimum standards (NMS) are set very low.
A Year 9 student can meet NMS even if they are performing
below the typical Year 5 student. They can be a stunning four
years behind their peers.

Low achieving students fall ever further back. Low achievers
in Year 3 are an extra year behind high achievers by Year 9. They
are two years eight months behind in Year 3, and three years
eight months behind by Year 9.

Educationally disadvantaged students (Chapter 4)
Students of parents with low education fall very far behind.
The gap to students whose parents have a degree is ten months
in Year 3 but two and a half years by Year 9.

Most of this learning gap develops between Year 3 and Year 9,
not before Year 3. The gap that exists in Year 3 (ten months)
triples by Year 9 (thirty months).

Even when capabilities are similar in Year 3, disadvantaged
students fall between 12 months and 21 months behind more
advantaged students by Year 9.

These patterns play out geographically. Students in low
socio-economic areas start behind, and make less progress in
school. Many regional and rural students make up to two years
less progress than students in inner city areas between Year 3 and 9.

Students who attend disadvantaged schools (Chapter 4)
Students in disadvantaged schools make around two years
less progress between Year 3 and Year 9 than similarly capable
students in high advantage schools.

Bright students in disadvantaged schools show the biggest
learning gap. High achievers in Year 3 make about two-and-a-half
years less progress by Year 9 if they attend a disadvantaged
school rather than a high advantage school. In fact, high
achievers in disadvantaged schools make less progress than low
achievers in high advantage schools over the six years.

1 Preliminary analysis suggests that most patterns in the Victorian data are
evident nationally. Data includes both government and non-government schools.
Summary of policy recommendations
Recommendation 1: Put analysis of relative student progress
and learning gaps at the centre of the policy agenda and use
it to target policy and resources more effectively

1a. Policy makers should adopt Grattan’s new ‘years of progress’
approach to better understand relative student progress and
learning gaps.
1b. Use analysis of relative student progress to inform system
priorities, resource allocation and needs-based funding
policies.
1c. Education departments should continue to link up student
data, and implement a national student identification
mechanism.

Recommendation 2: In light of the very large spread in
student achievement, implement better systematic support
for targeted teaching so that all students make good learning
progress, regardless of their starting point

2a. Strengthen system-wide policies around targeted teaching
and provide practical support, with an emphasis on giving
teachers time, tools and training.
2b. Either raise the NAPLAN national minimum standard or
remove it entirely. Lift the bar to focus on proficiency.

Recommendation 3: Given the very large gaps that open up
by Year 9, increase efforts to lift the progress of
disadvantaged students

3a. Make it a priority to increase the rate of learning progress of
educationally disadvantaged students, especially low
performers. Start early but also provide ongoing support:
• give all students at least one year of quality pre-primary
education
• target teaching from the first week of primary so students
have strong foundational skills by the end of Year 3
• continue to support progress after Year 3, providing
remedial support as early as possible
• involve various government and non-government bodies
• given new findings, do more analysis that isolates the
impact of schools to identify what works and why.
3b. Strengthen support for bright students whose parents have
low levels of education.
3c. As a priority, the Education Council should initiate and
oversee a coordinated national review of the quality and
effectiveness of school education for disadvantaged students.
Table of contents
Overview .....................................................................................................1
Key findings ................................................................................................2
Summary of policy recommendations .........................................................3
List of Figures .............................................................................................5
1 Measuring student progress is important ..............................................6
2 A new way to compare student progress using NAPLAN data ...........12
3 The spread in achievement widens dramatically as students progress through school ....................................................................................18
4 Students whose parents have low education fall very far behind ........25
5 Closing the gaps would generate big economic benefits ....................33
6 What policymakers should do .............................................................40
Glossary....................................................................................................48
Appendices ...............................................................................................50
References ...............................................................................................57
List of Figures
Figure 1: Higher performing countries have fewer low achieving and more high achieving students than Australia ................................................................ 7
Figure 2: NAPLAN scale scores suggest remote students are closing the gap, but the gap in years shows the opposite .................................................... 11
Figure 3: In cycling, it is better to estimate gaps using time rather than distance ................................................................................................................... 12
Figure 4: NAPLAN scale scores are converted to equivalent year levels along the estimated student growth trajectory ...................................................... 13
Figure 5: NAPLAN scale scores suggest the spread stays constant, but equivalent year levels show it is increasing......................................................... 19
Figure 6: Many students are performing several years ahead or behind the median for their year level ................................................................................ 19
Figure 7: NAPLAN gain scores can be misinterpreted to suggest low achievers in Year 3 start catching up ......................................................................... 21
Figure 8: In fact, low achievers fall further behind by Year 9 if equivalent year levels are used ............................................................................................. 21
Figure 9: National minimum standards are set very low .......................................................................................................................................................... 23
Figure 10: The gap between students with low and high levels of parental education grows alarmingly between Years 3 and 9 ................................................ 26
Figure 11: From the same Year 3 score, students of parents with low education make much less progress to Year 9 ......................................................... 27
Figure 12: Fewer low-SES Australian students perform at the highest levels of achievement than a decade ago ................................................................ 28
Figure 13: Students in disadvantaged schools fall very far behind between Year 3 and Year 9 ............................................................................................. 29
Figure 14: From the same Year 3 score, students in disadvantaged schools make much less progress to Year 9 ............................................................... 30
Figure 15: Inner-city students make the most learning progress ............................................................................................................................................. 32
Figure 16: Average school advantage is higher in inner-city areas ......................................................................................................................................... 32
Figure 17: New Zealand analysis shows the high cost to the state of individuals with no formal qualifications ...................................................................... 35
Figure 18: Metropolitan students gain more points between NAPLAN tests than remote students with the same starting score .......................................... 50
Figure 19: Learning curves and percentiles, numeracy ........................................................................................................................................................... 54
Figure 20: Learning curves and percentiles, reading ............................................................................................................................................................... 54
Figure 21: In a typical school, equivalent year levels show that the spread of achievement is increasing ............................................................................. 55
Figure 22: Gain scores could be interpreted to show low achievers whose parents have low education making more progress .......................................... 56
Figure 23: Our new measure shows the opposite: high achievers whose parents have high education make more progress ...................................................... 56
1 Measuring student progress is important

1.1 Literacy and numeracy matter
The literacy and numeracy skills students attain by Year 9 will
substantially affect their life outcomes. Low achievement can limit
options for further study and work later on. 2 Poor educational
results are linked with higher risks of unemployment and lower
lifetime earnings. 3

Low achievement at school can be part of a cycle of
inter-generational disadvantage. A student whose parents are poorly
educated, unemployed or of low occupational status is less likely
to do well at school, as discussed in Box 1. 4 Low achievement in
turn reduces a student’s chances of completing secondary school
and obtaining a tertiary education, and affects future employment
prospects. Down the track, adults’ own low levels of education
affect the learning outcomes of their children. The cycle goes on.

A quality education enables all individuals to improve their
socio-economic situation on the basis of merit, not circumstance. An
effective education system maximises the potential of every
student. It sets and supports high expectations for all learners.

Successful schooling is vital for low achievers, who will struggle in
life if they do not build strong educational foundations in school.
Education has been shown to impact positively on self-reported
health outcomes as well as community engagement and
likelihood of volunteering. Society also benefits from higher levels
of education, which are linked to greater tolerance towards people
of different cultures and social cohesion generally. 5

Making good progress at school is just as relevant for high
achievers. These students have potential to reach great heights,
and Australia’s ability to innovate depends on them.

Box 1: New NAPLAN findings around cycles of disadvantage
While many studies show that student achievement is strongly
related to parental education, occupation and employment status,
there is little research done in Australia using National
Assessment Program – Literacy and Numeracy (NAPLAN) data.
However, one recent study by the Australian Bureau of Statistics
(ABS) on Tasmanian students shows disturbing patterns:
• Students with no parent employed are more than twice as
likely to achieve below NAPLAN national minimum standards
as those with at least one parent employed.
• Children of workers in less skilled occupations are five to
seven times more likely to achieve below national minimum
standards than children of more highly skilled parents.
• Most Tasmanian students who achieve at or below NAPLAN
national minimum standards left school early. Four years later,
about one in three were not engaged in work or study.
Source: ABS (2014a)

2 OECD (2014a), p. 252
3 Leigh (2010); Cassells, et al. (2012); ABS (2014a)
4 ABS (2014b)
5 Grattan Institute report Norton (2012)
1.2 Our school system must do more for low and high achievers

Low parental education and other social factors can hold students
back, but a young person’s background should not determine their
future. Schools across Australia are working with parents,
communities and school systems to break the cycles of
disadvantage. Some are having real and sustained success.

Yet Australia can do a lot more to lift the performance of every
student, particularly those at the bottom and top end of student
achievement. Almost every higher performing country in the
Organisation for Economic Co-operation and Development’s
(OECD) Programme for International Student Assessment (PISA)
has fewer low achievers and more high achievers than Australia
at age 15 (see Figure 1).

Further, Australia’s performance in these areas worsened
between 2003 and 2012. Australia’s proportion of low performers
in mathematics grew by a third, while our proportion of high
achievers dropped by a quarter. 6 This trend is indicative of a
broader overall decline in Australia’s PISA track record. Since
2000 and 2003 respectively, Australia’s overall performance has
dropped by a significant 16 PISA points in literacy and 20 PISA
points in mathematics.

But change is possible. Poland is a good example of where
impressive progress has been made. Between 2003 and 2012,
Poland increased the proportion of high performers in PISA
mathematics, reduced the proportion of low performers, and
increased its average by 27 points. 7

6 See OECD (2014a) p. 70 Figure I.2.23.
7 Ibid. Poland now outperforms Australia in maths, and is on par in literacy.
Figure 1: Higher performing countries have fewer low achieving
and more high achieving students than Australia
Percentage of students by PISA proficiency level, mathematics, 2012
[Chart not reproduced: paired bars showing the share of low achievers and
high achievers in each country, ordered from Shanghai-China to New Zealand.]
Notes: Countries ordered by mean score in PISA in maths 2012. Low proficiency below
level 1, level 1 and level 2, and high proficiency levels 5 and 6 in PISA.
Source: OECD (2014b), Table I.2.1a and Table I.2.3a
1.3 Success at school is all about learning progress

The best way to improve achievement is to focus on individual
learning progress. 8 Understanding student learning growth, not
just achievement, is important. 9 Student progress measures tell
us how much students improve from one year to the next.
Students who fall behind will never start to close the gap unless
their rate of learning accelerates.

When policymakers can track student progress they can see
which groups of students are thriving and which are struggling. It
helps them answer relative questions such as:
• Are low achieving students catching up to their peers?
• Are high achievers being stretched enough?
• How do different groups of students progress with time, for
example, with different family circumstances, gender, or
geographical location?
• Is progress different in different types of schools, and at
specific stages of schooling?

Comparing how some students progress relative to others is an
important lens. Without relative comparisons, we can lose sight of
how far behind some students are. All students operate on the
same playing field for further study and work once they leave
school. Those who fail to reach their potential can miss out on
important opportunities in life.

8 See Grattan Institute’s report Targeted Teaching, Goss, et al. (2015).
9 See Grattan Institute’s report Student Progress, Jensen (2010).
Measuring student progress is important as it enables
policymakers to see how students are progressing across the
system. This data should influence how priorities are set, and
where resources are allocated. Those who are making the least
progress, or those who are failing to reach their potential, should
be the focus of our policy efforts.
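Questions like these come down to comparing progress across subgroups once individual test records are linked over time. As a minimal sketch of that kind of tabulation (the subgroup labels and scores below are invented for illustration, not the actual NAPLAN data format):

```python
from collections import defaultdict

# Invented linked records: (subgroup, Year 3 score, Year 9 score).
records = [
    ("parents_degree", 450.0, 600.0),
    ("parents_degree", 480.0, 620.0),
    ("parents_year11", 420.0, 530.0),
    ("parents_year11", 440.0, 545.0),
]

# Average Year 3 -> Year 9 gain by subgroup: the raw ingredient for the
# relative comparisons above (before any conversion to years of progress).
gains = defaultdict(list)
for group, y3, y9 in records:
    gains[group].append(y9 - y3)

for group, g in sorted(gains.items()):
    print(group, sum(g) / len(g))
```

As the report goes on to argue, raw point gains like these still need careful interpretation, because a point of gain does not mean the same amount of learning at every starting score.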
1.4 NAPLAN is a great first step towards analysing progress

NAPLAN data opens up unprecedented opportunities to
understand student progress. It is the first national longitudinal
comprehensive dataset of its kind in Australia, and one of the few
in the world.

Since 2014, NAPLAN data has become available for full cohorts
of students who have completed all four tests: Year 3, Year 5,
Year 7 and Year 9. 10 We can now track how students perform
from Year 3 to Year 9 to assess whether their progress is
adequate given early indications of potential.

A key feature of NAPLAN is that scores can be compared across
tests sat in Years 3, 5, 7 and 9, and over time, through a common
scale. For example, a student who took the Year 5 NAPLAN
reading test in 2012 and scored 500 is assumed to be reading at
the equivalent level 12 of a student who took the Year 7 reading
test in 2013 and received the same scale score. 13

NAPLAN does not test everything, but the things it does test
matter. A study by the ABS shows that NAPLAN scores in Year 9
are a strong predictor of high school completion as well as
success after school in study and work. 14 This means NAPLAN
data can now be used to identify certain groups of students who
are struggling early on in school, before low performance
becomes entrenched.

Importantly, NAPLAN can also help show which policies and
practices are working, and whether system settings are right.

Box 2: What is NAPLAN?
In 2008 the National Assessment Program – Literacy and
Numeracy (NAPLAN) was introduced as an annual test for Year
3, 5, 7 and 9 students. Testing covers four domains: Reading,
Writing, Language Conventions (spelling, grammar and
punctuation) and Numeracy. 11 It provides a standardised measure
of student achievement around the country.

The test provides each student with a NAPLAN ‘scale score’,
which is an estimate of student ability at a given point in time.
Scale scores typically range from 0 to 1000, and are organised
into 10 NAPLAN proficiency bands.

From 2017, NAPLAN Online will be introduced. It will include
adaptive ‘branch testing’, where the difficulty of questions is
adjusted depending on whether students are struggling or
under-challenged. Through more precise testing, this feature helps elicit
more accurate information on what students can do. The results
of NAPLAN Online will also be available to teachers sooner after
the test.

For policymakers and researchers, one of the big benefits of
NAPLAN Online is that measurement error will decrease,
especially for low performing and high performing students. The
adaptive testing process means that more of the questions
students face will be at an appropriate level. Currently, most
questions in NAPLAN are aimed at the middle, rather than the top
or bottom.

1.5 But it is not easy to compare progress with existing NAPLAN measures

While NAPLAN provides invaluable data, something has been lost
in policy discussions around student progress using NAPLAN
scale scores. If remote Year 7 students are 40 NAPLAN points
behind their metropolitan peers, what does this actually mean?
Are they one year behind, or two years, or more? And does 40
points behind at Year 7 mean the same thing as at Year 5 or 9?
This is not a technical quibble: without meaningful comparisons,
we lose sight of how far behind some students really are.

10 There are now two full sets of NAPLAN student cohort data, for students who
completed Year 3 to Year 9 between 2008-14 and 2009-15.
11 ACARA (2013b)
12 That is, the students are demonstrating equivalent skills in the areas tested by
NAPLAN. Whenever we talk about student achievement in this report it is in
reference to the skills tested in NAPLAN.
13 This assumption becomes more problematic for very high or very low scores,
since the number of relevant questions is small and measurement error is high.
14 ABS (2014a); see also Houng and Justman (2014)
It would be easy to make these comparisons if students gained
NAPLAN scores at a steady pace as they moved through school.
But they do not. The Australian Curriculum, Assessment and
Reporting Authority (ACARA) notes that:

    students generally show greater gains in literacy and
    numeracy in the earlier years than in the later years of
    schooling, and that students who start with lower NAPLAN
    scores tend to make greater gains over time than those who
    start with higher NAPLAN scores. 15

NAPLAN is a very sophisticated testing system, yet this non-linear
growth curve makes it hard to compare gaps between different
groups of students, or their learning progress. It is especially
difficult to compare students of different backgrounds, who are
likely to be at very different scores on the curve (in other words, at
different stages of their learning), even though they are the same
age and in the same year level.

1.6 NAPLAN gain scores do not show the full picture

‘Gain scores’ are the difference in NAPLAN scale scores between
two points in time. They measure student progress in NAPLAN
points, but need to be interpreted very carefully. 16

In particular, gain scores have limitations when policymakers want
to compare different groups of students from different starting
points (i.e. answer questions of relative progress). In these cases,
a face-value interpretation of gain scores can suggest students
are catching up when they are actually falling further behind. 17

The challenges can be illustrated using a real example, by
comparing the progress of kids from the bush with kids from the
city. Figure 2 shows two charts with identical data comparing the
progress of remote and metropolitan students between Year 3
and Year 9. The chart on the left hand side shows the gap in gain
scores; the chart on the right hand side shows the gap in time.

In NAPLAN points, the gap between remote students and metro
students decreases with time, from 56 NAPLAN points in Year 3
to 38 points in Year 9. Looked at another way, as shown in the
table at the bottom of Figure 2, remote students make larger gains
in NAPLAN between Year 3 and Year 9 (+185 points) than
metropolitan students (+168 points).

But this should not be misinterpreted to mean that remote
students are catching up to metropolitan students in a broader
learning sense. Looking at the gap in years and months of
learning (right hand side), it is clear that this gap gets wider over
time. Remote students are 1 year 3 months behind in Year 5, and
this gap grows to 2 years behind by Year 9. They are falling
further behind. 18

15 ACARA (2016a) p.5.
16 For example, a group of students with a 30 point gain over two years could be
falling relative to their peers (their percentile ranking in the population
decreasing), keeping pace (percentile ranking steady) or advancing (percentile
ranking increasing). Without knowing the NAPLAN starting score it is impossible
to know. The Victorian Curriculum and Assessment Authority (VCAA) ‘relative
growth measure’ is designed to address just this issue (VCAA (2012)). For a
description of the various measures used in NAPLAN data see Appendix 2.
17 ‘Face-value interpretation’ refers to an interpretation that more gain points
means better learning progress in a broader sense than NAPLAN points.
18 This interpretation can be confirmed by looking at the average gain scores for
remote and metropolitan students with the same Year 3 scores. Metro students
consistently gain more points between successive NAPLAN tests than
comparably capable remote students, whatever the starting score (chart
provided in Appendix 1).
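The pitfall of taking gain scores at face value can be made concrete with a toy calculation. All numbers below are invented for illustration; the point is only that on a concave growth curve of the kind ACARA describes, a larger point gain can coexist with a widening gap in time terms:

```python
# Invented concave growth curve: hypothetical median scale scores at the
# four tested year levels (bigger gains early, smaller gains later).
YEAR_LEVELS   = [3.0, 5.0, 7.0, 9.0]
MEDIAN_SCORES = [420.0, 500.0, 550.0, 580.0]

def eyl(score):
    # Equivalent year level of a score, by linear interpolation between
    # tested year levels (which are spaced two years apart).
    for i in range(len(MEDIAN_SCORES) - 1):
        lo, hi = MEDIAN_SCORES[i], MEDIAN_SCORES[i + 1]
        if lo <= score <= hi:
            return YEAR_LEVELS[i] + (score - lo) / (hi - lo) * 2.0
    raise ValueError("score outside the illustrative curve")

# Group A starts lower and gains MORE points than group B...
a_y3, a_y9 = 440.0, 560.0   # gain of +120 scale points
b_y3, b_y9 = 470.0, 578.0   # gain of +108 scale points
assert (a_y9 - a_y3) > (b_y9 - b_y3)

# ...yet in time terms, A falls further behind B between Year 3 and Year 9.
gap_y3 = eyl(b_y3) - eyl(a_y3)   # 0.75 equivalent years behind
gap_y9 = eyl(b_y9) - eyl(a_y9)   # 1.2 equivalent years behind
print(round(gap_y3, 2), round(gap_y9, 2))
```

This is the same pattern as the remote/metro comparison in Figure 2: the group on the flatter part of the curve collects more points per year, so its larger gain score masks a gap that is widening once gains are read against the terrain.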
Figure 2: NAPLAN scale scores suggest remote students are
closing the gap, but the gap in years shows the opposite
NAPLAN scale scores, reading, Australian students, 2014
[Charts not reproduced: the left panel shows the metropolitan–remote gap in
scale points narrowing from 56 points in Year 3 to 47, 40 and then 38 points
by Year 9; the right panel shows the gap in time widening from 1 year 3
months in Year 5 to 1 year 6 months in Year 7 and 2 years in Year 9.]

Gain scores   Yr 3−5   Yr 5−7   Yr 7−9   Yr 3−9
Metro         +77      +51      +40      +168
Remote        +86      +58      +41      +185

Notes: points on both charts are identical.
Source: Grattan analysis of ACARA (2014b)

1.7 How this report is structured

This chapter emphasises why a new measure of relative student
progress in NAPLAN is needed, and how this can change what
we see in the results.

Chapter 2 proposes a new way to use NAPLAN data to compare
the progress made by very different groups of students.

In Chapters 3 and 4, the new approach is applied to the data,
revealing a striking picture of student performance. There is a
remarkably wide spread of achievement in every year level, and a
learning gap that widens between Year 3 and Year 9. Students
whose parents have low levels of education are much further
behind than most people may realise.

Chapter 5 discusses the loss to individuals and the economy from
the dramatic learning gaps that open up between Years 3 and 9.

Chapter 6 summarises our policy recommendations.
2 A new way to compare student progress using NAPLAN data
This chapter establishes a new time-based measure, years of
progress, to compare relative student performance. The measure
estimates what a year of learning progress looks like on the
NAPLAN scale.
2.1 It’s time that matters most, not distance

Imagine a cycling road race. To gauge the gap between a rider
and the main pack, we talk about minutes and seconds, not
distance. That’s because while a gap of 100 metres might not look
like much, it really depends on the terrain. On the flat it might take
10 seconds; on a hill it might take 30 seconds (see Figure 3).

Figure 3: In cycling, it is better to estimate gaps using time rather
than distance
[Illustration not reproduced here: the same 100-metre gap takes 10 seconds to close on the flat, and longer on a hill.]

Differences in NAPLAN scores are like a measure of distance
rather than time. This would not matter if growth along the
NAPLAN scale was steady. But it is not. For example, it typically
takes less time to go from a score of 400 to 450 in NAPLAN than
from a score of 550 to 600. It is as though the NAPLAN ‘road’ gets
steeper as students learn more. 19

To extend the cycling analogy: students at low achievement levels
in NAPLAN are on the flat and riding fast (big gain scores), while
those at high achievement levels are on a steep hill and riding
slowly (small gain scores). But riding faster on a flatter road does
not necessarily mean riding better. When those on the flat hit the
hills, they too will slow down. So distance alone does not tell us
how well a rider is really doing; more information is needed.

This non-linearity makes it hard to compare the relative progress
of different groups of students, since both gain scores (speed)
and prior NAPLAN scores (terrain) need to be taken into account.

2.2 Benchmarking progress to the typical student
To address this limitation we create a new measure, years of
progress, which benchmarks student performance in NAPLAN to
the typical student. It allows us to see if students are catching up
or falling further behind relative to others. 20
For example, instead of saying that a group of Year 5 students
are achieving at a NAPLAN score of 540, we can now say they
are achieving in Year 5 what the typical student would achieve in
19 This pattern is consistent across NAPLAN domains, year levels and for
students from different backgrounds.
20 To limit the effect of measurement error, both this discussion and our
proposed methodology focus on large groups of students rather than individual
students.
Year 7. In other words, they are two years in front of the typical
Year 5 student.

To create the new approach, we use national NAPLAN data to
estimate the growth trajectory of the typical student (Figure 4). 21
NAPLAN scale scores are mapped onto the typical student’s
growth pathway across the schooling years. Based on this curve,
we define a first measure:

1. Equivalent year level (EYL): the year level in which the typical
student would be expected to achieve a given NAPLAN score.

By comparing two NAPLAN scores in this way, we can deduce a
second measure:

2. Years of progress: the years and months of learning it would
take the typical student to move from one NAPLAN score to
another. It estimates the difference between equivalent year
levels at two different points in time.

Table 1 summarises how our two new measures relate to existing
NAPLAN measures. 22 More detail on the methodology and
assumptions is described in the Technical Report.

Figure 4: NAPLAN scale scores are converted to equivalent year
levels along the estimated student growth trajectory
Estimated median NAPLAN scale score (NSS) by year level, numeracy,
Australian students, 2014. [Chart not reproduced here: the growth curve passes through median scores of 402 in Year 3, 489 in Year 5, 540 in Year 7 and 585 in Year 9.]
Source: Grattan analysis of ACARA (2014b), national data.
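The conversion from scale score to equivalent year level can be sketched as a lookup along the growth curve. The sketch below uses the observed numeracy medians from Figure 4 (402, 489, 540 and 585 scale points at Years 3, 5, 7 and 9) with simple linear interpolation between anchor points; the report itself smooths the curve and extends it beyond Years 3 to 9 by regression (see the Technical Report), so this is an illustration of the idea rather than the published methodology.

```python
import numpy as np

# Median numeracy scale scores at each tested year level (Figure 4).
ANCHOR_YEARS = [3, 5, 7, 9]
ANCHOR_SCORES = [402, 489, 540, 585]

def equivalent_year_level(scale_score):
    """Map a NAPLAN scale score to an equivalent year level (EYL).

    Linear interpolation between the observed medians; the report
    smooths the curve and uses a regression outside Years 3 to 9.
    """
    return float(np.interp(scale_score, ANCHOR_SCORES, ANCHOR_YEARS))

# A group scoring 540 sits at the Year 7 median: if they are in Year 5,
# they are achieving what the typical student achieves in Year 7.
print(equivalent_year_level(540))  # 7.0
print(equivalent_year_level(489))  # 5.0
```

Note that `np.interp` clamps scores outside the 402–585 range to Years 3 and 9, whereas the report’s regression-based extension extrapolates below Year 3 and above Year 9.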
Table 1: Two new NAPLAN measures proposed in this report

Concept       NAPLAN measure                       Proposed new measure
Achievement   Scale score: the NAPLAN score a      Equivalent year level (EYL): the year
              student receives in a given test     level at which a typical student would be
                                                   expected to achieve a given scale score
Progress      Gain score: the difference in        Years of progress: the difference in
              NAPLAN scores between two            years and months between equivalent
              points in time                       year levels across two points in time

21 The curve is anchored using the observed median student achievement in
Years 3, 5, 7 and 9, and smoothed between these points. Outside Years 3 to 9,
the curve is estimated using a regression based on students who fall outside the
median Year 3 and Year 9 scores. Data from all states and territories is used.
See the Technical Report for details: Goss and Chisholm (2016).
22 Further information is available in the Appendices on: the conversion of
NAPLAN scale points to EYL (Appendix 3); cut points for NSS and EYL
(Appendix 4); and the observed learning curve and percentiles (Appendix 5).
Examples of new interpretations now possible

Four illustrative examples of how the new measures change our
interpretation of NAPLAN are shown in Table 2. It shows the
progress of four groups of students between Years 5 and 7, and
how simply their performance can now be compared.
Group A made the best progress, making three years of learning
progress between Years 5 and 7. Next is Group D, which made
two years and three months of progress over the same time
period. This group made better-than-average progress from a
lower base and partly closed the gap to its peers. Group B is
next in order: they are well behind but kept pace with their
cohort, making two years of progress over the period. Group C
made the worst progress, at only one year and eight months, and
dropped further behind its cohort.
Table 2: Four illustrative examples showing who makes the most
progress between Years 5 and 7 using years of progress

Group   Equivalent year level   Equivalent year level   Years of progress
        in Year 5               in Year 7               from Year 5 to 7
A       Year 5                  Year 8                  +3Y 0m
B       Year 3                  Year 5                  +2Y 0m
C       Year 4 month 8          Year 6 month 4          +1Y 8m
D       Year 4 month 1          Year 6 month 4          +2Y 3m
Note: years of progress is defined in terms of years and months. For example +3Y 0m
refers to three years and zero months of progress.
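Expressed in months, the years-of-progress calculation behind Table 2 is just a subtraction of equivalent year levels. A minimal sketch (the tuple representation and helper name are ours, not the report’s):

```python
def years_of_progress(eyl_start, eyl_end):
    """Difference between two equivalent year levels, each given as a
    (year, month) tuple, returned as a '+xY ym' string as in Table 2."""
    months = (eyl_end[0] * 12 + eyl_end[1]) - (eyl_start[0] * 12 + eyl_start[1])
    return f"+{months // 12}Y {months % 12}m"

# The four illustrative groups from Table 2.
print(years_of_progress((5, 0), (8, 0)))  # +3Y 0m  (Group A)
print(years_of_progress((3, 0), (5, 0)))  # +2Y 0m  (Group B)
print(years_of_progress((4, 8), (6, 4)))  # +1Y 8m  (Group C)
print(years_of_progress((4, 1), (6, 4)))  # +2Y 3m  (Group D)
```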
By benchmarking all groups to the typical student we can
compare how well they are progressing relative to each other.
This is the case even when they have different starting points.
The new measure shows us observed progress, without trying to
account for different characteristics that might influence expected
rates of learning. It helps us see ‘what is’ more clearly.
2.3 Benefits of the new approach
Because growth is not linear in NAPLAN scores, gain scores
cannot be directly used to compare the relative learning progress
of different groups of students. This is true especially of student
groups who are at different parts of the growth curve.
Several mechanisms have been developed to avoid this limitation.
For example, My School allows comparisons of school-level gain
scores for students with the same starting scores, as well as
comparisons to schools with similar students. 23 The Victorian
Curriculum and Assessment Authority (VCAA) created a relative
growth measure that restricts gain score comparisons to students
with the same starting scores. 24
Our approach removes this restriction. As noted above, the years
of progress measure allows comparison of relative progress from
different starting scores. This is especially valuable for
policymakers involved in resource allocation or balancing priorities
across the system.
In addition, the new measure shows the rate of progress and the
scale of gaps in a way that is intuitive and tangible. Understanding
what a ‘year of progress’ looks like is accessible to policymakers
when setting system priorities.
23 See www.myschool.edu.au.
24 VCAA (2012). An explanation of relative growth measures and other NAPLAN
reporting measures is included in Appendix 2.
Lastly, our metrics are based on the growth curve of the typical
student using national data. When state-level data on student
progress is benchmarked against this, it provides insights into
how progress at the state level compares to national trends.
2.4 How new is our proposed approach?
The concept of comparing student performance in years and
months of learning is not new. Internationally, the OECD
expresses differences in PISA scores in years and months of
schooling when comparing the performance of different groups of
students, states and territories and different countries. 25 While this
measure allows meaningful comparisons at age 15, it does not
show the progress of the same students over time, which is now
possible with NAPLAN data.
The NSW Centre for Education Statistics and Evaluation (CESE)
has introduced a value-added modelling technique using years
and months of learning gain. This measure accounts for nonlinearity in NAPLAN scores but is used for the purpose of
understanding school effectiveness – a different focus to the new
measures suggested here. 26
There is also an established concept of ‘grade equivalent scales’,
which has similarities to our equivalent year level metric. 27 Our
method builds on this approach but uses different statistical
techniques to improve the accuracy of estimates, particularly in
how we estimate equivalent year levels below Year 3 and above
Year 9. 28 Importantly, we apply our measure in a way designed to
avoid the key limitations of grade equivalent scales; in particular
we avoid comparisons at the individual student level. 29
2.5 Don’t change the NAPLAN scale
Our proposed new measures should not be taken to imply that the
NAPLAN scale is wrong or should be changed – indeed, our
approach would not be possible without it. NAPLAN scale scores
have been developed using the Rasch model, an advanced
psychometric model for estimating a student’s skill level. Our
measure simply builds on the existing scale to make it easier to
analyse relative student progress.
2.6 Limitations of this approach
Trade-offs have been made in the design of the proposed
measures between statistical purity, ease-of-use and the benefit
of being able to compare the progress of groups at very different
stages of learning. The metrics should only be used to analyse
large groups of students and should avoid extreme scores. When
25 See Thomson, et al. (2013) for examples of how the OECD uses years and
months of learning, p. xvii. The OECD estimate differs from our new measure,
as it uses a different statistical technique to calculate years and months of
learning, and does not reference progress to a common benchmark. Another
international body using a similar concept is the UK Learning Toolkit, which
transforms effect sizes of successful interventions into years and months of
learning; see Education Endowment Foundation (2016).
26 They produce an estimate of years and months of learning for the difference
between the 10th and 90th percentile school. CESE (2014), p. 29.
27 Grade (or age) equivalent scales are derived by calculating the mean or
median score for each given grade, and then interpolating the year-and-month
values between. Angoff (1984)
28 To the best of our knowledge, it is the first time this concept has been applied
to a vertically equated scale such as NAPLAN.
29 Angoff (1984); Sullivan, et al. (2014); Pearson (2016). The limitations of grade
equivalent scales are most pronounced for the comparison of individual
students.
used this way, we consider the new metrics are sufficiently robust
for informing policy decisions.
Of course, the new measures have limitations. A key area for
further development is refining the estimation of equivalent year
levels before Year 3 and after Year 9. 30 NAPLAN does not test
students outside these years. Our EYL estimates are based on a
very large amount of observed data, but caution must be taken in
interpreting results outside these ranges, for the reasons
discussed in Box 3. 31 But on balance, extending the EYL scale to
cover most of the observed range of student achievement makes
the approach much more useful to policymakers.
This new measure should not be used to make high-stakes
decisions for individual students (e.g., placement into a remedial
or accelerated class) or teachers (e.g., promotion). In part this is
because measurement error in NAPLAN scores is high for
individuals and small groups. Linking the measure to high-stakes
decisions could also increase the likelihood of ‘teaching to the
test’ and other adverse outcomes. 32
Without further testing of the measure, we would also caution
against using this measure to directly compare school progress. It
is designed primarily to compare progress at a system level.
In this report we only analyse data at a group level using large
groups. We have taken further precautions to limit the impact of
30 Restricting results to students who score between EYL 3 and EYL 9
would severely limit analysis.
31 The EYL scale below Year 3 is estimated from the observed progress to Year
5 of students who were below the median in Year 3. By definition, this is half the
Year 3 population. Above Year 9, the EYL scale is estimated from the observed
progress to Year 9 of above-median Year 7 students; again, half the population.
32 See Box 5 in Goss, et al. (2015), p. 39.
measurement error. 33 This report primarily examines numeracy
results, but reading results display similar patterns. All findings
(including a full set of charts for reading and numeracy and the
associated 99 per cent confidence intervals) can be downloaded
from the Grattan website.
The new measures should be part of a suite of metrics
Our new measures should be used as part of a suite of metrics
answering a range of important questions on student
performance. Data on student progress should be considered
from multiple angles when setting system priorities.
The new measures help answer relative questions on student
performance. While these relative questions are critical, they are
not the only questions that matter.
For example, to understand the impact of school quality on
student outcomes – a very important but different question – it is
necessary to look at value-added measures. Value-added
measures help to identify the impact of the school which is useful
in understanding which interventions are working well and why. 34
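Footnote 34 describes the idea behind value-added measures: compare each student’s progress to that of students with the same starting point, while controlling for socio-economic factors. A stylised sketch of one common implementation (regression residuals, averaged by school) is below; the data is synthetic and the single SES control and variable names are illustrative assumptions, not the specification used by any particular system.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic, illustrative data only: prior score, an SES index,
# a school identifier and a later score for 1,000 students.
n = 1000
prior = rng.normal(500, 70, n)
ses = rng.normal(0, 1, n)
school = rng.integers(0, 20, n)
later = 150 + 0.8 * prior + 15 * ses + rng.normal(0, 30, n)

# Regress the later score on prior achievement and SES (with intercept).
X = np.column_stack([np.ones(n), prior, ses])
beta, *_ = np.linalg.lstsq(X, later, rcond=None)

# A student's residual is the progress beyond what students with the
# same starting point and SES typically make.
residual = later - X @ beta

# Averaging residuals by school gives a crude school-level value-added estimate.
school_va = {s: residual[school == s].mean() for s in range(20)}
```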
33 We compare EYL and progress for large sub-groups of students, and avoid
calculating statistics for extreme scores where measurement error is likely to be
high. For most of our estimates the 99 per cent confidence interval is between 3
and 10 NAPLAN points, and less than 6 months in equivalent year levels.
34 Value-added measures compare the progress each student makes relative to
all other students with the same initial level of achievement, while controlling for
socio-economic factors.
Box 3: Interpreting equivalent year levels
Our equivalent year level (EYL) metric needs to be interpreted
carefully, especially in relation to the school curriculum. For instance,
a group of Year 5 students at EYL 9Y 0m in numeracy are not
necessarily ready to solve mathematics equations aimed at a Year 9
student; they may not have been exposed to concepts that need to
come first. 35
However, on the tasks tested in NAPLAN, these Year 5 students
demonstrate comparable numeracy skills to the median Year 9
student. Further testing may be appropriate to see whether they
need to be stretched in their learning.
Interpreting equivalent year levels below Year 3 requires care.
NAPLAN tests are not designed for students below Year 3. Yet,
many students in Year 1 do have reading and numeracy skills that
are comparable to the expectations for Year 3 students. 36 Likewise,
many Year 3 students demonstrate reading and numeracy skills at a
Year 1 level.
Interpreting equivalent year levels above Year 9 is even more
challenging. A student who is two years ahead of the NAPLAN Year
9 median in numeracy is said to be at EYL 11Y 0m. 37 This does
not necessarily mean the typical Year 9 student will reach this skill
level in Year 11. Students choose specialised subjects in senior
secondary school, and teaching may focus more on specific content
than general literacy and numeracy skills.
In a technical sense, it is more precise to say EYL 9 + 2Y 0m rather
than EYL 11Y 0m. But this notation is cumbersome and confusing,
and so has not been used in this report. Instead, we report
equivalent year levels directly in years and months between EYL 1
and EYL 12.
In addition, equivalent year levels are more limited in assessing the
performance of high achieving students and high performing schools.
Because NAPLAN testing stops at Year 9, there is no reference for
how well high achieving students are progressing in higher year
levels at present. Within our methodology, we can still compare high
performing students to an estimate of how the typical student would
perform. However, we also suggest that international test data (such
as PISA) is used when assessing the performance of high achieving
students.
35 Likewise in reading, Year 5 students at EYL 9 have much stronger reading
skills than the median Year 5 or even Year 7 student, but may not be ready for
the concepts in a book aimed at Year 9 students.
36 See, for example, Figure 6 and Box 1 in Goss, et al. (2015), pp. 27-28.
Students’ counting abilities were tested using the Mathematics Assessment
Interview, a one-on-one test administered by their teachers. About one quarter of
the Year 1 students (16 out of 66) already demonstrated counting skills at or
above the level of skills typically taught in Year 3.
37 Conceptually, this is estimated by analysing the typical progress of students
who achieved the median Year 9 score when they were in Year 7.
The following chapters analyse relative student progress in
Australia and Victoria using the equivalent year level and years of
progress measures. Chapters 3 and 4 set out the findings
revealed by the new approach, and Chapter 5 discusses their
economic implications.
3 The spread in achievement widens dramatically as students progress through school
This chapter shows the spread in student achievement across a
given year level, and how the spread changes from the first time
students sit NAPLAN in Year 3 to the time they sit it in Year 9. 38
Our equivalent year level measure shows that the spread widens
dramatically after Year 3, suggesting that certain students are
falling further behind as they progress through school. It paints a
very different picture to the one we see using NAPLAN scale
scores.
Understanding the spread in student achievement is important for
policymakers. It directly affects the work of every teacher.
Teaching a class full of students who are at different stages in
their learning is inherently difficult. A large spread makes the
challenge greater.
An increasing spread also has implications for learning. As
students move through school, some fall very far behind. Effective
learning involves ideas and concepts that build on one another.
Early delays in foundational literacy and numeracy skills can
affect the ability to catch up later on. Our findings show there are
real dangers for students who fall behind in their early years at
school. Most will never catch up without effective targeted
teaching or specific remedial support that accelerates their
learning.
3.1 The achievement spread widens during schooling

Two different ways to measure the spread in student achievement
are contrasted in Figure 5. Using NAPLAN scale scores (the chart
on the left hand side), the spread remains relatively constant at
each year level shown after Year 3. This holds true for students
achieving in the middle 60 per cent of results (i.e. between the
20th and 80th percentiles), as well as for students achieving in the
middle 80 per cent of results (between the 10th and 90th
percentiles).

A different picture emerges using our new measure of equivalent
year levels (the chart on the right hand side). On this measure,
the spread actually widens after Year 3. In fact, the spread for the
middle 60 per cent of students more than doubles between Year 3
and Year 9, from 2 years 5 months to 5 and a half years. 39 We
estimate that by the time they reach Year 9, the top 10 per cent of
students are around eight years ahead of the bottom 10 per cent.

These findings refer to the spread in achievement across all
students in the Victorian population. When data is analysed at the
school level, the spread is only slightly smaller. In a typical school,
the spread in Year 9 is around seven years. 40 This presents an
extremely challenging task for any teacher.

38 Two different datasets are used in this chapter. National data (2014) is used to
analyse student spread. Victorian linked data is used to analyse student
progress. It is a linked dataset that allows us to track the results of each student
from 2009 to 2015. Victorian data is compared to a national growth curve to
provide insight on how Victoria compares to other states and territories.
39 In numeracy, the spread in equivalent year levels between the 20th and 80th
percentile student is 2Y 5m in Year 3; 3Y 8m in Year 5; 5Y 2m in Year 7; and 5Y
6m in Year 9. A similar pattern, of a greatly widening spread of learning, is also
seen when we translate NAPLAN reading data into equivalent year levels.
40 This estimate is based on Grattan analysis of ACARA (2014b). It is based on
students in the 10th and 90th percentiles for a typical school; see Appendix 6.
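The spreads quoted in footnote 39 are differences between the equivalent year levels of the 20th and 80th percentile students. The sketch below reuses linear interpolation along the numeracy medians from Figure 4; the percentile scores passed in at the end are hypothetical placeholders for illustration, not the actual 2014 percentiles.

```python
import numpy as np

# Median numeracy scale scores at Years 3, 5, 7 and 9 (Figure 4).
ANCHOR_YEARS = [3, 5, 7, 9]
ANCHOR_SCORES = [402, 489, 540, 585]

def spread_in_months(p20_score, p80_score):
    """EYL gap between the 80th and 20th percentile scores, in months
    of learning, via linear interpolation along the growth curve."""
    eyl = lambda s: np.interp(s, ANCHOR_SCORES, ANCHOR_YEARS)
    return round((eyl(p80_score) - eyl(p20_score)) * 12)

# Sanity check: the Year 5 and Year 7 medians are two years apart.
print(spread_in_months(489, 540))  # 24

# Hypothetical percentile scores, for illustration only.
print(spread_in_months(450, 530))
```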
Figure 5: NAPLAN scale scores suggest the spread stays constant,
but equivalent year levels shows it is increasing
Achievement spread by actual year level, numeracy, Australian students,
2014. [Paired charts of the 10th, 20th, 50th, 80th and 90th percentiles, in NAPLAN scale scores and in equivalent year levels, Years 3 to 9, are not reproduced here.]
Notes: Data includes all Australian students who sat NAPLAN numeracy tests in 2014.
The top ten per cent in Year 9 are above equivalent year level 12 and are not shown on
this chart. Results at the 10th and 90th percentiles are subject to higher measurement error.
Source: Grattan analysis of ACARA (2014b).

Figure 6: Many students are performing several years ahead or
behind the median for their year level
Equivalent year level grouping, numeracy, Australian students, 2014. [Distribution charts for Years 3, 5, 7 and 9 are not reproduced here.]
Notes: Data includes all Australian students who sat NAPLAN numeracy tests in 2014. We
account for measurement error associated with students who did not sit the NAPLAN tests.
Source: Grattan analysis of ACARA (2014b).

Figure 6 shows that many students are performing several years
ahead or behind the average group in their year level. The
proportion of students performing far away from the median group
increases each year level after Year 3 (i.e. the shape of the
distribution flattens with time). In Year 3, approximately 10 per
cent of students are at least 3 years above or below the median
group. By Year 9, around 45 per cent of students are at least 3
years above or below.
A large spread is not only difficult for low achievers, but high
achievers as well. They are unlikely to be challenged by the
standard tasks for their year level, which are years below their
capability. Targeted teaching is vital to keep pushing them to the
next stage in their learning (see Box 4).
Box 4: Targeted teaching is vital given the increasing spread

This report uses NAPLAN data to show just how wide the spread
in achievement is at any given year level. Given this spread, it is
very important that teachers understand what level students are at
in their learning and how they can tailor teaching to their needs.
To do this, teachers need more accurate and timely data about
what each student knows and is ready to learn next. This data
must then be used: it has no impact unless teachers change their
classroom practice.

While NAPLAN data can tell us about the spread of achievement,
a child only sits the NAPLAN test every two years, so it is no
substitute for regular in-class evaluation. Teachers need to adapt
their teaching to student needs from week to week.

Using data to meet each student at their point of need is targeted
teaching – the subject of our last school education report.
Targeted teaching benefits all students, especially those students
working well outside year-level expectations. High performing
students get stretched. Struggling students get supported.

Source: Goss et al. (2015) Targeted Teaching

3.2 Low achievers fall more than three years behind

Our findings show that the spread is much greater after Year 3,
suggesting that some students are falling very far behind while
others are very far ahead. We test this by analysing groups of
Victorian students who sat NAPLAN in Years 3, 5, 7 and 9 over
2009-2015. 41 The progress of high and low achievers is compared
between Year 3 and Year 9. 42 Again, the findings under our new
approach are contrasted with the outcomes suggested by a
face-value interpretation of NAPLAN gain scores.

NAPLAN gain scores (see Figure 7) suggest the gap between
high and low achievers narrows between Years 3 and 9. Gain
scores are larger for low achievers (+211 points) than for median
achievers (+182 points) and high achievers (+156 points).
Taken at face value, this suggests that low achievers make better
learning progress during this period than high achievers.
By contrast, our equivalent year level measure (Figure 8) tells a
different story. The gap does not narrow; it widens with time.
Between Year 3 and Year 9, students with a low score in Year 3
fall an extra year behind the top students. They are two years
eight months behind in Year 3, and three years eight months
behind by Year 9, making one year less relative progress over the
same period.
41 The Victorian data is compared to a national estimated learning trajectory to
provide relative comparisons.
42 ‘Low’ and ‘high’ achievers are those who achieve in the lower 20th percentile
and top 80th percentile in Year 3 respectively. Results for Years 5 to 9 are the
predicted growth trajectory for the median student in each of these percentiles.
See the Technical Report for further details.
Figure 7: NAPLAN gain scores can be misinterpreted to suggest
low achievers in Year 3 start catching up
NAPLAN scale score, numeracy, median, Victoria, 2009–15. [Chart not reproduced here: gain scores from Year 3 to Year 9 are +211 points for low achievers, +182 for medium achievers and +156 for high achievers; the gap in scale points narrows from 132 points in Year 3 to 76 points in Year 9.]
Notes: Results show the estimated gain scores between Years 3 and 9 of low, medium and
high achievers in Year 3 (students who scored at the 20th, 50th and 80th percentiles). Black
values indicate the gap between highest and lowest groups. Coloured values are the gain
scores over the six year period from Year 3 to 9.
Source: Grattan analysis of Victorian Curriculum and Assessment Authority (VCAA) (2015)
and ACARA (2014b).

Figure 8: In fact, low achievers fall further behind by Year 9 if
equivalent year levels are used
Equivalent year level, numeracy, median, Victoria, 2009–15. [Chart not reproduced here: low achievers make 5 years 9 months of progress, medium achievers 6 years 2 months and high achievers 6 years 9 months; the gap between high and low achievers grows from 2 years 8 months in Year 3 to 3 years 8 months in Year 9.]
Notes: Results show the estimated progress of low, medium and high achievers (students
who scored at the 20th, 50th and 80th percentiles in Year 3) between Years 3 and 9. Black
values indicate the gap between highest and lowest groups. Coloured values are the years
of progress gained over the six-year period from Year 3 to Year 9.
Source: Grattan analysis of VCAA (2015) and ACARA (2014b).
In Figure 8, Victorian low achievers make particularly little
progress between Years 5 and 7 (the slope is flatter during this
period). Because Victorian data is benchmarked to the national
growth curve, this suggests that Victoria’s growth was slower than
the national average. 43
Divergence can now be seen in NAPLAN results
Our new findings show a widening gap in student achievement as
students progress through school, a gap that until now has been
difficult to see in NAPLAN data. This aligns with a large body of
evidence on divergence: early low achievers tend to fall further
and further behind over time, while high performers continue to
excel (see Box 5).

Many factors may contribute to differences in rates of learning,
including inherent student learning ability. However, the
divergence literature tells us there is often a mix of cognitive and
motivational forces at play once students miss key concepts early
on.
Box 5: Divergence: early struggles affect future learning
The literature shows that early low achievers often face an
ongoing struggle through their schooling years, while initial high
achievers continue to reap rewards from early success. Over time
we expect to see divergence in student results. 44 Early reading
and mathematics skill acquisition is linked to future success in
learning, also known as the ‘Matthew Effect’. 45 How does this
occur?
Learning involves ideas building on one another. Concepts or
skills that are missed early on can impede the take-up of new
skills down the track. In addition to cognitive barriers, there are
also motivational effects.
For example, with reading, students who struggle to master
‘decoding’ of spelling-to-sound early on tend to read fewer words
than their peers. 46 With limited vocabulary, these students start to
enjoy reading less and spend less time practising, so their overall
reading development slows. This can then affect participation in
other subjects such as science and history which depend on
reading to learn, and they can fall further behind in other subjects
as well. 47
43 Victorian median achievers also make less than two years of progress
between Years 5 and 7, which is low compared to the national growth curve.
Further exploration is required to understand the factors behind this slump for
Victoria during these years.
44 Masters (2005); Allington (2008); Masters (2013); Claessens and Engel
(2013); O’Donnell and Zill (2006)
45 Masters (2005), p. 17; Allington (2008); Dougherty and Fleming (2012);
Hanson and Farrell (1995)
46 Cunningham and Stanovich (1997)
47 Stanovich (1986); Cunningham and Stanovich (1997); Claessens and Engel
(2013)
3.3 NAPLAN national minimum standards are too low
The Australian NAPLAN national minimum standards (NMS) seek
to identify “students who may need intervention and support to
help them achieve the literacy and numeracy skills they require to
satisfactorily progress through school.” 48 The standards also
represent the basic level of knowledge and understanding
needed to function at a given year level. 49
The minimum standard is extremely important not only for schools, teachers and parents, but especially for policymakers who need to know which students require extra support.

NMS are set extremely low in Australia. Figure 9 shows that in numeracy:
• a Year 5 student at NMS is functionally operating below a Year 3 level (over two years behind their peers)
• a Year 7 student at NMS is functionally operating below a Year 4 level (over three years behind their peers)
• a Year 9 student at NMS is functionally operating below a Year 5 level (four years behind their peers).

Figure 9: National minimum standards are set very low
NAPLAN scale score, median growth curve, numeracy, Australian students, 2014
[Chart: NAPLAN scale scores (200–600) for the Year 3, 5, 7 and 9 NMS and the estimated PISA proficiency baseline, mapped against equivalent year levels (1–12). The Year 9 NMS sits below the level of the average Year 5 student.]
Note: Results show NMS and PISA minimum proficiency standard mapped to EYL.
Source: Grattan analysis of ACARA (2014b), and PISA minimum standard OECD (2012).

48 ACARA (2015), p. v. More specifically, ACARA specifies that 1) students who do not meet the national minimum standard at any year level may need intervention and support, and 2) students who are performing at the national minimum standard may require additional assistance to enable them to achieve their potential.
49 ACARA (2016b)
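Figure 9 rests on converting NAPLAN scale scores to ‘equivalent year levels’ (EYL). A minimal sketch of that conversion is interpolation along a median growth curve; the year-level scores below are illustrative assumptions, not ACARA's actual curve:

```python
import numpy as np

# Hypothetical median NAPLAN numeracy scores by year level
# (illustrative only; the report uses ACARA's published growth curve).
year_levels = np.array([3.0, 5.0, 7.0, 9.0])
median_scores = np.array([400.0, 490.0, 545.0, 585.0])

def equivalent_year_level(score):
    """Convert a NAPLAN scale score to an equivalent year level by
    interpolating along the median growth curve."""
    return float(np.interp(score, median_scores, year_levels))

# A Year 9 student scoring at the (hypothetical) Year 5 median is
# functionally operating at a Year 5 level.
eyl = equivalent_year_level(490.0)  # 5.0
```

A student's score is read off against the curve of median achievement, so "four years behind" means scoring where the median student sat four year levels earlier.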
In other words, students who are two, three, and four years
behind others in their class are, according to current definitions,
considered to be ‘at minimum standard’. Can these students effectively participate in a class where the curriculum and teaching are often aimed at those much closer to the average student?
Further, the NMS slips by almost one equivalent year level every
cycle of NAPLAN testing; i.e. in Year 5 it is over two years behind
peers, in Year 7 it is more than three years behind, and in Year 9
it is four years behind. This tacitly accepts a minimum standard
that assumes students will slip one year of learning further behind
each time they sit the NAPLAN test. This is similar for reading.
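The slippage pattern above is regular: roughly one extra equivalent year level per two-year testing cycle. A tiny check makes this concrete (the years-behind figures are the approximate numeracy values quoted in the text):

```python
# Approximate years behind peers for a student at NMS, by NAPLAN
# test year (numeracy figures quoted in the text).
years_behind = {5: 2, 7: 3, 9: 4}

# Each two-year testing cycle, the NMS falls about one more
# equivalent year level behind the typical student.
slips = [years_behind[y + 2] - years_behind[y] for y in (5, 7)]  # [1, 1]
```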
The Australian NMS appear very low by international standards. The minimum standard set by the OECD in PISA mathematics for 15 year olds is about two years above Australia's numeracy standard for Year 9 students, as seen in Figure 9. 50

Nationally, very few students are below the NMS. In 2015, 7.7 per cent of Year 9 students did not meet the NMS in reading. In other years of NAPLAN testing, the proportion below NMS ranges from around 4 per cent to 7 per cent for reading and numeracy. 51 In fact, few of the students below the NMS actually sit the test (many are exempt students). 52

Internationally, a much higher proportion of Australian learners fall below the expected standards set in international tests. For example, in the PISA 2012 test results, an estimated 14 per cent of students failed to achieve the baseline proficiency level in reading. The situation is worse in mathematics, where 20 per cent of students fail to achieve the international baseline level. 53

The transition to NAPLAN Online is the time to make the change

The Australian NMS accept a very slow rate of student progress and are well below international standards. They were set in 2008 with the introduction of NAPLAN. Their very low level has attracted public criticism, and there is little publicly available justification for setting the standards so low. 54

Australian policymakers are currently reviewing the NMS, and new measures will be announced in 2016 to accompany the transition to NAPLAN Online. We understand that a new, higher proficiency level is likely to be defined. We would welcome such a move. Our analysis suggests that the NMS should either be raised or removed altogether.

Baseline skill levels can be politically difficult to reform, but other countries have successfully raised standards when required. For example, many US states recently raised their proficiency standards to reflect the tougher standards in the Common Core. 55

50 PISA sets its Level 2 as baseline proficiency and defines this as the level at which students begin to demonstrate the mathematical literacy competencies that will enable them to actively participate in life situations. Thomson, et al. (2013), p. 20. We find that an NMS set equivalent to the PISA proficiency standard would be around 2 years higher for Year 9 NAPLAN numeracy. We estimate the PISA minimum standard by equating the percentile at which students were below PISA proficiency in 2012 numeracy (19.7 per cent) to the same percentile of achievement for Australian students in NAPLAN numeracy 2014. We note that PISA test takers are about six months older than Year 9 students, on average.
51 ACARA (2016a).
52 The proportion of students below NMS includes many exempt students. Students commonly exempt from testing include those with a language background other than English who arrived from overseas less than a year before the tests, and students with significant disabilities.
53 Thomson, et al. (2013), p. 26, 175
54 Main (2013); Lamb, et al. (2015)
55 Peterson, et al. (2016)
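The percentile-equating method described in footnote 50 can be sketched as follows; the simulated score distribution is a stand-in assumption, not real ACARA data:

```python
import numpy as np

def equate_percentile(pct_below, naplan_scores):
    """Map a PISA cut-off onto the NAPLAN scale by percentile
    equating: find the NAPLAN score below which the same share
    of students falls."""
    return float(np.percentile(naplan_scores, pct_below))

# Illustrative only: simulated Year 9 NAPLAN numeracy scores
# (mean 588 and sd 70 are hypothetical, not ACARA figures).
rng = np.random.default_rng(0)
scores = rng.normal(588.0, 70.0, 100_000)

# 19.7 per cent of Australian students fell below PISA Level 2 in
# 2012 numeracy, so the equated NAPLAN cut-off is the score at the
# 19.7th percentile of the NAPLAN distribution.
cutoff = equate_percentile(19.7, scores)
```

The equated cut-off sits well below the distribution's centre, which is how the report locates the PISA baseline on the NAPLAN scale.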
4 Students whose parents have low education fall very far behind
Chapter 3 shows that low achievers continue to fall further behind
their peers between Year 3 and Year 9. In general, low achievers
and high achievers have different rates of learning over time. But
are there other factors at play?
This chapter examines differences in progress made by students according to their:
• level of parents’ education (section 4.1)
• school’s level of disadvantage (section 4.2)
• geographic location (section 4.3). 56

For this analysis, parental education is used as a proxy for a student’s socio-economic status (SES), but results are similar for family occupation.

Victorian students are analysed simply because the data is readily accessible. Some of the findings may look bad for Victoria, but the overall pattern for Australia is likely to be worse. Evidence from international PISA tests suggests educational outcomes in Victoria depend less on student socio-economic background than in other Australian states. 57
Box 6: Distinguishing between the effects of student
capability, parental education, school and location
Some findings in this chapter overlap. Differences in student
progress for disadvantaged students (Section 4.1) are also
captured in findings of students who attend disadvantaged
schools (Section 4.2). This is because disadvantaged schools by
definition have more students whose parents have low levels of
education. Similarly, disadvantaged geographic areas have
clusters of students with parents with low levels of education,
employment and income (Section 4.3).
Our findings show that on average, students whose parents have
lower levels of education have lower levels of achievement by
Year 3. So it is not surprising that this is also true of
disadvantaged schools and disadvantaged geographies.
What is surprising is that differences remain for students
with similar capabilities. For students with the same level of initial
achievement in Year 3 (a proxy for similar capability), less
progress is made by disadvantaged students, at disadvantaged
schools, and in disadvantaged areas. This strongly suggests that
equally capable students are failing to reach their potential. This
holds for disadvantaged students at all ability levels in Year 3,
especially bright students from poor backgrounds in
disadvantaged schools.
56 Analysis by level of parental education, school disadvantage and geographic location uses the same statistical techniques and process as used in the analysis of student spread. See the Technical Report for further details.
57 Thomson, et al. (2013), p. 274-275
4.1 Gaps widen for students whose parents have low education
Our findings show that students make less progress over time on
average if their parents have low levels of education
themselves. 58 This is not news. But the size of the gap is
alarming. 59
Students whose parents have low levels of education fall two and
a half years behind by Year 9
This section compares the progress of students according to the level of education of their parents (where a ‘low’ level of parental education is defined as below diploma, ‘medium’ is diploma level, and ‘high’ is degree or above). 60
When Victorian students sat their first NAPLAN test in Year 3,
students of parents with low education performed on average ten
months below their peers from families with high education. By
Year 9, this gap had widened to over two years and six months
(30 months). The gap tripled during this timeframe, as seen
in Figure 10. It widens significantly during the middle schooling
years.
Figure 10: The gap between students with low and high levels of parental education grows alarmingly between Years 3 and 9
Equivalent year level, numeracy, median, Victoria, 2009-15
[Chart: growth from Year 3 (2009) to Year 9 (2015) by parental education: degree or above +7y 2m; diploma +6y 1m; below diploma +5y 7m. The gap between the highest and lowest groups grows from 10 months in Year 3 to 2 years 6 months in Year 9.]
Notes: Results show the estimated progress of students grouped by their parents’ highest level of education as a proxy for socio-economic status. Black values are the gap between highest and lowest groups. Coloured values are the years of progress gained from Year 3.
Source: Grattan analysis of VCAA (2015) and ACARA (2014b).
58 Findings are for the NAPLAN cohort (2009-2015) using Victorian data. Analysis of the other complete NAPLAN cohort (2008-2014) shows a similar picture with minor exceptions.
59 Results for numeracy are generally similar to findings for reading, with the full set of charts available on the Grattan website.
60 Results for Years 5 to 9 are the predicted growth trajectory using quantile regression for the median student in Year 3 from each group (by starting score, parental education, ICSEA or LGA). See the Technical Report for further details. Use of a cohort analysis minimises the impact of individual student differences and the influence of measurement error. Analysis of the other complete NAPLAN cohort (2008-2014) shows similar patterns, as does analysis of reading scores (data not shown).
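Footnote 60's growth trajectories come from quantile regression, which fits conditional quantiles by minimising the ‘pinball’ loss rather than squared error. A small sketch with simulated (hypothetical) scores shows the key property, that the 0.5-pinball loss is minimised at the median:

```python
import numpy as np

def pinball_loss(q, y, pred):
    """Quantile ('pinball') loss: minimised when pred is the
    q-th quantile of y."""
    err = y - pred
    return float(np.mean(np.maximum(q * err, (q - 1.0) * err)))

rng = np.random.default_rng(1)
# Hypothetical NAPLAN-style scores (illustrative, not real data).
y = rng.normal(580.0, 60.0, 50_000)

# Grid-search the constant that minimises the 0.5-pinball loss;
# it lands at (approximately) the sample median.
grid = np.linspace(400.0, 760.0, 721)
losses = [pinball_loss(0.5, y, c) for c in grid]
best = float(grid[int(np.argmin(losses))])
```

In full quantile regression the constant is replaced by a linear function of predictors (here, the Year 3 starting score), estimated separately for each quantile of interest.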
Even when capabilities are similar in Year 3, students whose parents have low education fall up to two years behind
Figure 11: From the same Year 3 score, students of parents with low education make much less progress to Year 9
Years of progress between Years 3 and 9 by Year 3 score and highest level of parental education, numeracy, Victoria, 2009–15
[Three panels, for students with high, medium and low Year 3 scores. From the same Year 3 starting score in 2009, progress to Year 9 in 2015: high Year 3 score: +7y 7m (degree or above) vs +5y 10m (below diploma), a gap of 1y 9m; medium: +7y 0m vs +5y 7m, a gap of 1y 5m; low: +6y 6m vs +5y 5m, a gap of 1y 1m. Diploma-level results fall in between (+6y 4m, +6y 1m and +5y 10m respectively).]
Notes: Results show the estimated progress of low, median and high achievers (students who scored at the 20th, 50th and 80th percentiles in Year 3) grouped by their parents’ highest level of education as a proxy for SES.
Source: Grattan analysis of VCAA (2015) and ACARA (2014b).
The findings for disadvantaged students are even more
concerning when we take into account student capability. We
compare the progress of students with the same score in Year 3.
We then track their progress between Year 3 and Year 9 to see
whether any significant differences open up.
Students who display similar potential in Year 3 have very
different growth trajectories depending on their parents’ education
level, as seen in Figure 11. Between Year 3 and Year 9, students
with poorly educated parents consistently make less progress
than similarly capable students whose parents are highly
educated.
This holds for any ability grouping of disadvantaged students:
• Of students with low Year 3 scores, disadvantaged students make one year and one month less progress than similarly capable students with better educated parents.
• Of students with medium Year 3 scores, disadvantaged students make one year and five months less progress.
• Of students with high Year 3 scores, disadvantaged students make one year and nine months less progress. 61
High achievers from disadvantaged families have the greatest lost
potential, losing one year and nine months between Year 3 and 9.
In fact, bright students from poor backgrounds make less
progress in total (5 years 10 months) than low achievers with
highly educated parents (6 years 6 months) between Year 3 and
Year 9. 62
PISA data shows that Australia has slipped backwards slightly in giving students from low-education backgrounds the support to become high achievers. Figure 12 shows the proportion of
students at age 15 who come from low socio-economic status
(SES) backgrounds but nevertheless achieve high scores.
Australia’s proportion slipped slightly between 2003 and 2012,
dropping from 8 per cent to 6 per cent. Australia now sits slightly
below the OECD average, and well behind many high performing
countries.
61 It is hard to accurately estimate the performance of high achievers using equivalent year levels. However, it is well above what we would expect the typical Australian student to achieve by Year 12.
62 The two approaches of NAPLAN gain scores and EYL show a very different picture of student progress, explained in Appendix 7.
Figure 12: Fewer low-SES Australian students perform at the highest levels of achievement than a decade ago
Proportion of students from low SES backgrounds who perform in top two bands of PISA tests
[Chart: percentages (0–20) in 2003 and 2012 for Hong Kong-China, Macao-China, Korea, Japan, Switzerland, Netherlands, Poland, Canada, Portugal, Finland, Germany, Belgium, the OECD average, Spain, Italy, Ireland, Australia, Latvia, Austria, France, New Zealand, Norway and the United States. Australia sits slightly below the OECD average.]
Source: OECD (2013a), page 590
4.2 Students in disadvantaged schools make less progress
This section analyses differences in student performance
according to whether they attend a low, medium or high
advantage school. We find that students in low advantage schools
perform worse on average. 63 Again, this is not surprising.
However, the size of the gap is alarming.
Students in disadvantaged schools are over three and a half years behind students in high advantage schools by Year 9
Students in disadvantaged schools perform well below their peers
in high advantage schools by Year 3, but the gaps grow much
larger as they move through school. As shown in Figure 13, the
gap grows from one year and three months in Year 3 to a
dramatic three years and eight months in Year 9.
Students in medium advantage schools are also well behind: by Year 9 the gap to their more advantaged peers grows to over two years.
Figure 13: Students in disadvantaged schools fall very far behind between Year 3 and Year 9
Equivalent year level, numeracy, median, Victoria, 2009-15
[Chart: growth from Year 3 (2009) to Year 9 (2015) by school advantage: high advantage +7y 8m; medium advantage +6y 0m; low advantage +5y 4m. The gap between the highest and lowest groups grows from 1y 3m in Year 3 to 3y 8m in Year 9.]
Notes: Results show the estimated progress of students grouped by their school ICSEA. Low, medium and high advantage schools are the bottom ICSEA quartile, middle two ICSEA quartiles and top ICSEA quartile respectively. Black values are the gap between highest and lowest groups. Coloured values are the years of progress gained from Year 3.
Source: Grattan analysis of VCAA (2015) and ACARA (2014b).
63 We classify students into high advantage (top quartile), low advantage (bottom quartile) and average advantage (middle two quartiles) schools according to the Index of Community Socio-Educational Advantage (ICSEA) of the school they attend. The VCAA (2015) data used reports an ICSEA range for the school each student attended at the time of each NAPLAN test. ICSEA is an aggregate measure at the school level of the socio-educational background of all students at a school. For further information on the ICSEA measure see ACARA (2014a).
Students with similar early potential do worse in disadvantaged schools, especially high achievers
Figure 14: From the same Year 3 score, students in disadvantaged schools make much less progress to Year 9
Years of progress, by students with same Year 3 score (low, medium, high) and school advantage, numeracy, Victoria, 2009–15
[Three panels, for students with high, medium and low Year 3 scores. From the same Year 3 starting score in 2009, progress to Year 9 in 2015: high Year 3 score: +8y 1m (high advantage schools) vs +5y 8m (low advantage), a gap of 2y 5m; medium: +7y 5m vs +5y 5m, a gap of 2y 0m; low: +6y 10m vs +5y 3m, a gap of 1y 7m. Medium advantage results fall in between (+6y 3m, +6y 0m and +5y 9m respectively).]
Notes: Results show the estimated progress of low, median and high achievers (students who scored at the 20th, 50th and 80th percentiles in Year 3) grouped by their school ICSEA (referred to as low, medium and high advantage schools).
Source: Grattan analysis of VCAA (2015) and ACARA (2014b).
Figure 14 shows the progress of students with similar abilities in
low, medium and high advantage schools. As can be seen in all
three charts, even when students have similar scores in Year 3,
students in disadvantaged schools make less progress than
students in high advantage schools.
This finding holds true for students of all ability levels attending disadvantaged schools:
• Of students with low Year 3 scores, those in disadvantaged schools make one year and seven months less progress than similarly capable students in high advantage schools.
• Of students with medium Year 3 scores, those in disadvantaged schools make two years less progress.
• Of students with high Year 3 scores, those in disadvantaged schools make two years and five months less progress.
Bright students in disadvantaged schools show the biggest losses in potential, making two years and five months less progress than similarly capable students in high advantage schools.
In fact, between Year 3 and Year 9, bright students in
disadvantaged schools make less progress (five years and eight
months) than low achievers in a high advantage school (six years
and ten months).
These findings do not mean that teachers, principals and other
staff in disadvantaged schools are doing a bad job. The results
reflect a mix of influences affecting students who attend these
schools (discussed in Box 7). 64 What they do highlight is a large
variation in student progress for different schools – and a gap
that, by Year 9, is simply too wide.
64 Once levels of parental education are taken into account, there is still a residual gap. This finding should be treated with caution as an indicator of school effects, since some of the residual gap may reflect unmeasured family factors, discussed in Box 7. In addition, estimates in Figure 14 are only for students who are in similar status schools in both Year 3 and Year 9, so are not representative of all students. This partly explains why the gap for students of low parental education is lower than findings at the school level.
Box 7: What our analysis can and cannot say about
disadvantaged students and disadvantaged schools
Our findings should not be interpreted as showing a direct causal
link between parental education and student progress (Section
4.1), or the impact of a school on student progress (Section 4.2).
This is because our analysis does not attempt to isolate the effects of specific factors on student achievement, as some other studies do with techniques designed to disentangle different factors in a systematic way.
When we look at differences in student progress by parental
education, as in Section 4.1, for example, our results capture
some of the impact of other factors related to parental education,
such as household income, general expectations for learning, and
some school-level factors. We do not isolate the direct impact of
parental education on student progress, but we capture much of
the combined impact of a range of factors correlated with parental
education on student progress.
Similarly, the estimated gaps in student progress by school
advantage, as in Section 4.2, are capturing the impact of many
factors related to school advantage. Importantly, our findings do
not isolate the impact of the quality of the teaching in certain
schools – there are a range of other reasons why advantaged
schools make higher progress. For instance, the results capture
some household-level factors that correlate strongly with school
advantage – high-income households are more likely to send their
children to more advantaged schools, for example. The results
also capture other factors from within the school, such as student
peers, the school environment, and the general expectations for
learning from parents and the community. The quality of teaching
is only one factor that may be reflected in the results.
4.3 The impact of disadvantage plays out geographically

Does learning growth vary depending on where students live? Figure 15 shows that it does. In fact, learning progress closely mirrors the pattern of educational disadvantage across Victoria, shown in Figure 16.

Students in the inner city make more progress than outer metropolitan students (seen in the magnified chart in the top right-hand corner of Figure 15). But the greatest difference in growth is clearly between city and country. Inner-city students make at least one to two years more progress than suburban students, and are up to two years in front of regional and rural students in some areas. Policymakers wanting to support educationally disadvantaged students can target them geographically, with regional and rural areas most in need.

Figure 15: Inner-city students make the most learning progress
Median years of progress between Years 3 and 9, numeracy, Victoria, 2009-15
[Map legend: Very low (> 12 months behind); Slightly low (3 – 12 months behind); Typical progress (± 3 months); Slightly high (3 – 12 months ahead); High (1 – 2 years ahead); Very high (> 2 years ahead); Insufficient data.]
Notes: Results show the estimated progress of students grouped by the Local Government Area of their Year 3 school.
Source: Grattan analysis of VCAA (2015) and ACARA (2014b).

Figure 16: Average school advantage is higher in inner-city areas
Average student ICSEA, 2009-15 cohort
[Map legend: Low SES; Low-Medium SES; High-Medium SES; High SES; Insufficient data.]
Notes: Students are allocated to the Local Government Area of their Year 3 school. ICSEA (a measure of school advantage) in this dataset is attached to student data, as there is no school identifier.
Source: Grattan analysis of VCAA (2015).
5 Closing the gaps would generate big economic benefits
The previous chapters show that some students fall many years
behind their peers by Year 9, especially those from disadvantaged
backgrounds. Achievement at school has real long-term impacts
on young people – it affects further study, employment, lifetime
earnings, as well as health and community engagement. 65
But improving educational outcomes will require tough decisions.
Some initiatives will need investment, from within the existing
schools budget or beyond.
How can policymakers decide which investments are justified?
One approach is to look for policies where social goals align with
economic growth. Even better are policies with positive financial
payback, because their long-term budgetary benefits outweigh
their costs.
This chapter explores three economic benefits of better
educational outcomes: higher individual earnings; lower welfare
costs; and stronger economic growth.
Economic returns should not override the goal of delivering a
quality education for all. The economic benefits simply strengthen
the case for improvements in education as a ‘win-win’ for
policymakers.
65 Norton (2012); OECD (2012b); ABS (2014a); ABS (2014b)
5.1 Higher individual earnings
Strong learning progress at school leads to higher achievement
and better skills later on. Higher achieving students are more
likely to complete Year 12, and more likely to find work or move
into further study once they leave school. This impacts lifetime
earnings.
Completing Year 12 increases lifetime earnings by nearly 20 per
cent compared to an early exit from school. 66 A bachelor degree
boosts lifetime earnings by a further 40 per cent compared to the
expected earnings of a high school graduate. 67 For each
additional year of education, income is estimated to rise by an
average 10 per cent. 68
It is not just quantity of schooling that matters, but also levels of
achievement at school. But there is limited research on the impact
of higher achievement on earnings, given difficulties researchers
have in accessing confidential test score data. The few studies
available suggest a positive relationship between achievement
and earnings. 69
66 Cassells, et al. (2012), p. 30. Studies taking into account natural aptitude find increases in annual earnings in the order of 8 to 30 per cent for each additional completed year of schooling from Year 12 to postgraduate qualifications. Leigh and Ryan (2008); Leigh (2010)
67 Leigh and Ryan (2008); Leigh (2010)
68 Leigh and Ryan (2008); Leigh (2010). Cassells, et al. (2012) estimate an individual with a bachelor’s degree is projected to earn $830,000 more than a Year 12 graduate over their lifetime. For further literature see Jensen (2010).
69 French, et al. (2010); Ceci and Williams (1997). French et al. find that a 1 unit increase in high school test scores (the GPA) in the US is associated with a 12 – 14 per cent increase in annual earnings. The standard deviation of GPA is 0.838 and 0.798 for males and females respectively.

Other educational research shows a link between higher achievement at school, and attainment and achievement later on. 70 Students who do better at school are more likely to continue their education, and then reap the benefits of higher attainment. Higher test scores are associated with a greater likelihood of completing school and attending university. 71

A recent ABS study showed that higher scores on NAPLAN Year 9 reading increased the likelihood that Tasmanian students would finish Year 12. This was true in both disadvantaged and advantaged areas. 72 Even for the Tasmanian students who left school early, high achievers in Year 9 NAPLAN were twice as likely to be engaged in work or study, compared to those in the bottom two NAPLAN bands. 73

5.2 Initial modelling shows that poor life outcomes can have large budgetary impacts

Poor outcomes in school can have costs for individuals and society. A good school education helps adults stand on their own two feet.

New Zealand is using linked data to look at the relationships between known risk factors (for example, parental education and family welfare dependency) and the likelihood of poor outcomes (welfare dependency and crime) as an adult. Importantly, the analysis does not identify causative impacts at this stage, only simple correlations. 74

Initial modelling shows that poor life outcomes can have big budgetary impacts, seen in Figure 17. Students who leave school with no formal qualifications cost the New Zealand government an average of $NZ 22,000 from age 16 to age 23. Most of this cost is for welfare benefits, but about one fifth is for corrections. The total cost is two to four times higher than for students who attain formal qualifications.

The linked data offers the New Zealand Ministry of Education a new way to break the intergenerational cycle of disadvantage. A child whose mother has no formal qualifications is estimated to be about 50 per cent more likely to leave school without any qualification. 75 The ministry can identify the specific students most at risk of leaving school without qualifications, and intervene early. 76

70 French, et al. (2010)
71 Ibid.
72 ABS (2014a)
73 Ibid.
74 Low education at school is not necessarily the direct cause of welfare and correction costs to the state. The New Zealand Ministry of Education is currently identifying correlations which will be used to target interventions with the intention of improving life outcomes for at-risk children. In time the project will be able to provide greater insight on each factor’s causative impact and the effectiveness of interventions.
75 Ball, et al. (2016). Within the linked data, the estimate of future outcomes before age 21 found that 17.5 per cent of all students would not obtain any school qualification, whereas 27.1 per cent of students whose mother had no formal qualification would not receive any school qualification.
76 Data is linked across the Ministries of Education, Social Services, and Justice.
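The figures in footnote 75 imply a simple relative-risk calculation, sketched below using only the two percentages quoted:

```python
# Share of students projected to leave school with no qualification:
# all students vs children of mothers with no formal qualification
# (figures from footnote 75).
base_rate = 0.175
mother_no_qual_rate = 0.271

# Relative risk of leaving school without any qualification.
relative_risk = mother_no_qual_rate / base_rate  # about 1.55
```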
Figure 17: New Zealand analysis shows the high cost to the state of individuals with no formal qualifications
Average individual welfare and corrections cost ages 16 to 23, by highest educational qualification, $NZ
[Chart: average costs ($NZ 0–20,000) for four groups: no formal qualification; NCEA 2 or equivalent; NCEA 3 or equivalent; Level 4 qualification or above.]
Notes: National Certificate of Educational Achievement (NCEA) is the main national qualification for secondary school students in New Zealand.
Source: New Zealand Ministry of Education (used with permission in 2015).

5.3 Stronger economic growth

Economic growth and social development are closely linked to the skills of the population. 77 Better education drives increased productivity in the workforce and top-level skills that deliver innovation in products and services. 78

Improving school education outcomes has a huge impact on countries’ Gross Domestic Product (GDP). Across countries, education and economic growth rates were strongly linked over the forty years from 1960 to 2000. 79
Better educational outcomes are a major national asset, just as
much as a highway or a railway. But there is one important
difference: while infrastructure assets tend to depreciate in value
over time, the value of better education tends to increase. Better
skills foster greater innovation. Better educated adults tend to
raise better educated children. Unfortunately the reverse is also
true, which is part of the cause of intergenerational cycles of
disadvantage.
There is good evidence that higher literacy and numeracy skills increase GDP growth. A series of studies estimate that a one standard deviation increase in test scores would lift the long-run GDP growth rate by between 0.6 and 2.0 percentage points. 80 Using a
77 OECD (2015)
78 Ibid. p. 77
79 Ibid.
80 This range is based on a number of studies cited in Jensen (2010) p. 18, as well as OECD (2015)
conservative estimate, an increase of 25 PISA points would boost
Australia’s long-run GDP growth rate by 0.25 percentage points. 81
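The 0.25 percentage point figure follows from a linear per-standard-deviation assumption; the calculation below assumes the commonly used PISA standard deviation of about 100 points:

```python
# Back-of-envelope check of the report's GDP estimate.
PISA_SD_POINTS = 100.0   # assumed PISA standard deviation
GROWTH_PP_PER_SD = 1.0   # conservative end of the 0.6-2.0 range

def growth_boost(pisa_point_gain):
    """Long-run GDP growth boost (percentage points) for a given
    PISA score gain, under the linear per-SD assumption."""
    return pisa_point_gain / PISA_SD_POINTS * GROWTH_PP_PER_SD

boost = growth_boost(25.0)  # 0.25 percentage points
```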
The economic benefits from better education outcomes accrue over decades as higher skilled school leavers form a larger and larger part of the workforce. The long-term rewards are large.
Obviously, these estimates involve large degrees of uncertainty,
particularly given the length of time. But the evidence clearly
shows that lifting education outcomes can make a real difference
to the economy.
Box 8: Making large learning gains is tough but possible
Large learning gains can be made. Several other jurisdictions
have made big gains within a decade or so:
• Poland made exceptional progress, gaining 39 PISA points in reading since 2000 and 27 points in numeracy since 2003. Poland now out-performs Australia in numeracy and is on par in reading.
• Hong Kong and Germany are high-achieving countries that have made large gains in reading and numeracy over time. Germany gained 24 points and Hong Kong 19 points in reading between 2000 and 2012. Germany gained 11 points in numeracy between 2003 and 2012.
Is there scope for Australia to make large learning gains with
significant economic benefits?
Australia ranked in the top 20 countries in numeracy and reading in the 2012 PISA tests. But there is still much scope for improvement.
Given the large decline in PISA points since 2000, we should start
by reclaiming lost ground and aim for where we once were. 82
Large change is hard, but not impossible. Examples of other
jurisdictions that have made large gains in reading and numeracy
are discussed in Box 8.
A number of Australian states and territories have also made
large learning gains:
• Since 2008, Queensland and Western Australia have both made gains of over 6 months in a range of NAPLAN reading and numeracy tests. The gains are too recent to have shown up in PISA tests.
Source: Grattan analysis of OECD (2012) Table I.2.3b and Table I.4.3b; NAPLAN data
from ACARA (2014) and ACARA (2015).
81 This is estimated using a conservative estimate of 1 percentage point per standard deviation.
82 Australia’s performance has dropped by 16 PISA points in literacy and 20 PISA points in mathematics since 2000 and 2003 respectively. OECD (2014a)
5.4
Smart investments
How can Australia significantly boost learning in school? While
our report shows that disadvantaged students are falling very far
behind, further analysis is required to identify exactly what the
best policy solutions are, and where the gains may be the
greatest.
Broadly speaking, there are three obvious areas likely to deliver
substantial improvements.
Firstly, investing early is likely to have large learning benefits.
The case for early investment in education is well established. 83
Queensland’s focus on the early schooling years since 2007
appears to have contributed to positive results. The introduction of
a Prep Year in 2007 provides a real-world experiment to test
potential impacts. Queensland’s NAPLAN scores have certainly
gone up since 2010, broadly in line with the cohort that first had
the extra year of Prep, although further evaluation is required to
confirm this strength of this relationship. The boost to future Year
9 NAPLAN performance looks like it will be about 6 months
(discussed further in Box 9).
While this analysis is rough, it suggests that Queensland’s investment in primary school is likely to deliver benefits. Even after the high cost of adding a year of schooling has been accounted for, the decision should deliver large economic benefits derived from higher achievement levels in future.
83 Stanovich (1986); Cunningham and Stanovich (1997); Allington (2008)
Box 9: Queensland's investment in early years of schooling
and likely benefits
Queensland has made several large investments in the early
years. It introduced a Prep Year in 2007. 84 It also raised the
compulsory school starting age for Year 1, and invested in a
significant strategy to improve schools, principals and primary
teaching.
Since 2008, Queensland’s NAPLAN scores have increased
significantly for Year 3-7 reading and Year 3-5 numeracy. 85 The
2014 and 2015 cohorts both performed about 6 months ahead of
the comparable 2008 cohort across all Year 3-7 NAPLAN tests.
The first cohort with the extra Prep Year is now in Year 9. The
pattern suggests that Queensland’s Year 9 NAPLAN results will
improve from 2016 on, potentially by six months of learning.
The investments in the early years appear to have delivered
substantial learning gains. The higher achievement of school
graduates in future is likely to provide Queensland with the skills
for a stronger and more prosperous economy in the long term.
Source: Grattan analysis, based on NAPLAN data from ACARA (2014) and ACARA
(2015), Tables TS.R14-TS.R21 and TS.N14-TS.N21.
84 The Prep Year was not made compulsory for all students when it was introduced in 2007. It will be compulsory from 2017.
85 NAPLAN gains were estimated using linear regression on a three-year rolling average score to minimise year-to-year fluctuations. Gains (or drops) were then translated into equivalent years of learning. Comparing 2015 to 2008, reading scores improved about 10 months for Year 3 (p<0.001), 13 months for Year 5 (p<0.001) and 4 months for Year 7 (p<0.05). Numeracy scores increased by about 5 months for Year 3 (p<0.001) and 7 months for Year 5 (p<0.001), but dropped by about 4 months for Year 7 (not statistically significant, p>0.1).
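The estimation method described in the footnote — a linear trend fitted to a three-year rolling average, with the implied gain converted into time — can be sketched in a few lines. The scores below are hypothetical, and the 40-points-per-year conversion is an illustrative assumption, not the report's actual equivalent-year-level mapping.

```python
# Sketch of the method in the footnote: smooth annual mean scores with a
# three-year rolling average, fit a linear trend, then convert the implied
# total gain into months of learning. All numbers here are hypothetical.

def rolling_mean(xs, window=3):
    """Rolling average used to damp year-to-year fluctuations."""
    return [sum(xs[i:i + window]) / window for i in range(len(xs) - window + 1)]

def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

years = list(range(2008, 2016))                    # test years 2008..2015
scores = [417, 419, 423, 422, 426, 428, 431, 433]  # hypothetical mean scores

smoothed = rolling_mean(scores)           # centred on 2009..2014
slope = ols_slope(years[1:-1], smoothed)  # NAPLAN points gained per year

# Illustrative conversion only: assume ~40 NAPLAN points equal one year
# of learning at this year level (the report uses its own EYL mapping).
POINTS_PER_YEAR = 40
months_gained = slope * (years[-1] - years[0]) / POINTS_PER_YEAR * 12
```

With these hypothetical scores the fitted trend is a little over two points per year, translating to roughly four to five months of learning over the period.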
Secondly, closing the progress gap for disadvantaged students
can increase productivity.
Thirdly, investing in better systemic support for targeted teaching
would be a clear win.
Closing the gap is an obvious area for reform, especially for those
students who demonstrate potential in Year 3 but make much less
progress than their more advantaged peers by Year 9 (discussed
in Chapter 4). This is a clear productivity loss.
Helping teachers to teach to the very large spread in student
achievement that opens up (discussed in Chapter 3) is a key
policy priority. Unless teaching is well targeted to the current level
of each student, the low achievers will be lost in class, and the
high achievers will be bored. Neither will learn as much as they
could. Again, this is a big productivity loss.
As an interim target, we estimate that closing the progress gap by
half (for those of similar scores in Year 3) would add around four
months of learning on average across the student population. 86
This would add around 11 PISA points, and help to regain much
of our lost ground since 2000. 87 An interim target of this kind
would also bring Australia closer to countries such as Canada on
equity. Reaching this target will be hard, but it is achievable.
‘Targeted teaching’ 88 has one of the strongest evidence bases for improving learning outcomes (discussed further in the policy recommendations in Chapter 6).
86 Grattan estimates the size of the gap for students from the same Year 3 score is an average of eight months (rounded from 7.65) across the Year 9 student population in Victoria. This refers to the gap between students in low and medium advantage schools compared to students in high advantage schools. This estimate is based on our findings that students in low-advantage schools are 1 year and 1 month behind in Year 9 from the same Year 3 starting score, and students in medium-advantage schools are 10 months behind in Year 9 from the same Year 3 starting score. Closing this gap by half is four months (3.83 months). This is a conservative estimate of the national progress gap, given learning outcomes in Victoria are more equitable than in other states according to PISA data, Thomson, et al. (2013) p. 274-275.
87 Grattan estimates these PISA points based on the OECD research that shows 34.73 PISA points is equal to one year of schooling. OECD (2014c), Table A1.2. Using this ratio, an increase of 3.83 months on average across the population would increase learning by 11.1 PISA points.
88 ‘Targeted teaching’ is discussed in a recent Grattan report by Goss, et al. (2015)
Box 10: Education funding and outcomes
Good education policy can improve education outcomes and bring
big economic benefits. Yet changing practice is hard and takes
resources. And effective reform is not a matter of throwing money
at the problem. 89
Three things are undeniable. The first is that how every dollar is
spent matters. A dollar that is wasted on an ineffective
intervention, 90 or on poor school infrastructure planning, 91 could
have been better spent elsewhere. When considering expensive reforms, the first question must always be: can existing money be re-allocated to key priorities and away from waste?
The second thing is that money alone does not improve
learning. What matters is what is done with it, and what support is
given to schools to improve practice. 92 The third is that learning will
not improve unless classroom practice changes. This takes time,
especially for teachers and school leaders, who are the most
valuable – and costly – resource we have. Not investing the right
level of funding in this, whether from existing budgets or new
funding, will guarantee failure.
Australia needs a much more sophisticated discussion about the
education policies of our political parties. We spend too much time
talking about the amount of funding and not enough about how to
improve learning.
Even when we do talk about money, we should learn to ask the
questions that really matter. For example:
• Is the existing level of funding being used strategically to address the biggest education priorities?
• Do schools get the support they need to help all students make good learning progress, regardless of their background or circumstances?
• Is existing funding being used to support evidence-based policies, with a clear and realistic plan to improve classroom practice?
• Where are we wasting money (i.e., spending money on things that have little impact on learning, or that cost far too much for the impact they have)?
The Gonski Review of Funding for Schooling (December 2011)
proposed that all schools, both government and non-government,
be funded on a transparent, consistent basis that takes into
account student need. The principle of needs-based funding
received broad support. Schools need certainty. However, current
funding arrangements remain complex and disputed.
89 High income countries that spend more on education do not necessarily have better outcomes. OECD (2012a)
90 CESE (2015)
91 See Goss (2016).
92 OECD (2012a)
6
What policymakers should do
This final chapter offers recommendations to policymakers about
what they should do with our new approach and its findings.
First, they should put relative student progress and learning gaps at the centre of the policy agenda. Second, implement better systematic support for targeted teaching, given the wide spread of student achievement discussed in Chapter 3. Third, improve efforts to lift the progress of disadvantaged students to address the gaps discussed in Chapter 4. The chapter concludes with a brief discussion of what not to do.
Recommendation 1: Put analysis of relative student progress and learning gaps at the centre of the policy agenda and use it to target policy and resources more effectively
1a Adopt Grattan’s new ‘years of progress’ approach to better understand relative student progress
Grattan’s years of progress measure helps to assess relative performance and is easy to use and interpret. It is designed for education departments and non-government system leaders to use for their own internal analysis. It helps to identify patterns of fast or slow progress and any issues that warrant further analysis. 93 A better understanding of learning gaps helps leaders set strategic priorities and targets, and see whether they are being met. The new measure should be used as part of a suite of measures to assess student performance; understanding the data from multiple angles is important.
1b Use relative progress analysis to set system priorities, inform resource allocation and needs-based funding policies
While NAPLAN data tells us a lot, it is difficult to see relative
student progress at present. How can policymakers set priorities
and make informed policy and resource decisions if they cannot
see how well different students are progressing relative to each
other?
Students who make good learning progress during school have
the opportunity to realise their educational potential. Analysis of
current progress and learning gaps should therefore inform
system priorities, resource allocation, as well as needs-based
funding policies.
Many of the current analytical approaches to assessing relative
progress in NAPLAN are highly technical. Other approaches only
allow comparisons between students from similar backgrounds or
with similar scores. This is too limiting and confusing.
This is not just about money. Students with low progress or poor
outcomes may need different types of resources or support. Or
existing resources may need to be used more effectively.
93 It would also complement the value-added models that some education systems use, which analyse the difference between expected and observed progress.
Additional support is warranted where progress or outcomes
remain too low, despite best efforts, or if gaps remain too wide.
Failure has too high a cost – for the individual, for society and for
the economy.
To help fund this, education system leaders must ensure they get
good value from every existing dollar.
1c Education departments should continue to link up their student
data, and implement a national student identification mechanism
Targeting funding to where it is needed most relies on fine-grained analysis of student progress and learning gaps. Linked
datasets that track individual student progress over time are an
invaluable resource for policymakers.
Much of the analysis presented in this report depends on the
linked student dataset provided by Victoria. Not all states and territories in Australia have a linked dataset of this quality. 94
Those that don’t are missing a valuable analytical tool. They
should link up their own student data. This should begin with
NAPLAN data; over time, other achievement and administrative
data could be added.
Linked datasets would be much easier to develop and maintain if
there were better mechanisms to identify students. Student
identifiers have traditionally been managed at school and/or
sector level. 95 But students move interstate and between school sectors. With NAPLAN moving online, now is the time to implement a national student identification mechanism.
94 It is unclear if linked datasets are widely available to non-government sectors.
95 Australia has three school sectors: government, Catholic and independent.
In this light, we welcome a recently announced Productivity
Commission Inquiry which will examine ways to improve the use
of data by the school and early childhood sectors. 96
Recommendation 2: In light of the very large spread in
student achievement, implement better systematic support
for targeted teaching so that all students make good learning
progress
Everyone knows that students are different. Some are well ahead
in their learning; others are well behind. Chapter 3 shows just how large the spread in achievement is, especially in secondary school.
Grattan’s previous school education report, Targeted teaching,
directly addresses the challenge of this spread. The best teachers
assess what each student knows already, target their teaching to
meet each student’s learning needs, track students’ progress over
time, and adapt teaching practices according to what works
best. 97
Targeted teaching is not new, but it is still much too rare in
Australia. Too often, schools are left to figure it out on their own.
This has led to pockets of great practice, but not systemic
improvement. All schools have high and low achievers, all of
whom deserve to be taught at their current level. Targeted
teaching accelerates learning and would improve the productivity
of school education.
Better system-wide policies and systematic support for targeted
teaching are needed. While a number of good initiatives exist, in
96 See Productivity Commission (2016).
97 Goss, et al. (2015)
many cases schools are left to reinvent the wheel themselves with
limited support.
2a Strengthen system-wide policies around targeted teaching, with an emphasis on giving teachers the time, tools and training
Education systems should support schools to give teachers the time, tools and training to target their teaching. 98 When they do, change can be achieved at scale. 99
Targeted teaching will not happen for free. Changing established practice never does. 100 One option for funding targeted teaching is to re-allocate funding away from grade repetition, a practice that is known to be ineffective. Reducing the number of students currently held back in school due to low performance would open up funding for higher impact initiatives. 101 These two policy levers are directly linked: targeted teaching reduces the need for low performers to repeat a year of school.
2b Shift the focus in NAPLAN to proficiency. Either raise the national minimum standard or remove it entirely
It is hard to aim high when the bar is set low. We do students no favours by defining acceptable performance or progress using inadequate benchmarks.
98 Trust and teamwork also matter.
99 See Goss, et al. (2015); chapter 6 includes detailed recommendations, and chapter 5 shows how some systems are delivering change at scale.
100 The main investment needed is on-the-ground professional development for existing teachers. Relying on better Initial Teacher Education is far too slow.
101 Grade repetition is expensive and ineffective. Yet one in twelve Australian students will repeat a year of schooling, three times the rate in the UK. It costs at least $8000 to have a student repeat a grade, much more in some schools. We estimate about $200 million is wasted on every new cohort of Prep (or kindergarten) students as they move through school. Our estimate was simply calculated by multiplying $8000 by 8.4 per cent – one in 12 – of the 310,198 students in Year 1 of schooling in 2014. Romanes and Hunter (2015). OECD (2013b).
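The $200 million figure in the grade-repetition footnote is straightforward arithmetic and can be reproduced directly; this is a sketch of the stated calculation, using the one-in-twelve rate rather than the rounded 8.4 per cent.

```python
# Reproduce the footnote's wasted-spending estimate for grade repetition.
COST_PER_REPEAT = 8_000  # dollars per repeated grade, a lower bound
REPEAT_RATE = 1 / 12     # about one in twelve Australian students
COHORT_SIZE = 310_198    # students in Year 1 of schooling in 2014

waste = COST_PER_REPEAT * REPEAT_RATE * COHORT_SIZE
print(f"${waste / 1e6:.0f} million per cohort")  # about $207m, "about $200 million"
```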
Yet a Year 9 student reading at the national minimum standard is
achieving below the typical Year 5 student. No wonder only 8 per
cent of Year 9 students failed to meet this standard in 2015. 102
Worse, a student performing just above the national minimum standard in Year 3 needs to make only about one year of progress every two years to stay above the minimum standard in Years 5, 7, and 9. 103
The bar we are setting with the national minimum standard is just
too low. Importantly, setting such low standards increases the risk
of overlooking students who require additional support to make
adequate progress. Australia must raise its sights.
The focus of NAPLAN should shift to proficiency. We welcome
indications from ACARA that new standards of proficiency and
competency will be introduced along with NAPLAN online. The
national minimum standard must also be raised. If it is not, it
should be removed entirely.
102 ACARA (2015)
103 See Section 3.3
Recommendation 3: Given the very large gaps that open up by Year 9, increase efforts to lift the progress of disadvantaged students
Our analysis shows just how big the gaps between students of
different backgrounds are. Educationally disadvantaged students
– especially those with parents with limited education – tend to
start behind in Year 3. But they then fall much further behind
between Year 3 and Year 9, even from the same Year 3 score.
Schools with higher proportions of disadvantaged students often face additional challenges: for example, in attracting and retaining staff; 105 equipping teachers with the skills to meet the learning needs of disadvantaged students; and engaging parents and carers in their children’s schooling. 106
Reducing the impact of disadvantage has big potential benefits.
But it is hard to do. Previous efforts to improve outcomes for
disadvantaged students and schools have met with limited
success. 104 Policymakers must increase efforts to do better.
Analysis of specific policy initiatives to address educational
disadvantage at the individual and school level is beyond the
scope of this report. However, five broad steps seem clear.
3a Make it a priority to increase the rate of progress of
educationally disadvantaged students, especially low performers.
Start early but also provide ongoing support after Year 3
Educational disadvantage can become an intergenerational cycle
of poverty, especially for those who show signs of low
performance early on. Assisting these students to break the cycle
can have a massive impact on their lives. They must be a key
priority in system level policies.
Educational disadvantage is complex, and so are potential solutions. Disadvantaged students can face multiple challenges: lower levels of parental education; higher levels of parental unemployment; living in communities with fewer resources; and complex behavioural issues. Such factors can make it difficult to attend school, reduce parental support, and weaken students’ attitudes towards schooling and expectations of themselves.
104 Productivity Commission (2012), p. 274. Evaluations of the Smarter Schools National Partnership, which provided $2.5 billion in funding to disadvantaged schools, should be explored as a priority. One Victorian study shows positive effects in some cases, Department of Education and Training (2016).
Firstly, give all students at least one year of quality pre-primary
education.
Recent Australian evidence suggests that disadvantaged students
may benefit from starting early childhood education earlier. 107
Nearly 40 per cent of Australian students with no pre-primary
education were low performers in PISA mathematics at age 15.
By contrast, students with at least one year of pre-primary
education were half as likely to be low performers, even after
accounting for student background. 108 The Universal Access to
105 Low socio-economic primary schools are four times more likely to express major difficulty in suitably filling staff vacancies than high socio-economic primary schools. They are more than six times more likely to express major difficulty in retaining suitable staff. Low socio-economic secondary schools and remote schools report similar challenges. Productivity Commission (2012), Table 9.1
106 Ibid. p. 253, 257-267
107 Australian Institute of Health and Welfare (2015)
108 OECD (2016), Figures 2.13 and 2.14. The OECD analysis showed relatively little additional benefit of attending more than one year of pre-primary education.
Early Childhood Education Program should be permanent. 109 The quality of the programs is also vital. 110
Secondly, target teaching from the first week of primary school, so
that students have strong foundational skills by the end of Year 3.
Targeting teaching early will help build strong foundational skills
by Year 3, in reading, writing, and numeracy. John Hattie, the
Chair of the Australian Institute for Teaching and School
Leadership (AITSL), has emphasised that all children need to
learn to read and write by age 8 (generally Year 3), as those who
do not very rarely catch up. 114
Targeting teaching from the first week of primary school would
benefit all children. It would provide most benefit to children who
are developmentally vulnerable when they start school. 111
Research shows that schools in the most disadvantaged areas
have a higher proportion of these developmentally vulnerable
children starting school. This concentration of vulnerable children
in disadvantaged areas intensified during the period 2009 to
2015. 112
Children who are developmentally vulnerable when they start school can get ‘back on track’ in their education, but it’s harder for those from disadvantaged backgrounds. 113 If they are going to get back on track, efforts need to start early. This makes disadvantaged schools the ideal place for the ‘funding switch’ described above, where investments in targeted teaching could at least in part be funded by holding back fewer students later in school.
109 See https://www.education.gov.au/universal-access-early-childhood-education, accessed 14 March 2016.
110 Barnett (2011)
111 The Australian Early Development Census (AEDC) is run every three years to assess the developmental readiness of children as they enter their first year of full-time school. It covers five domains: physical health and wellbeing; social competence; emotional maturity; language and cognitive skills (school-based); and communication skills and general knowledge. AEDC (2016)
112 For the language and cognitive skills domain, children in the most disadvantaged areas in 2009 were 2.9 times more likely to be developmentally vulnerable, relative to children in the least disadvantaged areas. By 2015, they were 4.1 times more likely to be developmentally vulnerable. Ibid., p. 14.
113 Lamb, et al. (2015)
Our analysis supports this claim. Early Action for Success – which
focuses on targeted teaching for students in early primary – is a
good example of a policy initiative that is working, even in the
most disadvantaged NSW schools. 115
Queensland’s recent experience of funding an extra year of Prep shows that investing early can deliver large benefits (see Box 9 in Chapter 5). The proportion of Queensland students at or above the national minimum standard in Year 3 reading increased from 87.1 per cent in 2008 to 95 per cent in 2015.
Thirdly, provide extra support after Year 3, providing remedial
support as early as possible to all students who fall very far
behind their peers.
Efforts must not stop at Year 3. Our analysis shows that the gaps
between students of different backgrounds exist at Year 3, but
grow much wider by Year 9. Struggling students – whatever their
background – need remedial support as early as possible. It is
important to disrupt the cycle of low performance that leads to
114 Hattie (2015)
115 Early Action for Success focuses on students in K-2 in over 300 of the most disadvantaged NSW government schools. Goss, et al. (2015), p. 35-36.
early disengagement. 116 The best way to fund effective remedial
support is to stop delivering remedial programs that have been
shown not to work, such as Reading Recovery. 117
As an interim target, closing the progress gap by half (for those of
similar scores in Year 3) would add around four months of
learning on average across the student population. 118 This would
add around 11 PISA points, and help to regain much of our lost
ground since 2000. 119 Achieving this target would have potentially
big benefits for individuals and the economy. It would also bring
Australia closer to countries such as Canada on equity.
Fourthly, various government and non-government bodies need to
be involved in these efforts.
The effects of family background on education outcomes are
complex. Education departments and schools can’t deliver great
outcomes on their own. Other departments and non-government
bodies play key roles in addressing the full range of challenges
that can impact student outcomes. 120
For example, positive role models and mentors from the
community can help to lift student expectations. Involving parents
in the learning of their children is also important. Community and
116 OECD (2016), p. 194.
117 CESE (2015)
118 See Chapter 5 for an explanation of how this is estimated.
119 We estimate these PISA points based on the OECD research that shows 34.73 PISA points is equal to one year of schooling. According to this ratio, an increase of 3.83 months in learning on average across the population would increase PISA test points by 11.1. OECD (2014c)
120 For example, the Smith Family’s Learning for Life program is a large non-government initiative working with disadvantaged students. The Smith Family (2016)
welfare groups can be vital in addressing serious family problems
that affect learning at school. 121
Education leaders need to work closely with the many other
bodies doing important work to improve educational outcomes for
disadvantaged students.
Finally, given the size of the gaps, system leaders should
undertake further analysis on school effectiveness to isolate what
is working and why.
Our report examines how certain groups of students perform relative to one another. While it highlights that some levels of student progress are very low, our findings do not isolate the specific impact of the school or teachers – they simply capture the multitude of factors associated with the progress of specific groups of students. Understanding the impact of the school on student learning is key to understanding what more can and needs to be done.
To examine how good a job schools are doing requires analysis
that isolates the school effects (for example through regression
analysis or value added modelling). Further research in this area
should be pursued as a priority, given the new findings. Our years
of progress measure could be considered for this purpose in
future.
3b. Strengthen support for bright students whose parents have
low education
The biggest relative gaps in learning progress are for bright Year
3 students from families with low educational backgrounds. Lifting
121 Discussed further in Turnaround Schools. Jensen and Sonnemann (2014)
their progress between Year 3 and Year 9 may help to lift the number of high performing students overall.
Targeted teaching would help. Beyond that, there is limited
evidence about what works to promote excellence for
educationally disadvantaged students. A major policy priority is to
identify what works to support and stretch bright students from
poor backgrounds.
3c. As a priority, the Education Council should initiate and
oversee a coordinated national review of the quality and
effectiveness of school education for young people from
disadvantaged backgrounds
In 2012, the Productivity Commission found that “deficiencies in
evaluation make it difficult to identify the most effective ways to
address educational disadvantage”. 122 The evidence base was
weakest for initiatives related to low socio-economic status.
In response to this finding, the Schools Workforce review
recommended a coordinated national review of “existing evidence
on the effectiveness of programs and policies to help ameliorate
educational disadvantage”. 123 The recommendation has not been
implemented.
The newly announced Productivity Commission inquiry appears to
focus more on the quality of data collection and use rather than
the evidence of what works. 124 It will be very valuable. However,
the published terms of reference suggest it will not tell us how well
school education is working for young people from disadvantaged
backgrounds, or what programs, policies or practices work best.
A broader review is needed.
What not to do
Any analysis that shows large and growing learning gaps among
students could trigger well-meaning but inappropriate responses.
We highlight four things policymakers should not do as a result of
this report.
First, do not use the wide spread of achievement as an argument
for early streaming, or holding students back. 125 While streaming
may have small positive effects for bright students, taking these
students out of the classroom can have detrimental effects on the
learning of others. There are better ways to improve the learning
of every student, as discussed in our policy recommendations.
Second, do not use our years of progress approach to assess
individual student learning progress. It is not designed for this
purpose. Measurement errors are large, and the consequences of
poor decisions are too big.
Third, do not use our new approach as the basis for new ways to
reward or punish teachers and principals. Judging teachers and
schools based on their impact on student learning sounds highly
attractive. But it is very easy to get performance management or
incentive schemes very wrong, and no metric captures everything.
122 Productivity Commission (2012), p. 251
123 Ibid., Recommendation 10.3
124 "[T]his inquiry will help to identify current investment in national data collection
and education evidence, opportunities to collectively invest further, and how we
can improve the effectiveness of our investment through a more streamlined,
comprehensive and collaborative national approach." Productivity Commission
(2016)
125 The OECD recommends avoiding formal early streaming to reduce inequity in
education systems. OECD (2012b), p. 89; OECD (2016), p. 185
Finally, do not change how NAPLAN scores are calculated,
beyond what is needed for the shift to NAPLAN online. Australia is
fortunate to have a national assessment tool that is consistent
over time, and comparable across years of schooling. Instead, it is
the responsibility of those who use NAPLAN data to interpret it
appropriately and use the results responsibly, including to inform
and make policy. This report is our contribution to that effort.
Glossary
ACARA
The Australian Curriculum, Assessment and Reporting Authority is an independent statutory authority responsible for
developing and managing the national curriculum and the Australian National Assessment Program, including the
central management of NAPLAN.
Actual Year Level
The grade or year level of the student taking the NAPLAN test. For the median student, equivalent year level is equal
to actual year level in test years.
Equivalent Year Level (EYL)
Corresponds to the NAPLAN scale score we expect the median student in the same actual year level to achieve.
Gain score
Gain scores are the difference in a student’s NAPLAN scale scores between two points in time.
ICSEA
The Index of Community Socio-Educational Advantage (ICSEA) is an aggregate measure at the school level of the
socio-educational background of all students at a school. It enables comparisons to be made across like-schools, that
is, schools whose students share similar socio-educational advantage.
National Minimum Standards (NMS)
The Australian NAPLAN NMS seek to identify students at risk of not making satisfactory progress without targeted
intervention. Students who do not achieve the national minimum standard at any Year level may need intervention and
support to help them achieve the literacy and numeracy skills they require to progress satisfactorily through their
schooling. For further information, see: http://www.nap.edu.au/results-and-reports/how-to-interpret/standards.html
NAPLAN
The National Assessment Program - Literacy and Numeracy (NAPLAN) was introduced as an annual test for Year 3,
5, 7 and 9 students in 2008. Testing covers four domains: Reading, Writing, Language Conventions (spelling,
grammar and punctuation) and Numeracy. It provides a standardised measure of student achievement around the
country.
OECD
The Organisation for Economic Co-operation and Development (OECD). The OECD Secretariat is responsible for the
day-to-day management of PISA.
PISA
The Programme for International Student Assessment (PISA) is a triennial international survey which aims to evaluate
education systems worldwide by testing the skills and knowledge of 15-year-old students. Students representing more
than 70 economies have participated in the assessment.
Scale score (or NSS)
A current NAPLAN measure. The scale score is an estimate of student ability at a given point in time. Scale scores
range from 0 to 1000, and are organised into 10 NAPLAN proficiency bands. All students who sit the test in either
Year 3, 5, 7 or 9 are scored along the same NAPLAN point scale within a given testing domain.
School advantage
In this report’s analysis, school ‘advantage’ / ‘disadvantage’ status is derived from the Index of Community
Socio-Educational Advantage (ICSEA). Low advantage schools are those within the bottom quartile of ICSEA scores
for Victoria. Medium advantage schools cover the middle two quartiles, and high advantage schools are those in the
top quartile.
Parental education status
In this report’s analysis, parental education status is used as the proxy for student socioeconomic status. Parental
education status is known to be correlated with a range of other socio-economic factors that influence student
achievement.
Typical student
The ‘typical’ student is a simple reference point for the median student’s estimated learning progress through school,
as observed in NAPLAN tests.
VCAA
The Victorian Curriculum and Assessment Authority is an independent statutory body responsible to the Victorian
Minister for Education, serving both government and non-government schools. The VCAA provides curriculum,
assessment and reporting. It is the NAPLAN Test Administration Authority in Victoria.
Years of (learning) progress
The difference in years and months between equivalent year levels between two points in time for a given student or
for the median student within a group of students. This estimates how far a student (or group of students) is in front of
or behind their peers in years and months of learning.
Appendices
Appendix 1: Metropolitan students consistently gain more NAPLAN scale points than remote students
Figure 18: Metropolitan students gain more points between
NAPLAN tests than remote students with the same starting score
Median NAPLAN gain score over two years by starting score and
location of school, numeracy, Australian students, 2012–14
[Figure: median gain score (0 to 160 NAPLAN points) plotted against starting score (250 to 650) for metropolitan and
remote schools.]
Notes: Median gain score estimated by absolute deviation regression of gain scores on
starting score. Source: Grattan analysis of ACARA (2014b).
Appendix 2: NAPLAN measures of student achievement and progress
A large number of publicly reported NAPLAN measures focus on
student achievement. A common measure is the proportion of
students meeting national minimum standards, and how this
changes from one year to the next. Student achievement at the
top end is also a focus in key reporting metrics, including the
proportion of students in high-level NAPLAN proficiency bands.
In addition, a number of NAPLAN measures help to track student
progress:
• Gain scores. Gain scores compare the difference in NAPLAN
points between two points in time.
• Student cohort gain. Used in national NAPLAN reports, this
measure looks at the gain score for a population of students. It
is reported for each state and territory, as well as by student
characteristics (for example Indigenous status, gender and
remoteness). The gain is estimated over two years, for Years 3-5,
Years 5-7 or Years 7-9. 126
• Student gain – average for schools. Reported on the
My School website, this measure shows the difference in
NAPLAN points between two points in time while controlling
for student starting score. It is a proxy for comparing the
growth of students with similar ability. 127 Results can be
compared to schools with similar students, as well as to the
average achievement of students with the same starting
scores.
• Relative growth measures. These measures compare student
growth (as measured by gain scores) to what was typical for a
student with the same initial level of achievement. For
example, the Victorian Curriculum and Assessment Authority
reports student growth as ‘high’, ‘medium’ or ‘low’, identifying
whether a student was in the top 25 per cent, middle 50 per
cent or bottom 25 per cent of gain scores for their initial score
(similar to a value-added model). 128 Accounting for the
difference in expected gain from different starting scores helps
to provide an indication of how well students are doing.
• Value-added models. The OECD has defined value-added as
the contribution of a school to students’ progress. 129 It
compares the progress each student makes relative to all
other students with the same initial level of achievement, while
controlling for socio-economic factors.
126 In the annual NAPLAN report, this measure is reported with a focus on
differences among the two-year gains that are statistically significant (i.e.
unlikely to have arisen by chance).
127 Matched students in the selected school are compared with students in
statistically similar schools and across Australia with the same starting scores.
For further information: ACARA (2013a).
128 NSW uses a similar approach in ‘SMART’, which defines expected growth
as being above the 25th percentile for the student’s initial score.
129 OECD (2008)
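A relative growth measure of the kind described in this appendix can be sketched in code. The sketch below is illustrative only, not VCAA’s implementation: it assumes students can be grouped by exact starting score, and labels each gain score by quartile within its group (the function name and grouping rule are our own).

```python
from statistics import quantiles

def relative_growth(records):
    """Label each (starting_score, gain) record 'high', 'medium' or 'low'.

    Illustrative sketch of a relative-growth measure: within each group
    of students sharing a starting score, gains in the top quartile are
    'high', the bottom quartile 'low', the middle 50 per cent 'medium'.
    """
    # Group gain scores by starting score.
    groups = {}
    for start, gain in records:
        groups.setdefault(start, []).append(gain)

    labels = []
    for start, gain in records:
        peers = groups[start]
        if len(peers) < 4:
            labels.append('medium')  # too few peers to estimate quartiles
            continue
        q1, _, q3 = quantiles(peers, n=4)  # 25th, 50th, 75th percentiles
        if gain > q3:
            labels.append('high')
        elif gain < q1:
            labels.append('low')
        else:
            labels.append('medium')
    return labels
```

In practice starting scores would be binned rather than matched exactly, and measurement error in both tests would need to be considered.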
Appendix 3: Conversion of NAPLAN scale scores to equivalent year levels
Table 3: Conversion of NAPLAN scale points to equivalent year
levels, Australian students, 2014

Equivalent Year Level | NSS Reading | NSS Numeracy
1                     | 308         | 290
2                     | 365         | 346
3                     | 421         | 402
4                     | 466         | 451
5                     | 500         | 489
6                     | 525         | 517
7                     | 547         | 540
8                     | 566         | 563
9                     | 583         | 585
10                    | 600         | 605
11                    | 616         | 625
12                    | 632         | 646
Source: Grattan analysis of ACARA (2014b).
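Table 3 can be turned into a score-to-year-level lookup. The sketch below is our illustration, not ACARA’s method: it linearly interpolates between the table’s reading scores to give a fractional equivalent year level, whereas the report’s underlying learning curve is fitted non-linearly.

```python
# Table 3 rows for reading: (equivalent year level, median NAPLAN scale score).
READING = [(1, 308), (2, 365), (3, 421), (4, 466), (5, 500), (6, 525),
           (7, 547), (8, 566), (9, 583), (10, 600), (11, 616), (12, 632)]

def equivalent_year_level(score, table=READING):
    """Convert a NAPLAN scale score to an equivalent year level by
    linear interpolation between table rows (a sketch only; scores
    outside the table's range are clamped to its endpoints)."""
    if score <= table[0][1]:
        return float(table[0][0])
    if score >= table[-1][1]:
        return float(table[-1][0])
    for (eyl_lo, nss_lo), (eyl_hi, nss_hi) in zip(table, table[1:]):
        if nss_lo <= score <= nss_hi:
            frac = (score - nss_lo) / (nss_hi - nss_lo)
            return eyl_lo + frac * (eyl_hi - eyl_lo)
```

For example, a reading score of 512.5 sits halfway between the Year 5 and Year 6 medians, so the sketch returns an equivalent year level of 5.5.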
Appendix 4: Cut points for NAPLAN scale scores to equivalent year level
Table 4: Cut points for NAPLAN scale scores and equivalent year
level, reading, median, Australian students, 2014

Band    | NSS bottom | NSS top | EYL bottom | EYL top | Span
Band 1  | 0          | 270     | <1Y        | <1Y     | unknown
Band 2  | 270        | 322     | <1Y        | 1y 3m   | unknown
Band 3  | 322        | 374     | 1y 3m      | 2y 2m   | 11m
Band 4  | 374        | 426     | 2y 2m      | 3y 1m   | 11m
Band 5  | 426        | 478     | 3y 1m      | 4y 4m   | 1y 3m
Band 6  | 478        | 530     | 4y 4m      | 6y 2m   | 1y 10m
Band 7  | 530        | 582     | 6y 2m      | 8y 11m  | 2y 9m
Band 8  | 582        | 634     | 8y 11m     | 12y 2m  | 3y 3m
Band 9  | 634        | 686     | 12y 2m     | >13Y    | unknown
Band 10 | 686        | 1000    | >13Y       | >13Y    | unknown

Source: Grattan analysis of ACARA (2014b).

Table 5: Cut points for NAPLAN scale scores and equivalent year
level, numeracy, median, Australian students, 2014

Band    | NSS bottom | NSS top | EYL bottom | EYL top | Span
Band 1  | 0          | 270     | <1Y        | <1Y     | unknown
Band 2  | 270        | 322     | <1Y        | 1y 7m   | unknown
Band 3  | 322        | 374     | 1y 7m      | 2y 6m   | 11m
Band 4  | 374        | 426     | 2y 6m      | 3y 6m   | 1y 0m
Band 5  | 426        | 478     | 3y 6m      | 4y 8m   | 1y 2m
Band 6  | 478        | 530     | 4y 8m      | 6y 7m   | 1y 11m
Band 7  | 530        | 582     | 6y 7m      | 8y 10m  | 2y 3m
Band 8  | 582        | 634     | 8y 10m     | 11y 5m  | 2y 7m
Band 9  | 634        | 686     | 11y 5m     | >13Y    | unknown
Band 10 | 686        | 1000    | >13Y       | >13Y    | unknown

Source: Grattan analysis of ACARA (2014b).
Appendix 5: Observed learning curve and percentiles
Figure 19: Learning curves and percentiles, numeracy
Estimated numeracy NAPLAN learning curve by percentile and NAPLAN
bands, median, Australia, 2014
Figure 20: Learning curves and percentiles, reading
Estimated reading NAPLAN learning curve by percentile and NAPLAN
bands, median, Australia, 2014
[Figures: NAPLAN scale score (200 to 700) plotted against actual year level (Year 3 to Year 9), with learning curves for
the 10th, 20th, 35th, 50th, 65th, 80th and 90th percentiles.]
Source: Grattan analysis of ACARA (2014b).
Appendix 6: Spread in student achievement in a typical school
Figure 21: In a typical school, equivalent year levels show that the
spread of achievement is increasing
Achievement spread in a typical school by actual year level, numeracy,
Australian students, 2014
[Figure: two panels showing the 10th, 20th, 50th, 80th and 90th percentiles of achievement in a typical school from
Year 3 to Year 9, expressed as NAPLAN scale scores (300 to 700) and as equivalent year levels (0 to 12).]
Notes: Results show the median spread of student achievement in a school. Data includes
all Australian students who sat NAPLAN numeracy tests in 2014. The top ten per cent in
Year 9 are above equivalent year level 12 and are not shown on this chart. Results at the
10th and 90th percentiles are subject to higher measurement error.
Source: Grattan analysis of ACARA (2014b).
Appendix 7: Two approaches, two different pictures of student progress
The two approaches paint different pictures of which students
make the most relative progress. In Figure 22, NAPLAN gain
scores show that high achieving students with highly educated
parents gain fewer points between Year 3 and Year 9 (173 points)
than low achieving students whose parents have low education
(205 points). This suggests that high achievers from highly
educated families make less progress than low achievers from
disadvantaged backgrounds.
By contrast, our approach suggests the reverse. High achieving
students whose parents have high education make 7 years and
7 months of progress, compared to only 5 years and 5 months for
low achieving students whose parents have low education. This
finding aligns with existing research on divergence: low
performance early on can snowball, creating wider gaps over time.
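The years-and-months figures quoted above come from differencing equivalent year levels. A minimal sketch of that arithmetic (our illustration, assuming 12 months per equivalent year level; the report’s actual estimates come from fitted learning curves):

```python
def format_progress(eyl_start, eyl_end):
    """Express the difference between two equivalent year levels as
    '+Xy Ym' (years and months of learning progress). Illustrative
    arithmetic only, assuming 12 months per equivalent year level."""
    months = round((eyl_end - eyl_start) * 12)
    years, rem = divmod(months, 12)
    return f"+{years}y {rem}m"
```

For example, a student who moves from equivalent year level 3.0 to 10.58 has made roughly +7y 7m of progress on this convention.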
Figure 22: Gain scores could be interpreted to show low achievers
whose parents have low education making more progress
Gain scores between Year 3 – 9, numeracy, median, Victoria, 2009–15
Figure 23: Our new measure shows the opposite: high achievers
whose parents have high education make more progress
Years of progress Year 3 – 9, numeracy, median, Victoria, 2009–15
[Figures 22 and 23: median results by score in Year 3 (low, medium, high) and parental education (degree or higher,
diploma, below diploma). Gain scores range from +139 to +228 points; years of progress range from +5y 5m to
+7y 7m.]
Source: Grattan analysis of VCAA (2015)
Source: Grattan analysis of VCAA (2015) and ACARA (2014b)
References
ABS (2014a) 'Educational outcomes, experimental estimates,
Tasmania 2006-2013', accessed 2 March 2016,
from http://www.abs.gov.au/ausstats/abs@.nsf/mf/4261.6?OpenDocument
ABS (2014b) 'Socioeconomic factors and student achievement in
Queensland', accessed 18 March 2016,
from http://www.abs.gov.au/AUSSTATS/abs@.nsf/Lookup/4261.3Main+Features32011
ACARA (2013a) Guide to understanding student gain, accessed 2
March 2016,
from http://www.acara.edu.au/verve/_resources/Guide_to_understanding_Student_Gain.pdf
ACARA (2013b) 'NAPLAN', accessed 2 March 2016,
from http://www.nap.edu.au/naplan/naplan.html
ACARA (2014a) 'About ICSEA', accessed 2 March 2016,
from http://www.acara.edu.au/verve/_resources/About_icsea_2014.pdf
ACARA (2014b) Deidentified student-level NAPLAN data, 2014
results linked to 2012, Australian Curriculum Assessment
and Reporting Authority
ACARA (2015) 'National Assessment Program- Literacy and
Numeracy: National Report for 2015', accessed 2 March
2016,
from http://www.nap.edu.au/verve/_resources/2015_NAPLAN_national_report.pdf
ACARA (2016a) 'Interpreting NAPLAN results',
from http://www.acara.edu.au/verve/_resources/Interpreting_NAPLAN_results_file.pdf
ACARA (2016b) 'Standards', accessed 18 March 2016,
from http://www.nap.edu.au/results-and-reports/how-to-interpret/standards.html
AEDC (2016) Australian Early Development Census National
Report 2015: A Snapshot of Early Childhood Development
in Australia, from
https://www.aedc.gov.au/resources/detail/2015-aedc-national-report
Allington, R. L. (2008) 'Why Struggling Readers Continue to
Struggle', in What Really Matters in Response to
Intervention: Research-based Designs, Pearson
Angoff, W. H. (1984) Scales, Norms, and Equivalent Scores,
Educational Testing Service from
https://www.ets.org/Media/Research/pdf/Angoff.Scales.Norms.Equiv.Scores.pdf
Australian Institute of Health and Welfare (2015) Literature review
of the impact of early childhood
education and care on learning and development: working paper,
from http://aihw.gov.au/workarea/downloadasset.aspx?id=60129552948
Claessens, A. and Engel, M. (2013) 'How important is where you
start? Early mathematics knowledge and later school
success', Teachers College Record, 115(6), p 1-29
Ball, C., Crichton, S., Templeton, R., Tumen, S., Ota, R. and
MacCormick, C. (2016) Characteristics of Children at
Greater Risk of Poor Outcomes as Adults, The Treasury
(NZ)
from http://www.treasury.govt.nz/publications/research-policy/ap/2016/16-01
Cunningham, A. E. and Stanovich, K. E. (1997) 'Early reading
acquisition and its relation to reading experience and
ability 10 years later', Developmental psychology, 33(6), p
934
Barnett, W. S. (2011) 'Effectiveness of Early Educational
Intervention', Science, 333(6045), p 975-978
Department of Education and Training (2016) 'Universal Access
to Early Childhood Education', from
https://www.education.gov.au/universal-access-early-childhood-education
Cassells, R., Duncan, A., Abello, A., D'Souza, G. and Nepal, B.
(2012) Smart Australians: education and innovation in
Australia, AMP NATSEM
Dougherty, C. and Fleming, S. (2012) Getting Students on Track
to College and Career Readiness: How Many Catch up
from Far behind?, ACT Research Report Series
Ceci, S. J. and Williams, W. M. (1997) 'Schooling, Intelligence,
and Income', American Psychologist, 52(10), p 1051-1058
Education Endowment Foundation (2016) 'Teaching & Learning
Toolkit', accessed 18 March 2016, from
https://educationendowmentfoundation.org.uk/evidence/te
aching-learning-toolkit/
CESE (2014) Value added models for NSW government schools,
Centre for Education Statistics and Evaluation (CESE)
from http://www.cese.nsw.gov.au/images/stories/PDF/Value_Added_paper_July_2014_v2.pdf
CESE (2015) Reading Recovery: A Sector-Wide Analysis, Centre
for Education Statistics and Evaluation
from http://www.cese.nsw.gov.au/images/stories/PDF/Reading_Recovery_Evaluation_FINAL_25112015.pdf
French, M. T., Homer, J. F. and Robins, P. K. (2010) What You
Do in High School Matters: High School GPA, Educational
Attainment, and Labour Market Earnings as a Young
Adult, Working Papers, University of Miami, Department
of Economics, from http://www.bus.miami.edu/_assets/files/faculty-and-research/academic-departments/eco/eco-working-papers/2010/wp-2010-26-what-do-you-do-in-high-school-matters.pdf
Goss, P. (2016) 'Schools crisis comes with massive waste of tax
dollars', accessed 2 March 2016,
from http://grattan.edu.au/news/schools-crisis-comes-with-massive-waste-of-tax-dollars/
Jensen, B. (2010) Measuring What Matters: Student Progress,
Grattan Institute, from http://grattan.edu.au/wp-content/uploads/2014/04/016_education_performance_measures_report.pdf
Goss, P. and Chisholm, C. (2016) Widening gaps: what NAPLAN
tells us about student progress. Technical Report, Grattan
Institute
Jensen, B. and Sonnemann, J. (2014) Turning around schools:
it can be done, Grattan Institute,
from http://grattan.edu.au/wp-content/uploads/2014/04/805-turning-around-schools.pdf
Goss, P., Hunter, J., Romanes, D. and Parsonage, H. (2015)
Targeted teaching: How better use of data can improve
student learning, Grattan Institute
Hanson, R. A. and Farrell, D. (1995) 'The long-term effects on
high school seniors of learning to read in kindergarten',
Reading Research Quarterly, p 908-933
Hattie, J. (2015) 'Education State Possible', accessed 2 March
2016, from http://www.aitsl.edu.au/docs/default-source/multimedia/education-state-possible-john-hattie-article.pdf
Houng, B. and Justman, M. (2014) NAPLAN scores as predictors
of access to higher education in Victoria- Melbourne
Institute Working Paper No. 22/14 accessed 2 March
2016, from
https://www.melbourneinstitute.com/downloads/working_paper_series/wp2014n22.pdf
Jensen, B. (2010) Investing in our teachers, investing in our
economy, Grattan Institute
Lamb, P. S., Jackson, J., Walstab, A. and Huo, D. S. (2015)
Educational opportunity in Australia 2015: Who succeeds
and who misses out, Centre for International Research on
Education Systems, Victoria University, for the Mitchell
Institute, from http://www.mitchellinstitute.org.au/wp-content/uploads/2015/11/Educational-opportunity-in-Australia-2015-Who-succeeds-and-who-misses-out-19Nov15.pdf
Leigh, A. (2010) 'Returns to education in Australia', Economic
Papers, 27(3), p 233-249
Leigh, A. and Ryan, C. (2008) 'Estimating returns to education
using different natural experiment techniques', Economics
of Education Review, 27, p 149-160
Main, S. (2013) NAPLAN doesn't stand up to international tests,
The Conversation from
https://theconversation.com/naplan-doesnt-stand-up-to-international-tests-14765
Masters, G. N. (2005) Against the grade: In search of continuity in
schooling and learning, accessed 2 March 2016,
from http://research.acer.edu.au/monitoring_learning/3
Masters, G. N. (2013) Reforming Educational Assessment:
Imperatives, principles and challenges, 3, Australian
Council for Education Research
from http://research.acer.edu.au/aer/12/
Norton, A. (2012) Graduate Winners: Assessing the public and
private benefits of higher education, Grattan Institute
from http://grattan.edu.au/wp-content/uploads/2014/04/162_graduate_winners_report.pdf
O'Donnell, K. and Zill, N. (2006) 'Predicting Early and Later
Reading Achievement of Children from Low-Income
Families from Skills Measured at Kindergarten Entrance',
Annual Meetings of the Population Association of America,
Los Angeles, United States,
from http://paa2006.princeton.edu/papers/61304
OECD (2008) Measuring improvements in learning outcomes: Best practices to assess the value-added of schools,
OECD Publishing
OECD (2012a) Does money buy strong performance in PISA?, 13,
from http://www.oecd-ilibrary.org/education/does-money-buy-strong-performance-in-pisa_5k9fhmfzc4xx-en
OECD (2012b) Equity and Quality in Education - Supporting
Disadvantaged Students and Schools, OECD Publishing
OECD (2013a) PISA 2012 Results: Excellence Through Equity:
Giving Every Student the Chance to Succeed
(Volume II), PISA, from http://dx.doi.org/10.1787/9789264201132-en
OECD (2013b) PISA 2012 Results: What Makes Schools
Successful? Resources, policies and practices (Volume
IV), PISA, from http://www.oecd.org/pisa/keyfindings/pisa-2012-results-volume-IV.pdf
OECD (2014a) PISA 2012 Results: What Students Know and Can
Do (Volume I, Revised edition, February 2014), OECD
Publishing
OECD (2014b) PISA 2012 Results: What Students Know and Can
Do: Annex B1.1 Results (tables): A profile of student
performance in mathematics,
from http://dx.doi.org/10.1787/888932935667
OECD (2014c) PISA 2012 Results: What Students Know and Can
Do: Annex A1: Indices from the student, school and parent
context questionnaires,
from http://dx.doi.org/10.1787/888932937073
OECD (2015) Universal Basic Skills: What Countries Stand to
Gain, OECD
from http://dx.doi.org/10.1787/9789264234833-en
OECD (2016) Low-Performing Students: Why They Fall Behind
and How to Help Them Succeed, OECD Publishing
Pearson (2016) 'Interpretation Problems of Age and Grade
Equivalents', accessed 18 March 2016,
from http://www.pearsonclinical.com/language/RelatedInfo/interpretation-problems-of-age-and-grade-equivalents.html
Peterson, P. E., Barrows, S. and Gift, T. (2016) After Common
Core, States Set Rigorous Standards, Education Next
from http://educationnext.org/after-common-core-states-set-rigorous-standards/
Sullivan, J. R., Winter, S. M., Sass, D. A. and Svenkerud, N.
(2014) 'Assessing Growth in Young Children: A
Comparison of Raw, Age-Equivalent, and Standard
Scores Using the Peabody Picture Vocabulary Test',
Journal of Research in Childhood Education, 28(2), p 277-291
The Smith Family (2016) 'Our Work: Education has the power to
change lives and break the cycle of disadvantage',
accessed 18/3/2016, from
https://www.thesmithfamily.com.au/what-we-do/our-work
Productivity Commission (2012) Schools workforce, accessed 2
March 2016,
from http://www.pc.gov.au/inquiries/completed/education-workforce-schools/report/schools-workforce.pdf
Thomson, S., De Bortoli, L. and Buckley, S. (2013) PISA 2012:
How Australia measures up, accessed 2 March 2016, from
https://www.acer.edu.au/documents/PISA-2012-Report.pdf
Productivity Commission (2016) 'Education Evidence Base',
accessed 18 March 2016,
from http://www.pc.gov.au/inquiries/current/education-evidence
VCAA (2012) A guide to the Relative Growth Report, Victorian
Curriculum and Assessment Authority
from http://www.vcaa.vic.edu.au/Documents/naplan/Guidet
oRelativeGrowthReport.pdf
Romanes, D. and Hunter, J. (2015) 'Grade repetition: there are
better ways to move kids forward than by holding them
back', accessed 1 March 2016,
from http://grattan.edu.au/news/grade-repetition-there-are-better-ways-to-move-kids-forward-than-by-holding-them-back/
VCAA (2015) Deidentified linked student-level NAPLAN data,
2009 Year 3 cohort, Victorian Curriculum and Assessment
Authority
Stanovich, K. E. (1986) 'Matthew effects in reading: Some
consequences of individual differences in the acquisition of
literacy', Reading research quarterly, p 360-407