Despite the considerable money and school resources invested each year in administering the NAPLAN tests, almost no research has investigated patterns of student achievement in the NAPLAN writing test data over time. I wanted to know what the NAPLAN writing test results tell us about male and female student performance over time.
My research found that Year 9 males write at a similar standard to Year 7 females. There has been a rapid decline in student writing scores for both genders; however, the gap between male and female writing scores widens with every tested year level, reaching the equivalent of two years of learning by Year 9.
Most significantly, I also found that the design of the NAPLAN writing test and the way we implement it may make it difficult to trust the test’s outcomes over time.
Writing is a skill that is fundamental to the economy, to people’s wellbeing, and to their life trajectory. It underpins our activity and experiences in education, science, governance, law, the economy, religion, and cultural life. According to the US National Assessment Governing Board, writing is essential for the day-to-day operations of most employees across all global industries and services. A person’s success in education, the workplace, and broader society is strongly influenced by their capacity to write.
Lack of research into NAPLAN writing data
To ensure that Australian students are developing adequate skills in writing, reading, language conventions, and mathematics for adult life, the Ministerial Council on Education, Employment, Training and Youth Affairs introduced the National Assessment Program – Literacy and Numeracy (NAPLAN) tests in 2008. Since then, over one million students in Years 3, 5, 7, and 9 have completed the tests each year (the tests were cancelled in 2020 due to COVID-19). However, almost no research has investigated patterns of student achievement in the NAPLAN writing test data over time.
My research
So, what does the last decade of NAPLAN testing tell us about student writing outcomes? My research drew on the NAPLAN results provided by the Australian Curriculum, Assessment and Reporting Authority (ACARA) in annual NAPLAN reports for 2011-2018. According to ACARA (2016a), “in 2016, the narrative prompt was placed onto the existing persuasive writing scale, creating a NAPLAN writing scale comparable for both genres… [meaning] that the results can be compared and trends analysed in NAPLAN writing data from 2011 onwards but not for results before then” (para. 3). For this reason, my research compared male and female student performance on the writing tests between 2011 and 2018.
I also drew on the Grattan Institute’s Equivalent Year Levels approach, which calculates student progress using a different method to ACARA’s and expresses a cohort’s results as an equivalent year level rather than the seemingly arbitrary and difficult-to-interpret numbers in the NAPLAN reports. For example, a NAPLAN achievement score of 536 equates to an equivalent year level of 7.5, or halfway through Year 7. A cohort’s equivalent year level can be subtracted from their equivalent year level on the previous NAPLAN test to work out their progress in the two years between tests. If a cohort scored 536 in Year 7 and 548 when tested again in Year 9, they would be performing at the equivalent of a Year 8 standard, meaning they made approximately six months of progress in the two years between tests.
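To make the arithmetic concrete, here is a minimal sketch in Python. The real mapping from NAPLAN scale scores to equivalent year levels is a non-linear curve published by the Grattan Institute and is not reproduced here; the `score_to_eyl` function below simply interpolates between the two anchor values mentioned above (536 → 7.5 and 548 → 8.0), so everything apart from those two points is illustrative only.

```python
# Illustrative sketch only: the genuine score-to-equivalent-year-level mapping
# is a non-linear curve from the Grattan Institute's Equivalent Year Levels
# work. The two anchor points below (536 -> 7.5, 548 -> 8.0) are the only
# values given in this article; the interpolation is hypothetical.

ANCHORS = [(536, 7.5), (548, 8.0)]  # (NAPLAN scale score, equivalent year level)

def score_to_eyl(score: float) -> float:
    """Convert a NAPLAN scale score to an equivalent year level by
    linear interpolation between the illustrative anchor points."""
    (s0, y0), (s1, y1) = ANCHORS
    return y0 + (score - s0) * (y1 - y0) / (s1 - s0)

def progress_in_months(earlier_score: float, later_score: float) -> float:
    """Progress between two NAPLAN tests, expressed in months of learning
    (one equivalent year level = 12 months)."""
    return (score_to_eyl(later_score) - score_to_eyl(earlier_score)) * 12

# Worked example from the article: a cohort scoring 536 in Year 7 and 548 in
# Year 9 sits at an equivalent Year 8 standard, i.e. roughly six months of
# progress across the two years between tests.
print(progress_in_months(536, 548))  # -> 6.0
```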
I used the NAPLAN achievement scores and the equivalent year level approach to provide the first in-depth picture of how male and female students have performed on the writing test between 2011 and 2018.
Figure 1: Year 3 writing achievement by gender, 2011-2018
Figure 2: Year 5 writing achievement by gender, 2011-2018
Figure 3: Year 7 writing achievement by gender, 2011-2018
Figure 4: Year 9 writing achievement by gender, 2011-2018
My findings reveal a clear gender gap in writing outcomes at all four tested year levels. Year 3 male students’ scores were, on average, the equivalent of 8.16 months of learning behind female scores. The gender gap widened across the year levels: 11.8 months of learning in Year 5, 20.1 months in Year 7, and 24.1 months in Year 9. Alongside this considerable gender gap, writing achievement declined rapidly for both genders over the eight years examined.
Test modifications make a difference
While these results paint a dismal picture of student progress in writing across a decade of testing, my research highlighted four modifications that ACARA has made to the writing test over the years that make it difficult to tell whether the tests have been equally challenging for students.
Modification 1: Text type switching
Between 2008 and 2018, four NAPLAN writing tests required students to write narrative texts (stories), while seven required them to write persuasive texts (arguments). This is problematic because educational linguists have shown for decades that writing narratives involves very different linguistic and structural choices from persuasive writing. Despite this, ACARA has treated the results of all NAPLAN writing tests as directly comparable, regardless of whether the focus in a given year was narrative or persuasive writing.
Modification 2: Age-appropriate writing prompts
Between 2008 and 2014, students in all year levels responded to one writing prompt each year. Because certain prompts were deemed too challenging for primary students or too simplistic for secondary students, from 2015, ACARA introduced separate, age-appropriate prompts for primary and secondary school students. The move to age-appropriate prompts altered the test conditions, yet scores over time are still treated as directly comparable.
Modification 3: Knowledge of the target genre focus prior to test
From 2008 to 2013, teachers and students were made aware of the genre focus (either narrative or persuasive text) before the test date. Since 2014, ACARA has not revealed the genre focus until the time of the test. The decision to reveal the focus genre at the time of the test aimed to prevent teachers from over-preparing students for one genre of writing; however, knowing the genre prior to the test gave those completing it before 2014 an advantage over those completing it since. Despite this change to the test conditions, ACARA treats all scores as directly comparable.
Modification 4: Shift to online testing
From 2008 to 2017, students completed paper-based writing tests. In 2018, 20% of students completed the test online. The 2018 results were higher for those who completed the test online, yet online test results are still compared directly with paper-based results.
Taken together, the modifications made to the NAPLAN writing test raise questions about whether each test has been equally challenging, and therefore whether the decline reported through the NAPLAN annual reports is real.
The future of NAPLAN writing tests
As the future of the NAPLAN writing test is debated, my research highlights two important points. First, any new version of the NAPLAN writing test should be designed and implemented carefully, learning from the current test’s history to avoid the need for modifications that call into question whether scores can be reliably compared over time.
Second, every NAPLAN writing test has found the same concerning gender gap that widens as students progress through school. While comparing NAPLAN writing scores year after year is clearly problematic, any single test on its own can be considered a valid measure of writing achievement for that point in time, so we can say with confidence that the gender gap does exist and widens across the school years.
To understand what is behind the writing gender gap, further research is needed into the personal and environmental factors that influence the writing development of male and female students. If we can understand what is happening, it is more likely we will be able to improve writing outcomes for all students.
Those interested can read more in Rapid decline and gender disparities in the NAPLAN writing data.
Damon Thomas is a Senior Lecturer in English Education at the University of Tasmania. His PhD investigated the persuasive writing choices made by primary and secondary school students who scored highly on the NAPLAN writing test and critiqued the test’s design. His research interests include reading and writing development and pedagogy, assessment, social semiotics and theories of persuasive communication.