NAPLAN

Q: Which major party will fully fund public schools? A: None. Here’s what’s happening

You would be forgiven for thinking that policy related to schooling is not a major issue in Australia. In the lead-up to the federal election, scant attention has been paid to it during the three leaders’ debates. One reason could be that the education policies of the major parties have largely converged around key issues.

Both Labor and the Coalition are promising to increase funding to schools, but neither is prepared to fully fund government schools to the Schooling Resource Standard (SRS). Under a Coalition government, public schools will get up to 95 per cent of the SRS by 2027; under a Labor government, they will get 97 per cent by 2027. Either way, we are talking two elections away, and the only question is the degree to which public schools will remain underfunded.

Both the Coalition and Labor plan to fully fund all private schools to the Schooling Resource Standard by 2023. Some private schools are already fully funded, and many are already overfunded.

Yes, Labor is promising to put equality and redistribution back on the agenda in areas such as tax reform and childcare policy, but its Fair funding for Australian Schools policy fails to close the gap between what government schools get and what they need. And yes, Labor is promising to restore the $14 billion cut from public schools by the Coalition’s Gonski 2.0 plan, and will inject $3.3 billion of that during its 2019-22 term, if elected.

The point I want to make is that neither major party is prepared to fully fund government schools to the level of need defined by the Schooling Resource Standard.

I find this deeply disappointing.

There are certainly differences between Coalition and Labor education policies, the main one being that Labor will outspend the Coalition across every education sector, from preschools to universities.

However, as I see it, neither major party has put forward an education policy platform. Instead, they have presented a clutch of ideas that fail to address key issues of concern in education, such as dismantling the contrived system of school comparison generated by NAPLAN and the MySchool website, and tackling Australia’s massive and growing equity issues.

Both major parties believe that the best mechanism for delivering quality and accountability is by setting and rewarding performance outcomes. This approach shifts responsibility for delivering improvements in the system down the line.

Then there is standardised testing. There is a place for standardised tests in education. However, when these tests are misused they have perverse consequences, including narrowing the curriculum, intensifying residualisation, increasing the amount of time spent on test preparation, and encouraging ‘gaming’ behaviour.

Labor has promised to take a serious look at how to improve the insights from tests like NAPLAN, but this is not sufficient to redress the damage they are doing to the quality of schooling and the schooling experiences of young people.

These tests can be used to identify weaknesses in student achievement on a very narrow range of curriculum outcomes but there are cheaper, more effective and less problematic ways of finding this out. And the tests are specifically designed to produce a range of results, so it is intended for some children to do badly; a fact missed entirely by the mainstream media coverage of NAPLAN results.

National testing, NAPLAN, is supported by both Labor and the Coalition. Both consistently tell us that inequality matters, but both know the children who underperform are more likely to come from communities experiencing hardship and social exclusion. These are the communities whose children attend those schools that neither major party is willing to fund fully to the Schooling Resource Standard.

Consequently, teachers in underfunded government schools are required to do the ‘heavy lifting’ of educating the young people who rely most on schooling to deliver the knowledge and social capital they need to succeed in life.

Student performance on the OECD’s PISA, along with NAPLAN results, shows the strength of the link between low achievement and socio-economic background in Australia; a link stronger than in many similar economies. This needs to be confronted with proper and fair funding, plus redistributive funding on top of that.

A misuse of standardised tests by politicians, inflamed by mainstream media, has resulted in teachers in our public schools being blamed for the persistent low achievement of some groups of children and, by extension, initial teacher education providers being blamed for producing ‘poor quality’ teachers.

There is no educational justification for introducing more tests, such as the Coalition’s proposed Year 1 phonics test. Instead, federal politicians need to give up some of the power that standardised tests have afforded them to intervene in education. They need to step away from constantly using NAPLAN results to steer education for their own political purposes. Instead they need to step up to providing fair funding for all of Australia’s schools.

I believe when the focus is placed strongly on outputs, governments are let ‘off the hook’ for poorly delivering inputs through the redistribution of resources. Improved practices at the local level can indeed help deliver system quality, but not when that system is facing chronic, eternal underfunding.

Here I must comment on Labor’s proposal to establish a $280 million Evidence Institute for Schools. Presumably, this is Labor’s response to the Productivity Commission’s recommendation to improve the quality of existing education data. Labor is to be commended for responding to this recommendation. The Coalition is yet to say whether it would fund such an initiative.

However, what Labor is proposing is not what the Productivity Commission recommended. The Commission argued that performance benchmarking and competition between schools alone are insufficient to achieve gains in education outcomes. It proposed a broad-ranging approach to improving the national education evidence base, including the evaluation of policies and building an understanding of how to turn what we know works into common practice on the ground.

Labor claims that its Evidence Institute for Schools will ensure that teachers and parents have access to ‘high quality’ ‘ground breaking’ research, and it will be ‘the right’ research to assist teachers and early educators to refine and improve their practice.

As an educational researcher, I welcome all increases in funding for research, but I feel compelled to point out that, according to the Excellence in Research for Australia report recently completed by the Australian Research Council, the vast majority of education research institutions in Australia are already producing educational research assessed to be at or above world standard.

The problem is not a lack of high quality research, or a lack of the right kind of research. Nor is it the case that teachers do not have access to research to inform their practice. Without a well-considered education platform developed in consultation with key stakeholders, this kind of policy looks like a solution in search of a problem, rather than a welcome and needed response to a genuine educational issue.

Both major parties need to do more to adequately respond to the gap in the education evidence base identified by the Productivity Commission. This includes a systematic evaluation of the effects of education policies, particularly the negative effects of standardised tests.

The people most affected by the unwillingness of the major parties to imagine a better future for Australia’s schools are our young people, the same young people who are demanding action on the climate crisis. They need an education system that will give them the best chance to fix the mess we are leaving them. Until we fully fund the schools where the majority of them are educated, we are failing them there too.

Dr Debra Hayes is Head of School and Professor, Education & Equity at the Sydney School of Education and Social Work, University of Sydney. She is also the President of the Australian Association for Research in Education. Her next book, co-authored with Craig Campbell, Jean Blackburn: Education Feminism and Social Justice (Monash University Press), will be available in August. @DrDebHayes

NAPLAN is not a system-destroying monster. Here’s why we should keep our national literacy and numeracy tests

Australia’s national literacy and numeracy testing of students in Years 3, 5, 7 and 9 is a fairly bog-standard literacy and numeracy test. It is also a decent, consistent, reliable and valid assessment process. I believe the National Assessment Program-Literacy and Numeracy (NAPLAN) is a solid and useful assessment.

Education experts in Australia have carefully designed the testing series. It has good internal consistency among the assessment items. It has been shown to produce consistent results over different time points and is predictive of student achievement outcomes.

However, there are special characteristics of NAPLAN that make it a target for criticism.

Special characteristics of NAPLAN

What is particularly special about NAPLAN is that most students around the country sit it at the same time, and the results (for schools) are published on the MySchool website. Also, unlike usual in-house Maths and English tests, it was developed largely by the Australian Government (in consultation with education experts), rather than being something developed and implemented by schools.

These special characteristics have meant that NAPLAN has been under constant attack since its inception about 10 years ago. The main criticisms are quite concerning.

Main criticisms of NAPLAN

  • NAPLAN causes a major distortion of the curriculum in schools in a bad way.
  • NAPLAN causes serious distress for students and teachers.
  • NAPLAN results posted on MySchool are an inappropriate and inaccurate way to judge schools.
  • NAPLAN results are not used to help children learn and grow.
  • NAPLAN results for individual children are associated with a degree of measurement error that makes them difficult to interpret.

The above criticisms have led to calls to scrap the testing altogether. This is a rather drastic suggestion. However, if all the criticisms above were true, it would be hard to deny that scrapping NAPLAN should be a valid consideration.

Missing Evidence

A problem here is that, at present, there is no solid evidence to properly back up any of these criticisms. The Centre for Independent Studies published an unashamedly pro-NAPLAN paper that does a fair job of summarising the gaps in the current research literature. However, as the CIS has a clear political agenda, this paper needs to be read with a big pinch of salt.

My Criticisms

Rather than completely dismissing the criticisms for lack of evidence, as the CIS paper mentioned above does, I would, based on my own research and knowledge of the literature, revise the criticisms to:

  • In some (at present indeterminate) number of schools some teachers get carried away with over-preparation for NAPLAN, which unnecessarily takes some time away from teaching other important material.
  • NAPLAN causes serious distress for a small minority of students and teachers.
  • Some people incorrectly interpret NAPLAN results posted on MySchool as a single number that summarises whole school performance. In fact school performance is a multi-faceted concept and NAPLAN is only a single piece of evidence.
  • It is currently unclear to what extent NAPLAN results get used to help children at the individual level, as a single piece of evidence within the multi-faceted approach (that is, multiple measurements of multiple things) generally taken by schools.
  • While NAPLAN results are associated with a degree of measurement error, so too are any other assessments, and it is unclear whether NAPLAN measurement error is any greater or less compared to other tests.

I realise my views are not provocative compared with the sensationalized headlines we constantly see in the news. In my (I believe soberer) view, NAPLAN is more like any other literacy and numeracy test than some education-system-destroying monster.

NAPLAN has been going for about 10 years now and yet there is no hard evidence in the research literature for the extreme claims we constantly hear from some academics, politicians, and journalists.

My views on why NAPLAN has been so demonised

From talking to educators about NAPLAN, reviewing the literature, and conducting some research myself, it is clear to me that many educators don’t like how NAPLAN results are reported by the media. So I keep asking myself: why do people misreport things about NAPLAN so dramatically? I have given it some thought and believe it might be because of a simple and very human reason: people like to communicate what they think other people want to hear.

But this led me to question whether people really do interpret the MySchool results in an inappropriate way. There is no solid research to answer this question. I would hypothesize, however, that when parents are deciding on a school to send their beloved child to, they aren’t making that extremely important decision based on a single piece of information. Nor would I expect that even your everyday Australian without kids really thinks a school’s worth should be judged solely on some (often silly) NAPLAN league table published by a news outlet.

I also think that most people who are anti-NAPLAN wouldn’t really believe that is how people judge schools either. Rather, it is more the principle of the matter that is irksome. That the government would be so arrogant as to appear to encourage people to use the data in such a way is hugely offensive to many educators. Therefore, even if deep down educators know that people aren’t silly enough to use the data in such an all-or-none fashion, they are ready to believe in such a notion, as it helps to rationalize resentment towards NAPLAN.

Additionally, the mantra of ‘transparency and accountability’ is irksome to many educators. They do so much more than teach literacy and numeracy (and even more than what is specifically assessed by NAPLAN). The attention given to NAPLAN draws focus away from all the additional important hard work that is done. The media constantly highlights isolated instances of poor NAPLAN results while mostly ignoring all the other, positive, things teachers do.

I will also point out that schools are already accountable to parents. So, in a way, government scrutiny and control sends a message to teachers that they cannot be trusted and that the government must keep an eye on them to make sure they are doing the right thing.

I can understand why many educators might be inclined to hold an anti-NAPLAN viewpoint, and why they could be very ready to believe any major criticisms of the testing.

NAPLAN has become the assessment that people love to hate. The exaggerated negative claims about it are therefore not particularly surprising, even if, technically, things might not be so bad, or even bad at all.

My experience with the people who run the tests

In the course of carrying out my research I met face-to-face with some of the people running the tests. I wanted to get some insight into their perspective. I tried my best to go into the meeting with an open mind. What I wasn’t anticipating was an impression of weariness; I found myself feeling sorry for them more than anything else. They did not enjoy being perceived as creepy government officials looking over the fence at naughty schools.

Rather, they communicated a lot of respect for schools and the people that work in them and had a genuine and passionate interest in the state of education in our country. They saw their work as collecting some data that would be helpful to teachers, parents and governments.

They pointed out the MySchool website does not produce league tables. A quote from the MySchool website is: “Simple ‘league tables’ that rank and compare schools with very different student populations can be misleading and are not published on the My School website”.

Personally, I think it is a shame that the NAPLAN testing series has not been able to meet its full potential as a useful tool for teachers, parents, schools, researchers and governments (for tracking students, reporting on progress, providing extra support, researching assessment, literacy and numeracy issues, and allocating resources).

Value of NAPLAN to educational researchers

Where NAPLAN has huge potential, generally not well recognized, is its role in facilitating educational research conducted in schools. Schools are very diverse, with diverse practices, whereas NAPLAN is a common experience. It is a thread of commonality that can be utilized to conduct and facilitate research across different schools and across different time points. NAPLAN has huge potential to enable new research and understanding into all manner of important factors surrounding assessment, literacy and numeracy. We have an opportunity to better map out the dispositional and situational variables associated with performance, test anxiety, and engagement with school. The number of research studies making use of NAPLAN is increasing and looks set to keep increasing in the coming years (as long as NAPLAN is still around). There is real potential for some very important research, including some impressive longitudinal work, to come out of Australian universities.

Another positive aspect that is not widely recognized, but was mentioned by parents in research I have conducted, is that NAPLAN tests might be useful for creating a sense of familiarity with standardized testing, which helps students who sit Year 12 standardized university entrance exams. Without NAPLAN, students would be going into that test experience cold. It makes sense that NAPLAN experience should make the Year 12 tests more familiar before students sit them, which should help alleviate some anxiety, although I must acknowledge that this has not yet received specific research attention.

Perhaps focusing on the importance of NAPLAN to research that will benefit schooling (teachers, parents, schools) in Australia might help change the overall narrative around NAPLAN.

However, there are definitely political agendas at work here, and I would not be surprised if NAPLAN is eventually abandoned if the ‘love to hate it’ mindset continues. So I encourage educators to think for themselves around these issues. Instead of getting caught up in political machinations, if you find yourself accepting big claims about how terrible NAPLAN supposedly is, please ask yourself: Do those claims resonate with me? Or is NAPLAN just one small aspect of what I do? Is it just one piece of information that I use as part of my work? Would getting rid of NAPLAN really make my job any easier? Or would I instead lose one of the pieces of the puzzle I can use when trying to understand and teach my students?

If we lose NAPLAN I think we will, as a country, lose something special that helps us better understand our diverse schools and better educate the upcoming generations of Australian students.

 

Dr Shane Rogers is a Lecturer in the School of Arts and Humanities at Edith Cowan University. His recent publications include Parent and teacher perceptions of NAPLAN in a sample of Independent schools in Western Australia in The Australian Educational Researcher online, and he is currently involved in research on What makes a child school ready? Executive functions and self-regulation in pre-primary students.

 

How school principals respond to government policies on NAPLAN (you may be surprised how some are resisting)

School principals in Australia are increasingly required to find a balance between improving student achievement on measurable outcomes (such as NAPLAN) and focusing their energies on things that can’t be measured as easily, such as how well a school teaches creative and critical thinking, how it connects with its local community, or how collaboratively its teachers work together.

Governments and systems would expect a school leader to deliver across all of these policy areas, and many others.

It is a significant part of the work of school principals to continually take policies designed to apply to an often vast number of schools and find ways to make them work in their specific local community and context. Different policies can exert conflicting influences and pressures on different schools.

This is an issue of ‘policy enactment’. That is, how principals implement, or carry out, policy in their schools. It is of particular interest to me.

Policy Enactment Studies

My research takes up the idea of policy enactment. This approach to studying the effects of policy starts from the idea that school leaders don’t just neatly apply policy as-is to their schools.

Instead, they make a huge number of decisions. They ‘decode’ policy. This involves considering the resources, relationships and local expertise available to them. They also consider the local needs of their children, parents, teachers and school community, and the ‘histories, traditions, and communities’ that exist in their school.

It is a complex process that takes leadership expertise and requires wide collaboration within a school community and the principal’s network. Research in this area might seek to understand the local conditions that influence principals’ policy enactment processes.

My recent research had a particular focus on how principals enacted school improvement policies. This was a specific push by the Australian Government to improve student outcomes on measures including NAPLAN testing. I wanted to better understand how traditions, histories, and communities (and other factors) influenced the decisions principals made.

How did local contexts, and the things principals and their wider school communities valued, influence what they focused on? How did principals and schools respond to pushes for ‘urgent improvement’ on NAPLAN testing?

Context

The reforms I studied stemmed from the Rudd/Gillard/Rudd governments’ ‘Education Revolution’. These reforms were referred to at the time by the government as some of the largest-scale reforms in Australia’s recent history. They involved the introduction of NAPLAN testing and of the MySchool website, which enabled publication of school data and easier comparison of schools, and they spurred on local improvement agendas such as Queensland’s United in our Pursuit of Excellence.

My Case Study

My research involved a longitudinal study that spanned three school years. I worked closely with three public school principals, interviewing them throughout this period, and analysing documents (including school strategic plans, school data, policy documents, and school improvement agenda documents). The principals were all experienced and had been leading their schools for some time. They were seen as high performing principals and were confident in their approaches towards leading their rural and regional schools. One of the principals, ‘Anne’, was particularly interesting because she was emphatic about valuing the things that could not be easily measured on NAPLAN and the other tools being used to measure improvement and achievement.

Shift away from the focus on NAPLAN and other measurement tools

While research has shown the ways testing such as NAPLAN can narrow the focus of education to that which can be measured, Anne emphasised a more holistic view of education. She was able to resist some of the potential narrowing effects of school improvement. She prioritised the arts, musicals, social and interpersonal development, and individual student wellbeing and learning journeys. She had less of a focus on the data being ‘red or green’ on MySchool and focused instead on the distance travelled for her students. She was confident that unlocking student confidence and fostering a love of schooling engaged those students who were less confident in the areas being measured on improvement data – and she articulated the ways their engagement and confidence translated into improved learning outcomes, with school data that supported her comments.

How did the principal shift the school focus away from testing?

So how did she achieve this? My study found two main ways that she managed to resist the more performative influences of school improvement policies. Firstly, the school had a collaboratively-developed school vision that focused on valuing individual students and valuing the aspects of education that can’t be easily measured. The power of the vision was that it served as a filter for all policy enactment decisions made at the school. If it didn’t align with their vision, it didn’t happen. There was also agreement in this vision from the staff, students, and community members, who kept that vision at the forefront of their work with the school.

The second key aspect was that Anne had developed a strong ‘track record’ with her supervisors, and this engendered trust in her judgment as a leader. Because of this trust, she was given more autonomy to make her policy enactment decisions. The trust was developed over a long time in the same school, and in the same region before that. To build her track record, Anne worked hard to comply with departmental requirements (deadlines, paperwork, and other basic compliance requirements). In addition, the school’s data remained steady or continued to improve. Anne was emphatic that this was due to the school’s holistic approach to education and its long-term focus on individual learning journeys rather than reacting to data with quick fixes.

Case study shows a contrast to trends – what can we learn?

This case study stands in contrast to trends showing how ‘teaching to the test’, and NAPLAN in particular, is narrowing the school curriculum. This is important because research presented on this blog in the past has shown how testing regimes can impact on students, can give less precise results than they appear to, and can further marginalise students and communities.

The school pushed for a wider picture of education to be emphasised, resisting some of the possible unintended effects of testing cultures. We can learn some lessons from this case study. It shows us that communities can collaboratively articulate what is important to them, and work together to maintain a focus on that. This shows us one way that schools can enact policy rhetoric about having autonomy to meet local needs and make local decisions.

The case study also shows us the power of a ‘track record’ for principals when they want to enact policies in unexpected or unusual ways. When they are trusted to make decisions to meet their local communities’ needs, the policy rhetoric about leadership and autonomy is further translated into practice.

These are just some of the insights these case studies were able to provide. Other findings related to how school data was guiding principals’ practices, how the work of principals had been reshaped by school improvement policies, and how principals felt an increased sense of pressure in recent years due to the urgency of these reforms.

If you’d like to read more about these issues, please see my paper The Influence of Context on School Improvement Policy Enactment: An Australian Case Study in the International Journal of Leadership in Education.

 

Dr Amanda Heffernan is a lecturer in Leadership in the Faculty of Education at Monash University. Having previously worked as a school principal and principal coach and mentor for Queensland’s Department of Education, Amanda’s key research interests include leadership, social justice, and policy enactment.

Amanda also has research interests in the lives and experiences of academics, including researching into the changing nature of academic work. She can be found on Twitter @chalkhands

 

NAPLAN testing begins for 2018 and here’s what our children think about it

Australia’s national literacy and numeracy testing program, NAPLAN, for 2018 begins today, on Tuesday 15th May. Classrooms have been stripped of all literacy and numeracy charts and posters, and chairs and tables set out for testing. Our news feeds will be full of adults talking about the program, especially what they think is going wrong with it.

I am much more interested in what children think about NAPLAN.

I know from my research that many children do not like the tests and it is not because ‘not many children like taking tests at any time’ as the Australian Curriculum, Assessment and Reporting Authority (ACARA), which oversees the program, has told us.

Sitting tests is just one form of assessment and as such is a normal part of the rhythms and patterns of everyday school life: children go to school and now and then the type of assessment they do is a test.

But to claim NAPLAN is just another test is a simplistic adult perspective. Some children see it very differently.

I asked the children about assessments at school

I asked 105 children in Years 3, 5 and 7, as well as their parents, teachers and principals, about their experiences and views of NAPLAN. While they cannot speak for every child, their accounts give us insights into how children actually experience the tests.

When I spoke to the Year 7 children about which type of assessment they prefer, some favoured assignments, while others explained that ‘I prefer tests because if I get something wrong, I can see where I’ve gone wrong easier’, and ‘if I get a lot wrong it’s easier to talk to the teacher about it’.

So, what is it about NAPLAN that makes it such a negative experience for some children, even for those who normally prefer tests as a form of assessment?

I have written previously about why some children construct NAPLAN as high-stakes, even though it has been designed to be a low-stakes test. However, there are other major differences between NAPLAN and the usual school-based tests: big differences in the test papers, as well as in the testing protocols, or conditions, under which the tests are taken.

The NAPLAN test papers

NAPLAN’s distinctive format causes confusion for some children, which leads to mistakes unrelated to the skills being tested. For example, when colouring the bubbles related to gender, one Year 3 girl in my study mistakenly coloured the bubble marked ‘boy’.

Level of difficulty

While some children described NAPLAN as ‘easy’, with some equating ‘easy’ with ‘boring’, others found it difficult, with one Year 3 child saying, ‘People should think if children can do it’. For some children, especially in Year 3, this related to unfamiliar vocabulary, which was clear in their questions during practice tests: ‘What is roasted?’, ‘What is a journal?’ and ‘What does extract mean?’. Others, particularly in Year 7, found the test difficult because the content was unfamiliar: ‘I got annoyed with some of the questions because I hadn’t heard it before’ and ‘some parts of the maths we had not learned about’.

Feedback

Some children do prefer tests to other types of assessment, as I mentioned before, because they find it easier to talk through their answers with their teachers. However, NAPLAN results are simply indicated by a dot positioned within reported bands for their year level, with no substantive feedback. And the results arrive far too late, months after the testing, to be of use anyway.

The testing conditions

NAPLAN involves not only the tests themselves, but also the conditions under which the children take them. In addition to the change in the teachers’ role from a mentor, or helper, to a supervisor who reads scripted directions, NAPLAN’s testing protocols produce a very different classroom atmosphere to that which would be usual for a class or group test – particularly in primary school.

Isolation

During NAPLAN, the room must be stripped of all displays and stimulus, and the students must sit in isolation so that they cannot talk with other students or see their work. Only the Year 7 children had experience in taking similar extended tests, which raises the issue of NAPLAN’s suitability for younger children. For the children in my study, this isolation was not usually a part of taking school-based tests; they simply completed their tests at their desks which stayed in the usual classroom configuration.

Time

The Year 7 children were also encouraged to read a novel or continue with an assignment when they had finished school-based tests, to give all children enough time to finish. This is a sharp contrast to NAPLAN’s strict testing protocols, where such behaviour would be seen as cheating.

Other children found NAPLAN difficult because of insufficient time: ‘I hate being rushed by the clock. When I am being rushed I feel like … I will run out of time which makes it super hard to get it done’ (Year 7 child), and ‘I felt a little worried because I didn’t get a few questions and there wasn’t much time left, so I didn’t know if I was going to do them all’ (Year 3 child).

Test preparation: The spillover from the testing week to everyday school life

These differences between NAPLAN and everyday school life, including school-based tests, mean that many teachers consider test preparation necessary. While most of these teachers did not agree with test preparation, they felt they had little choice, as ‘the kids need to be drilled on how the questions are going to be presented and to fill in the bubbles and all the jargon that goes with that’, and ‘to give them the best chance, to be fair to them’. As a result, the negative effects of the testing week spilled over into everyday school life in the months leading up to the tests; albeit to varying degrees within the different classrooms.

The daily ‘classroom talk’ which helped the children to clarify and refine their understandings was conspicuously absent. The students’ learning context shifted from tasks requiring higher order thinking skills, such as measuring the lengths and angles of shadows at different times during the day; pretending to be reporters to research the history of the local community; or developing and proposing a bill for the Year 7 Parliament; to isolated test practice which included colouring bubbles, ‘because if you don’t do it properly they won’t mark it’.

Some children found this shift frustrating, which affected student-teacher relationships, with some Year 7 children reporting that ‘[she gets] more cranky’ and ‘[he is] more intense’ as NAPLAN approached. For children with psychological disabilities, this shift was particularly difficult; with outbursts and ‘meltdowns’ resulting in negative consequences that deepened their alienation from their teacher and peers.

NAPLAN goes against everything we try to do in class

The separated desks and stripped walls not only make the classroom look different, but also make it feel alien compared with the children’s everyday school life. This was reflected in some students’ reports that ‘It’s scary having all our desks split up and our teacher reading from a script and giving us a strict time limit’. This was supported by one of the teachers:

NAPLAN goes against everything we try to do in class. You’re getting the kids to talk to each other and learn from each other, and learn from their peers and challenge their peers, and yet they’ve got to sit on their own, isolated for such a period of time. It’s not even a real-life scenario.

ACARA maintains that the primary purpose of NAPLAN is to ‘identify whether all students have the literacy and numeracy skills and knowledge to provide the critical foundation for other learning and for their productive and rewarding participation in the community’ (ACARA, 2013), and, further, that the testing environment must be tightly controlled to ensure that the tests are fair.

However, the issues I found in my research raise critical questions about NAPLAN’s capacity to serve the government’s primary goals, as outlined in the Melbourne Declaration on Educational Goals for Young Australians: (1) promoting equity and excellence in Australian schools; and (2) ensuring that all young Australians become successful learners, confident and creative individuals, and active, informed citizens.

Many Year 7 students in my study reported that NAPLAN was a waste of time that hindered their learning, with some saying that, as a result, they had disengaged from the test and any associated preparation. This raises significant questions about the extent to which NAPLAN can do the job it was designed to do.

As we embark on another year of NAPLAN testing, it is time to rethink the test, and this requires authentic conversations with, rather than about, students and their teachers.

 

Dr Angelique Howell is a casual academic at The University of Queensland. She is working on several research projects relating to students’ engagement in meaningful learning and exploring how young people, schools and communities can work together to enhance student engagement. An experienced primary teacher, her research interests include social justice; counting children and young people in, together with the other stakeholders in educational research; and apprenticing students as co-researchers.

 

Learning to write should not be hijacked by NAPLAN: New research shows what is really going on

You couldn’t miss the headlines and page one stories across Australia recently about the decline of Australian children’s writing skills. The release of the results of national tests in literacy and numeracy meant we were treated to a range of colour-coded tables and various infographics that highlighted ‘successes’ and ‘failures’ and that dire, downward trend. A few reports were quite positive about improved reading scores and an improvement in writing in the early years of schooling. However, most media stories delivered the same grim message: Australian students have a ‘major problem’ with writing.

Of course politicians and media commentators got on board, keen to add their comments about it all. The annual release of NAPLAN (National Assessment Program – Literacy and Numeracy) results offers a great media opportunity for many pundits. Unfortunately the solutions suggested were predictable to educators: more testing, more data-based evidence, more direct instruction, more ‘accountability’.

These solutions seem to have become part of ‘common sense’ assumptions around what to do about any perceived problem we have with literacy and numeracy. However, as a group of educators involved in literacy learning, especially writing, we know any ‘problem’ the testing uncovers will be complex. There are no simple solutions. Certainly more testing or more drilling of anything will not help.

What worries us in particular about the media-driven responses to the test results is the negative way in which teachers, some school communities and even some students are portrayed. Teachers recognise it as ‘teacher bashing’, with the added ‘bashing’ of certain regions and groups of schools or school students. This is what we call ‘deficit talk’ and it is incredibly damaging to teachers and school communities, and to the achievement of a quality education for all children and young people.

Providing strong teaching of literacy is an important component of achieving quality outcomes for all students in our schools. There’s little doubt that such outcomes are what all politicians, educators, students and their families want to achieve.

As we are in the process of conducting a large research project into learning to write in the early years of schooling in Australia, we decided to have a say. We have a deep understanding of the complexities involved in learning to write. In particular, our research shows that teachers should be seen as partners in any solution to a writing ‘problem’, not as the problem.

Our project is looking at how young children are learning to write as they participate in producing both print and digital texts with a range of tools and technologies. While the project is not complete, our work is already providing a fresh understanding of how the teaching of writing is enacted across schools at this time. We thought we should tell you about it.

What we did

Our research was carried out in two schools situated in low socio-economic communities in two states. The schools were purposefully selected from communities of high poverty that serve children from diverse cultural and/or linguistic backgrounds in Australia. Schools like these often achieve substantially below the national average in writing as measured by NAPLAN. These two schools are beginning to demonstrate that this does not need to be the case.

We looked at how, when, where, with what, and with whom children are learning to write in early childhood classrooms. We wanted to know what happens when writing, and other text production, is understood to be a collaborative, shared practice rather than an individual task; and when teaching and learning focus on print and digital tools, texts, resources and devices. We worked collectively with the schools to think about the implications for teaching and learning.

Spending time in these schools has given us a deeper understanding of how poverty and access to resources affect student outcomes. We found many positive things, for example the way the teachers, researchers, children, their families and communities work together enthusiastically to plan and implement a high quality literacy curriculum and teaching for all students.

As part of our study, we audited current practices in the teaching and learning of writing. We interviewed teachers and children to gather their perspectives on what learning to write involves, asking them when they write, where they write, who they write with and the resources they use when writing. By combining the teachers’ and children’s accounts, we aim to understand how children learn to write from a variety of perspectives.

What we found (so far)

This is just the first step in sharing the results of our research (there is much more to come) but we thought this was a good time to start telling you about it. It might help with an understanding of what is happening in schools with writing and where NAPLAN results might fit in.

We identified four vital areas. Each is important. This is just an overview, but we think you’ll get the idea.

Teaching skills and time to write

Teachers are indeed teaching basic print-based skills to their students, despite what you might be told by the media. What teachers and children have told us is that they need more time to practise writing texts. Our observations and discussions with teachers and children suggest that the current crowded curriculum, and the expectation that schools use a range of bought systems, tools, kits and programs to teach the various syllabuses, leave less time for children to actually write and produce texts. We believe this has significant implications for how well children write texts.

Technology and writing

We captured the close and networked relationship between texts, technologies, resources and people as young children learn to write. In summary, we believe print-based and digital resources need to come together in writing classrooms rather than be taught and used separately.

Another important point is that there is a problem with equity related to access to technology and digital texts. Children in certain communities and schools have access while those in other communities do not. This is not something teachers can solve. It is a funding issue and only our governments can address it.

Writing as a relational activity

We know that teachers and children understand that learning to write is a relational process. It needs to be a practice that people do together – including in classrooms when the learners and the teacher and other adults work on this together. When asked, children represented themselves as active participants in the writing process. This is a positive outlook to have. They talked about being able to bring their ideas, preferences, and emotions, not just their knowledge of basic skills, to the mix. They represented writing as an enjoyable activity, particularly when they were able to experience success.

Who is helping children to learn to write?

Children saw other children and family members, as well as their teachers, as key human resources they could call upon when learning to write. Children perceived these people as being knowledgeable about writing and as being able to help them. Again this is a positive finding and has many implications for the way we teach writing in our schools, and the way we engage with parents.

We know that learning to write should not be considered an individual pursuit where the goal is to learn sets of composite skills, even if these skills are easy to test. Rather, it is a process where the goal should always be to learn how to produce texts that communicate meaning.

We hope our work can help you to see that learning to write is not a simple process and that any problems encountered won’t have simple solutions.

For schools in communities of poverty, the aim to achieve improvements in how well students write will be impacted upon by a variety of complex social, economic, political and material issues. Teachers do play an important role. However, while teachers are held accountable for student outcomes, so too should systems be held accountable for balancing the policy levers to enable teachers to do their job.

If the latest NAPLAN results mean that standards in writing in Australia are declining (and we won’t go into how that could be contestable) it is unlikely that any of the simple solutions recently offered by media commentary or politicians will help. More testing leading to more box ticking means less time to learn to write and less time to write.

We will have more to tell you about our research into young children learning to write in the future. Watch out for our posts.

————————————————————————–

This blog is drawn from the ARC funded project, Learning to write: A socio-material analysis of text production (DP150101240 Woods, Comber, & Kervin). In the context of increased calls for improved literacy outcomes, intense curriculum change and the rapidly increasing digitisation of communication, this project explores the changing practices associated with learning to write in contemporary Early Childhood classrooms. We acknowledge the support of the Australian Research Council and our research partners who are the leaders, teachers, children and their families who research with us on this project.

 

Annette Woods is a professor in the Faculty of Education at Queensland University of Technology. She researches and teaches in school reform, literacies, curriculum, pedagogy and assessment. She leads the Learning to write in the early years project (ARC DP150101240).

 

 

Aspa Baroutsis is a senior research fellow in the Faculty of Education at Queensland University of Technology. She is currently working on the Learning to write in the early years project (ARC DP150101240). Her research interests include media, policy, social justice, science education, digital technologies and literacies.

 

 

Lisa Kervin is an associate professor in language and literacy in the Faculty of Social Sciences and a researcher at the Early Start Research Institute at the University of Wollongong. Lisa’s current research interests are focused on young children and how they engage with literate practices. She is a chief investigator on the Learning to write in the early years project (ARC DP150101240).

 

 

Barbara Comber is a professor in education at the University of South Australia. Barbara researches and teaches in literacies, pedagogy and socioeconomic disadvantage. She is a chief investigator on the Learning to write in the early years project (ARC DP150101240).

 

The dark side of NAPLAN: it’s not just a benign ‘snapshot’

The release of the latest NAPLAN results this week identified a problem with student performance in writing. This prompted the federal minister for education, Simon Birmingham, to state these results “are of real concern”. And the CEO of Australian Curriculum, Assessment and Reporting Authority, Robert Randall, added that “we’ll have a conversation with states and territories” to pinpoint the exact problem.

You get the message: there is a problem. As I see it we have a much bigger problem than the one the minister and ACARA are talking about.

At the moment, we have two concurrent and competing ‘systems’ of education operating in Australia, and particularly in NSW: one is the implementation of the state-authorised curriculum and the other, the regime of mass tests which includes NAPLAN and the Higher School Certificate.

The bigger problem

NAPLAN results get everyone’s attention: not just the mainstream media and parents, but also teachers and school communities. Attention is effectively diverted from curriculum implementation. That means that resources, teacher attention and class time are soaked up by attempts to improve the results of under-performing students. It means that the scope and depth of the curriculum is often ignored in favour of drills and activities aimed at improving student test performance.

In a way, this is sadly ironic for NSW, given that new syllabuses rolled out across 2014-2015 have the development of literacy and numeracy skills as two of seven general capabilities. Specific content in these syllabuses has been developed to strengthen and extend student skills in these two areas. 

Before teachers had the chance to fully implement the new syllabuses and assess student learning, the NSW government jumped in and imposed a ‘pre-qualification’ for the HSC: that students would need to achieve a Band 8 in the Year 9 NAPLAN reading, writing and numeracy test. Yet another requirement in the heavily monitored NSW education system.

And if the federal education minister has his way, we’ll see compulsory national testing of phonics for Year 1 students, in addition to the NAPLAN tests administered in Years 3, 5, 7 and 9; and then in NSW, students will have to deal with the monolithic HSC.

So the ongoing and worsening problem for schools will be finding the space for teaching and learning based on the NSW curriculum.

Similar things are happening in other states and territories.

The dark side of national testing

As we know, mass testing has a dark side. Far from being a reasonable, benign ‘snapshot’ of a child’s skills at a point in time, the publication of test results increases the tests’ significance so that they become high-stakes tests, where parental choice of schools, the job security of principals and teachers, and school funding are all affected.

And here I will add a horror story of how this can be taken to extremes. In Florida in 2003, the Governor, Jeb Bush, called the rating of schools with a letter grade from A to F, based on test results, a “key innovation”. Using this crude indicator, schools in that US state were subsequently ‘labelled’ in a simplistic approach that glossed over numerous complex contextual features, such as attendance rates, student work samples, the volume and types of courses offered and extracurricular activities.

Already in Australia NAPLAN results have a tight grip on perceptions of teacher and school effectiveness. And quite understandably, schools are concentrating their efforts in writing on the ‘text types’ prescribed in the NAPLAN tests: imaginative writing – including narrative writing, informative writing and persuasive writing.

So what might be going wrong with writing?

As I see it, the pressure of NAPLAN tests is limiting our approaches to writing by rendering text types prescriptive, squeezing the spontaneity and freshness out of students’ responses. I agree it is important for students to learn about the structural and language features of texts and to understand how language works. However, it appears that schools are now drilling students with exercises and activities around the structural and language features of the text types they’ll encounter in the test.

Has the test, in effect, replaced the curriculum?

Again taking NSW as an example, writing has always been central, dating back over a century to the reforms in both the primary and secondary curriculum in 1905 and 1911 respectively. The then Director of Education, Peter Board, ensured that literature and writing were inextricably linked so that the “moral, spiritual and intellectual value of reading literature” for the individual student was purposeful, active and meaningful. In addition to this, value and attention was assigned to the importance of personal responses to literature.

This kind of thinking was evident in the 1971 NSW junior secondary school English syllabus, led by Graham Little, which emphasised students using language in different contexts for different purposes and audiences. In the current English K-10 Syllabus, the emphasis is on students planning, composing, editing and publishing texts in print or digital forms. These syllabus documents value students engaging with and composing a wide range of texts for imaginative, interpretive and analytical purposes. And not just to pass an externally-imposed test.

In a recent research project with schools in south-west Sydney, participating teachers, like so many talented teachers around Australia, improved student writing skills and strengthened student enjoyment of writing by attending to pedagogical practices, classroom writing routines and strategies: providing students with choice in writing topics and forms of writing; implementing a measured and gradated approach to writing; using questioning techniques to engage students in higher order thinking; and positioning the teacher as co-writer.

These teachers reviewed the pressures and impact of mass testing on their teaching of writing, and like so many around Australia, looked for ways to develop the broad range of skills, knowledge and understandings necessary for all students, as well as ways to satisfy the accountability demands like NAPLAN.

Without the yoke of constant mass testing I believe teachers would be able to get on with implementing the curriculum and we’d see an improvement not only in writing, but also across the board.

Don Carter is senior lecturer in English Education at the University of Technology Sydney. He has a Bachelor of Arts, a Diploma of Education, Master of Education (Curriculum), Master of Education (Honours) and a PhD in curriculum from the University of Sydney (2013). Don is a former Inspector, English at the Board of Studies, Teaching & Educational Standards and was responsible for a range of projects including the English K-10 Syllabus. He has worked as a head teacher English in both government and non-government schools and was also an ESL consultant for the NSW Department of Education. Don is the secondary schools representative in the Romantic Studies Association of Australasia and has published extensively on a range of issues in English education, including The English Teacher’s Handbook A-Z (Manuel & Carter) and Innovation, Imagination & Creativity: Re-Visioning English in Education (Manuel, Brock, Carter & Sawyer).

This is what primary school children think about NAPLAN

There are no obvious consequences for poor National Assessment Program – Literacy and Numeracy (NAPLAN) performance by individual children. So the notion that children should not be too stressed about doing the tests is not uncommon. However, as I see it, the idea that NAPLAN is a low-stakes test is an adult one: it imposes an adult perspective on children’s experiences.

Children’s perceptions of what constitutes a consequence of poor test performance may differ from those of adults. This led me to focus my PhD study on exploring how NAPLAN is experienced by primary school-aged children, with a particular focus on the children’s own reports of their experiences.

My research shows that NAPLAN can be very high-stakes indeed for some children.

What is high-stakes testing, and why does it matter?

Whether tests are defined as high or low-stakes depends on the consequences attached to the scores. While low-stakes tests simply provide information for children, their families and teachers, high-stakes tests have significant consequences for poor test performance. These include holding students back a year or firing teachers whose classes do not achieve set targets.

Supporters of high-stakes tests argue that having consequences attached to test scores motivates otherwise lazy teachers and/or students to work harder and achieve better results. However, research has found that high-stakes testing has unintended consequences, including a negative impact on students.

Is NAPLAN a high-stakes test?

ACARA has consistently claimed that NAPLAN is a low-stakes test because the government does not use results to create league tables to ‘name and shame’ underperforming schools, control grade promotion or close underperforming schools. It is therefore assumed that children will experience the test as low-stakes.

However, high-stakes uses of the data, such as the MySchool website, which effectively displays green for ‘good’ schools and red for ‘bad’ schools, and the negotiation of reward payments through National Partnerships, are steering the test in a high-stakes direction.

Not all children describe NAPLAN as a negative experience, and not all children construct the test as high-stakes.

Children’s constructions of NAPLAN as high-stakes

However, reports of children’s anxiety are common. For some children, this anxiety is mild, for example, ‘before NAPLAN I get little tingles in my stomach. But when I’m in the test the tingles in my stomach go away’.

For other children, this anxiety causes more intense physical responses, such as shaking.

It is often argued that anxiety is a normal part of taking any test, and that this does not make NAPLAN high-stakes for children. However, research has found that children experience greater anxiety about high-stakes tests than about classroom tests. This suggests that some children’s anxiety around NAPLAN may be due to their constructions of the test as high-stakes.

Some children worry about score comparisons, conveying that ‘I don’t want to be below the average’ or ‘I could be ranked low’, with some fearing that they could be judged as foolish.

Others believe that they will let their families down if they don’t do well. For example, ‘When I thought I was going to fail I thought it may mean I’m failing my family’.

A few children construct serious consequences of failing NAPLAN. Although ACARA has been clear in saying that NAPLAN is not a pass/fail test, some children tell me that ‘I don’t want to fail a subject’.

One Year 3 child I worked with believed that she would be held back a year.

Another told me, ‘when the NAPLAN week was coming up, I kept having ‘after NAPLAN’ dreams, like what would happen if I did really bad … in one of them, I was getting kicked out of the school, which made me feel quite anxious’.

For some children, poor NAPLAN scores mean a future of unemployment and poverty as they believe that, ‘you should try your best to do NAPLAN. Because then you could never ever get a job and get money and maybe couldn’t even get a house!’

Why do some children construct NAPLAN as high-stakes?

The 2010 Senate Inquiry into the Administration and Reporting of NAPLAN Testing found that the government’s poor communication about the purpose of the test has led to confusion, which is intensified by inconsistencies between claims that NAPLAN is a low-stakes test, and high-stakes uses of the data.

These inconsistencies filter down to the school level, with one parent telling me that, ‘they say it isn’t important, but they seem to go out of their way to say how the school performs against state or national averages – which says to me that they kind of do think it’s important but they don’t want to say so explicitly’.

Media narratives around NAPLAN

Research suggests that for some parents, the confusion around NAPLAN’s purpose and importance is resolved through three apparently ‘common-sense’ media narratives around MySchool.

The first of these is distrust, reflected in parents’ comments such as, ‘It’s not all about getting A’s and F’s, it’s just to see if your teachers are teaching you correctly’. Some parents also distrust teachers who minimise test preparation, believing their children are not being adequately prepared for what they see as a very important test. As a result, one teacher told me, ‘You just hear the talk about how they’ll get them ready – how THEY’LL get them ready’.

The second narrative of choice is seen in the belief of some parents that strong NAPLAN results are important for enrolling their children into their choice of ‘good’ private schools; even though these schools maintain that they ‘do not use NAPLAN results as an admission tool’.

Finally, the narrative of performance is reflected in some parents’ belief that it is important to ‘know how my children are positioned within the school, the state, the nation’. One parent also told me that, ‘If my children were not meeting the required standard, I would take action’; although it wasn’t clear what this action might be.

Lack of consistency leads to confusion

In this emotionally charged and confusing climate, in which some children are positioned within negative parent-teacher relationships as parents and teachers blame each other for children’s anxiety, the children receive little, if any, clear and consistent information about NAPLAN. This leaves children confused about why they do the test, with older children in particular asking, ‘What’s the point of NAPLAN?’

In the schools I have worked with, principals and teachers tended to limit conversations around NAPLAN to reduce the focus on the test and thus children’s anxiety. However, this may unintentionally fail to provide children with adequate information about NAPLAN, which only adds to their confusion.

What schools can do

Not all children experience NAPLAN in the same way, and not all children’s experiences of the test match what their parents and teachers, even policy makers, believe them to be. With a lack of evidence to the contrary, some children are constructing NAPLAN as high-stakes; with children’s understandings of what constitutes a consequence of poor test performance not necessarily aligning with adult definitions of high-stakes testing.

While schools cannot address issues within the wider community, they can provide children with unambiguous information about the purpose of NAPLAN, which is to ‘identify whether all students have the literacy and numeracy skills that provide the critical foundation for their learning’ (ACARA, 2013). This needs to be communicated to children in language they can understand, and in ways that do not focus excessively on NAPLAN as compared to school based assessments.

This recommendation is supported by research that suggests ‘in schools where tests were carefully explained, the children were more positive about them’. Children should also be provided with opportunities to ask questions about the test and its purpose, with an expectation that their questions will be taken seriously and answered accordingly.

 

Here is my PhD thesis, Exploring Children's Experiences of NAPLAN: Beyond the Cacophony of Adult Debate.

 

Dr Angelique Howell is a course coordinator in the School of Education at The University of Queensland, and is working on several research projects. An experienced early childhood/primary teacher, her research interests include social justice, with a particular focus on counting children and young people, together with other stakeholders, into educational research. She recently published a book chapter entitled 'Exploring children's lived experiences of NAPLAN' in National Testing in Schools: An Australian Assessment, edited by Bob Lingard, Greg Thompson and Sam Sellar (Routledge, 2016).

NAPLAN and edu-business: the commercialisation of schooling in Australia

NAPLAN testing is orchestrating a high-stakes environment in Australian schools where schools, teachers, students and even parents feel the pressure to perform and do well. Edu-businesses are capitalising on this high-stakes environment for commercial advantage.

Schools and governments now purchase products and services that are explicitly tied to test development and preparation, data analysis and management, remedial services and online content. American academic Patricia Burch claims that the test industry in the USA is worth $48 billion per year. While it is difficult at this stage to put a precise figure on Australia's test industry, it is increasingly obvious that the NAPLAN market is rapidly growing.

The NAPLAN market

The NAPLAN market includes practice tests, student workbooks, online programs, tutoring, teacher professional development, data analysis services for schools and so on. For example, the Australian Council for Educational Research (ACER) offers a number of progressive achievement tests (PAT) to provide norm-referenced information to teachers about their students. Schools often purchase a PAT test in Mathematics or English at a cost of $7.50 per student, and subsequently use this data to identify their students' strengths and weaknesses in preparation for NAPLAN. Similarly, there are online resources like 'StudyLadder' or 'Excel Test Zone' that offer sample NAPLAN-style questions to help students prepare for the test. There are also companies that target the insecurities of parents. Services such as 'NAPLAN tutor' offer a membership for $79 that allows parents to access a range of NAPLAN tutorials. Private tutors also offer NAPLAN-specific services.

Some edu-businesses now offer professional development and data analysis services to schools and teachers. For example, 'Mighty Minds', 'Seven Steps' and 'Count on Numeracy' all offer NAPLAN-specific workshops. Seven Steps displays the following testimonial on its website: 'Two of our teachers attended your seven steps seminar last year. They used the program in the Grade 3 cohort. Our NAPLAN results in those two grades were outstanding'. Similarly, Mighty Minds suggests that its NAPLAN workshop will 'focus on revising fundamental skills that are essential for students' school careers and will prepare them for the NAPLAN test'. This type of marketing capitalises on the anxieties of schools and teachers.

Pearsonisation of NAPLAN

Pearson was among the edu-businesses contracted by the States and Territories in their delivery of NAPLAN. In 2012 every State contracted the printing and distribution of the NAPLAN tests to Pearson, with the exception of Queensland, which contracted Fuji Xerox for this process. The actual testing of students occurs in schools under the direction of school staff, and the subsequent marking of the tests is overseen by the relevant educational authorities in most States and Territories. However, in New South Wales, Victoria and the Australian Capital Territory, this process was also contracted to Pearson, which became responsible for recruiting, training and paying NAPLAN test markers. In NSW, for example, this contract is worth $41.6 million. This positions Pearson as a central agent in the NAPLAN policy network and, moreover, suggests it has significant contractual obligations with Commonwealth, State and Territory governments.

Other areas where edu-businesses are at work in Australia

Edu-businesses are at work elsewhere in Australia. They are also contracted by the Australian Curriculum, Assessment and Reporting Authority (ACARA) in the development of the test and the analysis and reporting of the results on My School. For example, in 2012, ACARA spent over $4 million contracting ACER, Pearson, Educational Measurement Solutions and Educational Assessment Australia for a range of services. Some of these services included item development ($2,075,717), trialling of the test items ($681,253), equating of the test items ($527,848) and analysis and reporting of the results ($610,247).

This increasing private business activity has caused concern amongst a number of social commentators who believe that education, as a public activity serving the public interest, should remain within the control of the public domain. Yet the primary aim of involving edu-businesses seems to be to modernise the public sector and make it more effective. This, of course, is based on the assumption that market-oriented management will lead to greater cost efficiency and improved success for governments.

Problems with the growing edu-business activity in Australia

NAPLAN clearly represents the emergence of new 'business opportunities' in Australian education policy. Edu-businesses, from multinational corporations like Pearson to smaller national providers such as ACER, now contribute to education policy and practice in various ways. In this environment 'contractualism', or partnerships between the public and the private sector, has become the new normal. ACARA argues edu-businesses are an important and necessary component of developing NAPLAN, and similarly, schools and teachers embed products and services from the private sector across all aspects of teaching and learning, particularly in regards to NAPLAN preparation.

My concern is that edu-businesses are increasingly contributing to policy development and teaching and learning practices in ways that have displaced traditional expertise. For example, according to ACARA, NAPLAN is delivered by ‘experts’ across the field. It seems problematic that experts in this case are not teachers, curriculum developers or even university researchers. Instead, experts are constituted by their ability to offer ‘value-for-money’ on competitive tender applications.

Edu-businesses are now closely associated with the role of policymaking and the state. What groups are becoming excluded from, and included in, processes of public policy?

Another concern I have is that the products and services schools and teachers are engaging with in preparation for NAPLAN are often shaped by ‘generalists’ with little classroom experience or formal research background in education. Many of these products are underpinned by agendas of profit making, not evidence.

Similarly, there are potential conflict-of-interest issues when edu-businesses like Pearson are contracted to develop aspects of NAPLAN, but also generate revenue through marking the NAPLAN test and selling resources to improve students' NAPLAN results.

What can we do?

Of course, some of the work the private sector does is legitimate and important to how we deliver public education effectively. However, if edu-businesses continue to proliferate like they have in recent years, education has the potential to be monopolised by for-profit agendas. We must move beyond the rhetoric of edu-businesses in their promises to transform education and offer solutions to our problems. Instead, we have a responsibility to engage with the private sector more critically and make sure we protect public education and our expertise as deliverers of it.

 

Anna Hogan is a lecturer in the School of Human Movement and Nutrition Sciences at the University of Queensland. Anna has been researching the role of global edu-business in education policy and practice. She is currently working on projects that investigate the privatisation of Australian public schooling, the effects of curriculum outsourcing on teachers' work and the commercialisation of student health and wellbeing. Anna has recent publications in the Australian Educational Researcher, Journal of Education Policy and Critical Studies in Education.

 

Our obsession with school achievement data is misplaced: we’re measuring the wrong things.

In 2008 Australia began a national assessment program that tests school children in Years 3, 5, 7 and 9 in reading, writing, spelling and numeracy (NAPLAN). These assessments only really entered the national consciousness in 2010 when the Rudd Government launched the My School website, after assuring concerned stakeholders that it would not be possible to directly compare schools and that we would not go down the path of English league tables.

Tell that to The Australian, which has since launched its own website called Your School. The site promises us that we can use it to compare our “own list of schools” online and provides every other media outlet in Australia with the resources to produce its own set of league tables.

Since 2008 and particularly since 2010 we’ve seen a major change in school practice and the national conversation. We’re all familiar with the story of Queensland where Premier Anna Bligh, shocked by her state’s performance in the first NAPLAN, wrote to parents stating that their children would sit practice tests in the lead up to the 2009 assessment.

I was in Sydney at the time and blithely unaware of what was happening in Queensland, but I remembered it as a bold and progressive state: the home of Productive Pedagogies, New Basics, the Inclusive Education Task Force, and a play-based preparatory year firmly grounded in early childhood philosophy.

Upon returning to Brisbane in 2013, I was stunned by how much had changed. My first indication was when my daughter, who was in Year 9, began coming home every week saying “I @#!& hate Thursdays!”

Thursdayitis was a new one for me, but it didn't take long to discover that Thursday, in Term 1 of Year 9 at her school, is "NAPLAN Turbo Day", when everyone practised NAPLAN-type exercises. Not long after that, you might be interested to know, I received a form requesting that I sign off on disability support provisions. This was something I refused to do (I'm not sure how much help another five minutes would be when the problem is not knowing the answer). Once NAPLAN was over, normal teaching returned and my daughter celebrated never having to do NAPLAN ever again.

But, nothing has surprised me as much as how NAPLAN and our obsession with student/school performance has changed school for school beginners. Being involved in research that looks at the relationships between school practice and disruptive behaviour, I see daily how an intensive focus on literacy and numeracy actually exacerbates the problems of children who start school with early learning and behavioural difficulties.

Researching the behavioural “Tipping Point”

Before returning to QLD, I had developed a longitudinal project based on what boys in special “behaviour” schools had told me about their formative school experiences. I wanted to understand the process of “hard-baking” that they and their principals described: how early learning and behavioural difficulties develop into severe acting out and full-blown hatred of school.

I wanted to learn how that happened and what contributions were made by school practices. By their own accounts, these 33 boys didn’t come to school already hating it. It was a process of attrition and one to which they say schools contribute.

The fact that the majority of these teenage boys nominated the early years (K-2) as the time when they began disliking school, that they received very little support, and that many had received their first suspension during this period told me that the very beginning of school was where the research needed to start.

The project, which has been seed-funded by the Financial Markets Foundation for Children, commenced in 2014 with 250 QLD prep children. As I was a research fellow and we didn’t have a huge amount of funding, I began spending a lot of time in prep classrooms.

Our research aimed to establish a baseline of what children bring to school so that we could track school liking, learning, language, relationships, attitudes and behaviour over time. This battery of assessments immediately highlighted that there is a very wide range between children, even within the same class.

Whilst we chose similar schools (all in disadvantaged areas of south eastern QLD and all with ICSEA scores one standard deviation below the mean) we had some children scoring well above age equivalence and others well below.

It was also very clear that the children who came to school well behind others were in real danger of remaining that way because their teachers either did not have the time to address the needs across their classroom or because the teacher did not have the skills to correctly identify those needs.

And, from my vantage point, NAPLAN definitely exacerbates this. The pressure to have children up to speed by Year 3 reaches down through the early years of school (even to Kindergarten or Prep) where the focus is heavily skewed towards literacy and numeracy. Of course reading, writing and numeracy are vitally important but our research is finding that some other very important things are being crowded out in the process.

What could be more important than the “3 R’s”?

The most important component of quality learning that is now under threat is time to establish warm and positive teacher-student relationships. The more frantic the classroom, the more focused teachers are on the business of “learning”, the more superficial and fraught the relationships. And this is a problem, particularly for children with early learning and behavioural difficulties for whom those relationships might actually make the difference between engagement and disengagement.

My behaviour school research has taught me that the process of "hard-baking" is fuelled by resentment. There comes a point where pissed off young people decide that they're going to get a bit of their own back, regardless of what it might cost them in the long run. Once they reach that stage, it is very hard to turn them around.

Building warm and positive relationships costs nothing in the scheme of things and little actions – regular conversation, a pat on the back, a smile, some extra help and a bit of recognition – could very well save connection to schooling for our most vulnerable students.

As my research teams prepare for the National Summit on Student Engagement, Learning and Behaviour being hosted at QUT this week, I have had much cause to think about the central message that I want the Summit to impart.

It’s this: if we have to measure things in order for them to matter, measure student-teacher relationships, school liking and school avoidance.

Find out how students feel about the place where they spend such a large part of their day and the strength of the relationships they have with the second most important adults in their lives.

This is something worth working on but it will never happen if all we measure and if all that counts are ABCs and 123s.

 

Linda Graham is Principal Research Fellow in the Faculty of Education, Queensland University of Technology (QUT). She is the Lead Chief Investigator of two longitudinal research projects focusing on disruptive behaviour. One examines the experiences of students enrolled in NSW government "behaviour" schools (Australian Research Council DP110103093), and another is tracking the language, learning, experiences, relationships, attitudes and behaviour of 250 QLD prep children through the early years of school (Financial Markets Foundation for Children FMF4C-2013). In 2014, she was elected Editor of the Australian Educational Researcher (AER) and serves as a member of the Australian Association for Research in Education (AARE) Executive Committee. The two-day National Summit on Student Engagement, Learning and Behaviour begins Wednesday 8th of July.