More Amazing Secrets of Band Six (part two ongoing until they fix the wretched thing)


When Simon Crook wrote The Amazing Secrets of Band Six last year for AARE, I had no idea it would become one of the all-time best-read posts on EduResearch Matters (now number 15 out of nearly 500, with a spike during the HSC results period). Those of you who read Amazing Secrets last year will be familiar with the important points raised in the last few days in the Sydney Morning Herald regarding Band 6s and measuring HSC success [1], [2], [3], [4]. With any luck, through the SMH, these issues will reach a much wider audience and may provide incentive and leverage for key stakeholders to do something about the current state of play.

A Quick Recap

NSW is obsessed with HSC performance, particularly Band 6s. Every year, the SMH, Telegraph and other media outlets publish school ranks determined by numbers of Band 6s. The SMH also publishes the Honour Roll of those students who achieved Band 6 in each of their subjects. Yet it has already been shown you cannot compare Band 6s between different subjects, so you cannot tally total Band 6s and make a fair comparison between schools or students. In fact, some lower bands in more rigorous subjects actually contribute more to ATAR than Band 6s in less rigorous subjects. 

As previously described, the standards-based ‘Band Description’ model for the HSC was never designed for comparison between subjects. One of the creators and custodians of the HSC, Professor James Tognolini, reiterated last week that: 

“for better or worse there was no attempt to make the standards equivalent when the system was set up … in most subjects there was no attempt to align a band 6 performance in one subject with the band 6 performance in another. The purpose was to report what it is students know and can do, not make comparisons across subjects.” 

To answer Professor Tognolini’s ‘better or worse’ framing: the situation is for the worse. Whatever the original intentions, most of society assumes the bands are equivalent, that a ‘Band 6 is a Band 6’. The whole media, parental-choice and school-marketing system perpetuates this flawed metric of comparison. It is tempting to blame the media, and the SMH in particular, for their role in this mess, but they are only reporting what they are allowed to report. As I pointed out last year, and as the SMH articles highlighted, more and better comparative and value-add (growth) data should be reported to provide a fairer narrative of both school and student achievement. The CSNSW paper the SMH references makes some good suggestions in this regard, including several alternative measures that could be published:

  • Non-HSC data, such as vocational education completion rates and post-school outcomes 
  • Median ATAR (or a suitable proxy for scaled marks)
  • Growth or ‘value-add’ (as suggested last year)
  • Band distributions, “which better show the range of achievements within schools, and any shifts over time”.

In order for this to happen, someone high up needs to provide the requisite permission. 

But the issue is not solely about which school performance data can be published in the media. 

It is also time to start seriously talking about improving the HSC as a whole. I’m not talking about getting rid of the HSC, or even a massive overhaul of the assessment, but about evolving it in line with the education landscape in NSW in 2022 and beyond, rather than continuing with the same model devised last millennium.

A new education landscape of accountability

In the past twenty-odd years, the status of the HSC has evolved from a local NSW matriculation qualification affecting university entry to an incredibly high-stakes commodity that can make or break a school, principal, teacher or student. NSW government high schools are now accountable to the School Success Model, with targets for increased Band 5s and 6s. Some of these school targets in particularly challenging local contexts are unlikely to be reached, setting schools and individual subjects up to fail, or unduly influencing their educational offerings (see Detrimental Effects below). Many non-government schools and school systems have similar blanket accountabilities and targets, which are again setting up certain locally challenged schools and subjects to fail. The HSC was never designed to be used this way, so it must evolve accordingly.

Detrimental Effects 

While the NSW HSC is a strong, established credential of quality assessment for NSW school leavers, over time one particular well-intended design feature has produced counterproductive consequences. These consequences are detrimental to teachers and students, particularly in critically important HSC subjects that are key to the Australian economy, such as the sciences and technical and vocational (STEM) subjects. The particular design feature of concern is the inconsistent design and application of the HSC performance ‘Band Descriptions’ across different subjects.

There is extreme variation in the proportions of students allocated to each of the performance bands in different subjects, as the 2021 results made plain.

This is NOT a fair go for all. Under the current system, the science, technology and vocational subjects are essentially discriminated against. Despite this extreme variation, the band percentages are used as the primary measure of student and school achievement, including in merit lists and strategic targets. Bands have thus become the key driver of detrimental effects on teaching and learning:

  • Warped student subject choice: ‘able’ students are increasingly choosing (or being forced into) subjects with greater access to Band 6s, thereby prioritising access to Band 6s over academic rigour. This in turn negatively impacts future pathways, particularly for diverse cohorts, including female representation.
  • Reduced school subject offerings: many schools are axing critical subjects and skewing their strategic directions for hiring and investment towards subjects/faculties that yield more Band 6s, gaming the system. This is further exacerbated, and even intrinsically encouraged, by the worsening shortages of skilled teachers in, for example, mathematics and the sciences.
  • Accountabilities tied to Band 6s (see A new education landscape of accountability above).
  • Teacher performance measurement tied to Band 6s: blanket targets and teacher performance measures can have a devastatingly negative impact upon staff teaching subjects with low proportions in Band 6, contributing to the widely reported teacher shortage and retention problems in critical subjects, poor well-being and depleted morale, particularly alongside the existential threats of ‘dud ministers’.

As mentioned, the use of bands in this way was never part of the design remit for the new HSC in 2000. But over the years the performance bands have evolved into high-stakes features. High-stakes indicators must be strong, reliable and valid. The variation in Band Descriptions, and the proportions of students allocated to each band across subjects means they are no longer reliable or valid as high-stakes performance indicators. They must be open to scrutiny and reform. 

Evolving the HSC

There is one primary way to evolve the HSC: strategic reform of the bands. Reforming the bands needn’t be extensive or expensive, nor threaten the HSC standards approach or the ATAR. Bands could still allow for disciplinary differences, but with improved comparability and fairness. A loose band of academics, researchers and I have considered models that could be much simpler and cheaper than the current arrangements, yet strengthen the reliability and validity of bands as educational indicators. As a side benefit, they could also improve the clarity of standards and exemplar material in the ‘Standards Packages’, directly strengthening teaching and learning. We are currently making representations to key stakeholders to outline the details of these reforms.

We have used our collective expertise and have developed possible pathways to reform bands and sustain the HSC into the future. Such reforms would counter the detrimental consequences of current arrangements, mitigate emerging risks and ensure that the HSC remains a strong credential for the next generation of students in NSW. We need a fair go for all; it would be un-Australian to be otherwise. 

Dr Simon Crook is director of CrookED Science, a STEM education consultancy, and Honorary Associate at the School of Physics, University of Sydney. He works with primary and high school teachers and students around many aspects of science and STEM education, and assists the Sydney University Physics Education Research (SUPER) group with their work, including liaising with NESA regarding science syllabuses. His PhD research evaluated the impact of technology on student attainment in the sciences. Previously, Simon was a high school physics teacher.

The amazing secrets of band six (and what you should know)

Part two of this story was published in March 2022.

New South Wales, Day 1, Term 1. A whole-staff meeting to begin the school year. At some point after the Principal’s address, the Leader of Learning (or similar) starts their PowerPoint to go through the previous year’s HSC results subject by subject, particularly the number of Band 6s. In non-selective high schools, invariably the English, Humanities, Visual Arts, Music, PDHPE and even Maths teachers are patting themselves on the back and being lauded by the school leadership. This is in complete contrast to the responses from and directed towards the science teachers. Equally, when principals have their HSC performance review meetings with their own superiors, quite often it’s a tricky conversation regarding the ‘performance’ of the sciences.

What is the obsession with Band 6s? Band 6s sound elite, the very best. But the facts are that a Band 4 or 5 in a difficult subject such as Physics or Chemistry may make as big a contribution to the ATAR (Australian Tertiary Admission Rank), or an even bigger one (more on that later), than a Band 6 in, say, Music. Also, Band 6s are the only metric made publicly available and shared with the media.

Band 6s and exam results (raw and moderated, see Raw Results and Band 6s discussed later) from NESA are not to be confused (but are so often confused) with ATAR scores, which are calculated by the Universities Admissions Centre (UAC). The ATAR is a rank, not a mark, indicating a student’s position relative to all the students in their age group in their state. The ATAR is calculated from an aggregate of the ‘scaled’ marks of a student’s courses. This is totally different from ‘moderating’ by NESA in NSW (see later). Importantly (and often detrimentally) for teachers, only students are measured by ATARs; teachers are measured by their Number of Band 6s. So students in subjects that scale well, such as Physics, will receive a good ATAR contribution if they perform reasonably well in that subject. But ‘reasonably well’, say high Band 4 to Band 5, doesn’t cut it for teachers measured by their Number of Band 6s. It is also interesting to note that last year a Band 1 (a fail!) in Physics could rank higher than a Band 4 in Visual Arts and a Band 3 in Legal Studies, Ancient History, Business Studies and PDHPE.

Last millennium in the UK, I was fortunate enough to have two thirds of my A-Level Physics class attain ‘A’ grades in a non-selective government school. In NSW, it took me four years to achieve just one Band 6 across two non-selective schools. For the past few years in my work supporting schools in all sectors in the sciences, I regularly spend much of Term 1 advocating for science teachers, coordinators and principals who are feeling the heat for “poor Band 6 results”. I am constantly witnessing (and have suffered first-hand) the negative impacts of these judgements on teacher and principal morale and well-being. Student well-being is also being detrimentally impacted by similar unfair judgements and subject comparisons in Year 11 and HSC Trial (raw) exam results.

And what is the cause of all of this anxiety? The completely flawed metric of comparison that is the ‘Number of Band 6s’. So why is this overly blunt measure, that appears in all school marketing literature, on school billboards, in the Sydney Morning Herald, and is part of NSW vernacular, so flawed as a point of comparison? The answer is that, importantly, it was never intended to be a point of comparison in the first place, particularly between subjects.

Standards-based Assessment

The NSW HSC is a standards-based assessment. The whole construct of NSW HSC standards-based assessment was devised by Dr John Bennett, former Chief Executive of the Office of the Board of Studies NSW (now NESA), as part of his PhD thesis. Dr Bennett’s supervisor was Professor Jim Tognolini, now Director of the Centre for Educational Measurement and Assessment at The University of Sydney, who has been senior advisor on educational measurement issues for every state and territory education department and examination board, including NESA. The whole premise of the standards-based model is to maintain the integrity and consistency of measuring standards year on year for an individual subject, i.e. that a Band 6 in Chemistry in 2021 is comparable with a Band 6 in Chemistry in 2020, with no mandate or mechanism to say that a Band 6 is comparable between subjects.

In this standards-based model, each subject had its own set of ‘Band Descriptors’ (now called Band Descriptions) providing descriptions of typical student performance for each of Bands 2-6. These respective Band Descriptors were devised by respective subject experts. They provide guidance for marking school-based assessments (although this raises an issue discussed later), but most importantly, the Band Descriptors provide the standards against which subject-specific ‘Judge Markers’ measure student HSC examination responses. This means that the Band Descriptors for, say, PDHPE were devised by experienced PDHPE teachers and judged annually by experienced PDHPE teachers; the same can be said for Physics, or any subject. What this means is that the standards are different for each subject. If the Band Descriptors are different for each and every subject, and they are interpreted/judged differently in each subject, then we cannot use ‘Number of Band 6s’ by way of comparison between subjects. It stands to reason – it is simply comparing apples with oranges.

Comparing Subjects

Following the release of the 2020 HSC results, in a quote in the SMH, Professor Tognolini reiterated

“we’ve never convinced the community that a band 6 in physics was not designed to be the same as a band 6 in biology or a band 6 in chemistry”

(This example is somewhat ironic since in the new science syllabuses every science subject has the same Band Descriptions, but the general point is being made by one of the original designers of the HSC itself). In the same article, Dr Timothy Wright, former Headmaster of Shore stated:

“it is really hard to get a Band 6 in say Chemistry and easier in say Business Studies”.

A few days prior, also in the SMH, a NESA spokesperson is quoted as saying:

“[the] number of Band 6s achieved in science courses can’t be compared with the number achieved in other courses”.

Consider the following examples comparing Band Descriptions between subjects:

Band 4

Business Studies

“demonstrates knowledge and some understanding of business functions and operations” 

Biology, Chemistry, Physics, Earth & Environmental Science, Investigating Science:

“demonstrates sound knowledge and understanding of scientific concepts”

How is “some understanding” comparable to “sound understanding”? It is not.

Band 6

Business Studies:

“demonstrates comprehensive knowledge and understanding of business  functions and operations”

Biology, Chemistry, Physics, Earth & Environmental Science, Investigating Science:

“demonstrates an extensive knowledge and understanding of scientific concepts, including complex and abstract ideas”

Again, there is much greater rigour in the supposed equivalent Band Description in the sciences compared to the non-science. 

Band 5

Perhaps most telling of all is where the Band Description for a typical Band 5 student in any of the 2 unit science subjects is:

“applies knowledge and information to unfamiliar situations…”.

Applying knowledge to ‘unfamiliar’ situations doesn’t appear in any other subject apart from Mathematics (and there only for Band 6). If a student merely learns all of the content of a science, they cannot get above a Band 4 unless they can also apply their skills to unfamiliar situations; this is not the case in any other subject.

Equity of Access to Band 6s?

In 2017, I attempted to publish an article in The Conversation entitled Battle of the Bands: HSC Physics and Chemistry bottom of the Band 6 charts (co-authored with my PhD supervisor and co-supervisor). The article was refined, approved by editors and ready to go, only to be pulled at the 11th hour for external reasons. The article looked at data for 25% of the State to determine the rate of access to Band 6s among all HSC subjects in high schools in NSW. What was important about our analysis was that rather than compare blunt total numbers of Band 6s (which are readily available on the NESA website), we made a ‘common-cohort comparison’ i.e. what was the relative access to Band 6s of individual students in one subject when compared with themselves in the same other subjects? 

The findings were staggering. Students in Physics and Chemistry (in non-selective schools) were only 26% and 27% as likely, respectively, to achieve a Band 6 as they were in the average of their other subjects. By way of comparison, students in PDHPE, Community & Family Studies and Society & Culture were twice (200%) as likely, and students in Music 1 and Design & Technology two-and-a-half times (250%) as likely, to achieve a Band 6 as in the average of their other subjects. In extremis, this is a tenfold, or one order of magnitude, difference! That is hardly equitable access to Band 6s. Our findings confirmed what science teachers have been reporting for years: even though the most able students often studied Physics and Chemistry, the relatively low numbers of Band 6s awarded across the State, combined with the over-representation of selective schools in these subjects, left non-selective schools fighting over scraps in terms of access to Band 6s. Even in a school where, say, Physics performs well above the State average, it is still destined to be below average compared with the other subjects in the same school. By definition, about half of a school’s subjects must be below average when compared against each other, yet we still persist with this type of in-house comparison.
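The ‘common-cohort comparison’ described above can be sketched in code. This is a minimal illustration with hypothetical data and a hypothetical function name, not the study’s actual analysis:

```python
def common_cohort_band6_ratio(results, subject):
    """Relative access to Band 6 in `subject`: the Band 6 rate of students
    taking it, divided by the same students' Band 6 rate across their other
    subjects (1.0 = equal access; below 1.0 = harder access)."""
    in_subject, elsewhere = [], []
    for student_bands in results:                 # {subject_name: band, ...}
        if subject not in student_bands:
            continue
        others = [b for s, b in student_bands.items() if s != subject]
        if not others:
            continue
        in_subject.append(1 if student_bands[subject] == 6 else 0)
        # This student's Band 6 rate in their other subjects
        elsewhere.append(sum(b == 6 for b in others) / len(others))
    return sum(in_subject) / sum(elsewhere) if sum(elsewhere) else float("nan")

# Hypothetical data: each dict is one student's bands by subject
results = [
    {"Physics": 5, "Music 1": 6, "English Advanced": 6},
    {"Physics": 6, "Music 1": 6, "English Advanced": 5},
    {"Physics": 4, "Music 1": 6, "English Advanced": 6},
]
print(common_cohort_band6_ratio(results, "Physics"))   # 0.4: harder access
print(common_cohort_band6_ratio(results, "Music 1"))   # 2.0: easier access
```

On this measure, the study’s ratio of 0.26 for Physics means students were roughly a quarter as likely to achieve a Band 6 in Physics as in the average of their own other subjects.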

Gaming the System

So if you can’t compare Band 6s, and it is more difficult to get Band 6s in some subjects than others, yet schools are still being measured by their numbers of Band 6s, what can be done?

A genuine, yet morally wrong, short-term solution to maximise Band 6s is to guide students away from subjects with a low frequency of Band 6s. We know this happens already with subjects like English Standard: even though many students are better suited to English Standard, many schools push them into the higher Band 6 frequency English Advanced course. If this strategy is applied to the sciences, then schools simply stop offering the sciences. This is happening already in some quarters, not least with the compounding issue of the shortage of science teachers, let alone science-trained teachers. In the short term, this could genuinely address some of the shortfall of Band 6s in a school, but it is only a short-term solution. If a school stops offering any of the sciences, particularly the traditionally ‘rigorous’ ones such as Chemistry and Physics, then the school will ‘residualise’ as aspirational families reject such a school and attend elsewhere offering the full complement of sciences.

Raw Results and Band 6s

Further confusion and anxiety reign, with many schools, students and parents misunderstanding raw exam results and any correlation with performance bands. In every HSC exam, a student’s raw exam mark is internally moderated by NESA by subject, based on the Judges’ interpretation of that year’s exam in line with that subject’s Band Descriptions. For example, in one particular subject, a raw HSC exam mark of 76% might be moderated up to a 90, i.e. Band 6, and a raw exam mark of only 18% might be moderated up to a 50, i.e. a Band 2. Another subject might have 93% moderated to 90 (Band 6) and 52% moderated to 50 (Band 2). However, as mentioned, many people, particularly students, parents and sometimes school leadership, don’t understand this. They think that a raw exam mark directly and equally translates to a band, i.e. that a raw exam mark of 90+ is needed for a Band 6 in any subject. Following the example above, a Band 6-performing student in the first subject, with a raw mark of 76 in their Y11 exams or Y12 Trials, may incorrectly think they are only operating at Band 4, and the adults around them may equally think so. A statistically more commonplace example might be a student achieving only a 46% raw mark in the first subject in a school exam and interpreting that as a fail, whereas the same score may scrape a Band 4 when moderated in the HSC. This misunderstanding can lead to undue anxiety, misplaced self-deprecation, diminished self-efficacy, students dropping the wrong subjects and, yet again, flawed comparisons between subjects.
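The gap between raw and moderated marks can be illustrated with a sketch. This uses a hypothetical piecewise-linear interpolation between band cut-offs; NESA’s actual alignment is set each year by judges against the Band Descriptions, not by a fixed formula:

```python
def align(raw, cutoffs):
    """Map a raw exam mark onto the reported 0-100 scale by linear
    interpolation between (raw, aligned) cut-off points."""
    points = sorted(cutoffs.items())
    if raw <= points[0][0]:
        return float(points[0][1])
    for (r0, a0), (r1, a1) in zip(points, points[1:]):
        if raw <= r1:
            return a0 + (a1 - a0) * (raw - r0) / (r1 - r0)
    return float(points[-1][1])

# Hypothetical subject where the judged Band 6 boundary sits at a raw 76
# and the Band 2 boundary at a raw 18, as in the example in the text
subject_a = {0: 0, 18: 50, 76: 90, 100: 100}
print(align(76, subject_a))   # 90.0 -> Band 6
print(align(46, subject_a))   # ~69.3 under these assumed cut-offs
```

The point of the sketch is simply that the same raw mark lands in very different bands depending on each subject’s cut-offs, which is exactly what students and parents tend to miss.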

Where to from here with ‘Number of Band 6s’?

So comparing the ‘Number of Band 6s’ between subjects is completely untenable. Does this knowledge help principals? Right now, not in the slightest. They don’t need me telling them the pressure they are under for Band 6s. What we need is for all stakeholders to spell this out publicly and commit to no longer comparing subjects (and ideally schools) by numbers of Band 6s. This has to start with the media outlets responsible for publishing such league tables and contributing to this statewide obsession and very parochial NSW vernacular of ‘Number of Band 6s’. Along with the media, we need NESA (not just anonymous spokespersons), all school sectors, principals’ associations, parent bodies, teacher associations and universities to formally declare, and abide by, a commitment not to publish, advertise or compare between subjects (and schools) using the Number of Band 6s. Only by formally retiring the ‘Number of Band 6s’ can we get to a point where we have “convinced the community”. As journalists associated with this blog put it when I discussed this with them: “it’s the only metric we have. We’ve asked many times for this to change. More diverse data would stop the league tables.”

Performance Measurement

However, schools and school systems still need to measure performance. Moving beyond the Number of Band 6s should not be a problem. During these COVID times, we have finessed our metrics from the blunt, not-so-useful ‘Number of Cases’ to more pertinent measurements such as ‘Cases in the Community’. With 75,000+ students annually sitting HSC exams, there is more than enough data to measure statistically significant ‘value-add’ performance of every school, subject and even teacher (if class sizes are large enough). Mathematically, this is achieved by multiple regression analysis, controlling for other variables such as gender, socioeconomic status, school type etc. (see an example of multiple regression analysis here). Such in-house data sets are already in use: in the Department of Education with Scout, in Catholic schools with the CSNSW HSC Analysis Project, and in independent schools with various analysts. In Victoria, there has been a long-established effort to celebrate value-add through the Schools that Excel lists, though what is suggested above would be far more finessed.
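The value-add idea can be sketched with a toy multiple regression. Everything here is simulated and hypothetical (the variable names, school effects and sample sizes); real analyses such as those behind Scout use far richer covariates and proper multilevel models with error margins:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Simulated students: prior achievement plus covariates to control for
prior = rng.normal(60, 10, n)                   # e.g. a Year 10 measure
female = rng.integers(0, 2, n).astype(float)    # gender dummy
ses = rng.normal(0, 1, n)                       # socioeconomic index
school = rng.integers(0, 4, n)                  # 4 hypothetical schools
true_effect = np.array([2.0, -1.0, 0.5, -1.5])  # each school's real value-add
hsc = (10 + 0.9 * prior + 1.0 * female + 2.0 * ses
       + true_effect[school] + rng.normal(0, 3, n))

# Multiple regression of HSC result on prior achievement and covariates
X = np.column_stack([np.ones(n), prior, female, ses])
beta, *_ = np.linalg.lstsq(X, hsc, rcond=None)
residual = hsc - X @ beta          # variation the covariates can't explain

# A school's mean residual is a crude 'value-add' estimate
value_add = {s: residual[school == s].mean() for s in range(4)}
for s, v in value_add.items():
    print(f"school {s}: value-add ≈ {v:+.2f}")
```

The estimates recover the simulated school effects approximately; with real data, confidence intervals on each estimate would be essential before any comparison is published.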

If the expectation is there for public comparison of all, the data is there, if all schools are willing to share. But there’s the catch. Given privacy laws covering individual student information, and copyright over the data, no detailed whole-of-state comparison as suggested above, i.e. covering all three sectors, can be published unless everyone signs up to it, which is unlikely. But does that really matter?

Publishable Performance Metrics

Stakeholders need to decide: are they willing to share all of their data so that true ‘value-add’ measurements (with error margins) can be reported fairly? Or is everyone happy to have league tables of school rank only by subject, taking into account all bands (the Number of Band 6s alone is no longer an option)? Or do we even need public ranking/comparison? The first option, as mentioned, is unlikely to be agreed to, and would also require a level of statistical numeracy across society that doesn’t exist, as was evident in dealing with COVID. The second option is completely achievable: schools could be ranked within individual subjects. This would eradicate the current inaccurate comparison between subjects, but would continue to perpetuate the anxiety induced by comparison between schools. So ultimately, do we even need to publicly publish relative performance metrics at all if they are essentially meaningless and harmful, or can we just keep these in-house to help monitor progress and improve our individual education of students? Either way, we must stop comparing subjects using the Number of Band 6s.

Dr Simon Crook is director of CrookED Science, a STEM education consultancy, and Honorary Associate at the School of Physics, The University of Sydney. He works with primary and high school teachers and students around many aspects of science and STEM education, and assists the Sydney University Physics Education Research (SUPER) group with their work, including liaising with NESA regarding science syllabuses. His PhD research evaluated the impact of technology on student attainment in the sciences. Previously, Simon was a high school physics teacher for 15 years.

Direct link between teaching and learning with laptops and better HSC results in biology, chemistry and physics

Most Australian students in Years 9 to 12 were provided with a laptop courtesy of the Digital Education Revolution (DER) between 2008 and 2013. There was a lot of comment at the time about how the use of laptops might influence student learning and what that influence might be. I was particularly interested in the possible impact on the experiences and achievements of high school science teachers and students.
In 2010, I embarked on a six-year study involving 16 Sydney Catholic high schools in NSW to gather evidence. I have to say my expectations at first were quite conservative. I predicted my research would get a null result, as the data would be too inconsistent and messy.

The most interesting finding

However, the results were surprising and quite clear, with the statistical significance and positive effect sizes that boffins wanting “evidence” so crave. The major finding of my research was that teaching and learning with 1:1 laptops was directly linked with students attaining better results in their HSC in biology, chemistry and physics. Most of the previous research in this area had found evidence only of generic qualities, such as increased motivation or engagement. My research actually provided hard numbers. Given the high-stakes nature of HSC exams in NSW, these findings might be of interest to other teachers of senior students.

Biggest impact in physics, why?

Investigating further, I found that 1:1 laptops had a bigger positive impact in physics than in biology and chemistry. The reasons for this seemed to lie in how the laptops were used. Physics teachers and students out-reported their peers in the other subjects in terms of using science-specific applications, e.g. simulations, science software and spreadsheets. They were using applications that would directly benefit teaching and learning, e.g. simulations for experiments that would be impossible to do otherwise. Digging further, this is not surprising, as the physics syllabus encourages and even mandates the use of technology throughout, whereas in, say, biology, there is no reference apart from some generic motherhood statements.

I am not claiming in any way that the physics teachers were better than the biology teachers with using technology (they may or may not be, I didn’t explore this), but that the physics teachers had a mandate to use technology and they did, whereas the biology teachers didn’t have the same obligations, so they did not.

Other findings

Students became more proactive

Even if teachers didn’t engage with the technology (a minority), the students would still do so of their own accord. Given that they had a laptop, it appears they really wanted to use it. Also, students were much more inclined to use more creative applications such as blogging, video editing and podcasting than their teachers.

Old practices continued

However, in contrast, while I observed that students moved away from using pen and paper and did more work on their laptops, they still took notes and worked from textbooks, as they did before they had their laptops. The only difference was that they now used word processing for notes and electronic textbooks, plus simple online searching. Essentially, the laptops were most commonly being used to perpetuate traditional practices. It must be understood, however, that these findings were from 2010 data, only one or two years into the DER. The question now should be: what are the modal practices with technology in 2017?

Teachers had ‘fingers on the pulse’

Another interesting finding was regarding teachers’ perceptions of what students were doing on the laptops compared to what the students reported themselves. About one third of teachers very much had their fingers on the pulse and were quite aware of what their students were doing. Just over half had a medium sense of their students’ practices. One in six teachers appeared to be out of tune with their students’ practices.

Teacher case studies

The final findings were based on case studies of four science teachers. Not surprisingly, I found that different teachers started from different positions of use of and expertise with technology. However, over the years of the study, all teachers reported improvement in their use of and expertise with teaching with the laptops, especially those who were starting from the lowest baseline.

A shift in the power dynamics of the classroom

The most interesting finding from the teacher case studies was that the implementation of the laptops involved a renegotiation of the power dynamics of the classroom and a shift in the teachers’ role from traditional instructor to facilitator of independent learning.

All of the teachers involved reported a gradual relaxing of ‘control’ over time, trusting and collaborating with the students more, and allowing the students to take more of a lead in how to make best use of the technology.

Future impacts

Five years since the end of the DER (such is the nature of part-time research), I feel the findings of this study still have currency for today’s schools. Whatever the latest iteration of technology in schools, or indeed any new initiative, this research raises areas of consideration for future classroom practices and research.

Teachers need to have their fingers on the pulse of their students’ practices. It has been demonstrated that when teachers and students use technologies to capitalise on the unique opportunities they provide, rather than as a gimmick, teaching and learning improve. Hopefully, this research will further encourage research into new initiatives to include more quantitative analysis and measurement of improvement, or the lack thereof.

Based on my research, I would strongly advocate that teachers be consulted on their personal views and experiences in advance of any new initiatives implemented by governments and administrations, and that these views are monitored over the course of the implementation.

Impact on new syllabuses

In NSW, as in many other states and territories, new syllabuses are being written in light of the new(ish) Australian Curriculum. It is pertinent that syllabus writers take into account the latest research on how syllabuses influence teaching and learning practices. New syllabuses should encourage the use of, and capitalise on, technologies that have been demonstrated to benefit teaching and learning. Empty motherhood statements or catering for the lowest common denominator are not good enough – contemporary syllabuses should be relevant to a contemporary world and evidence-based.

Throughout the life of the DER and in the years since, the program has been the subject of persistent criticism, particularly within the right-wing media (it was a Labor initiative, after all). However, as with any initiative, alongside the failings there are also many successes.

In the post-truth world we now find ourselves in, we could all benefit from looking at the evidence rather than simply reacting to the constant flow of opinion and comment in the media.


Simon Crook has just completed his PhD in Physics Education Research at the University of Sydney. Producing a ‘thesis by publication’, most of his academic journal articles are already in the public domain. Professionally, Simon is a STEM education consultant with his company CrookED Science. He supports primary and secondary schools and school systems across Australia, providing professional development to teachers and working with students. Previously, he was a high school science teacher for 15 years in the UK and Sydney and eLearning Adviser for the Catholic Education Office Sydney for 6 years. You can find him on Twitter @simoncrook and check out his website.

This article is about the findings from my recently published PhD thesis entitled Evaluating the Impact of 1:1 Laptops on High School Science Students and Teachers, completed through the Physics Education Research group at the University of Sydney.

Secondary schooling in Australia needs to change: throw out the tests and bring in deep learning

There is a problem in some Australian secondary schools right now. ‘Endgame’ assessments, such as the Higher School Certificate (HSC) in NSW, and the requirement of an Australian Tertiary Admission Rank (ATAR) to gain entrance to university, place restrictions on the kinds of teaching and learning that go on in classrooms. Some teachers are frustrated that this ‘current game’ of secondary school is the only one that can be played.

So, alternative models of secondary schooling are becoming regular topics of conversation, debate and disquiet in the world of education. Just look at an education discussion on Twitter or go to an education conference or TeachMeet and you will hear the lament: there has to be a better way.

The desire for something else

At a few education forums in past months both here and overseas I have invited audiences of principals, system leaders, teachers, students and in one case, parents, to consider reimagining high school.

I tell them I believe teachers and students might be better served by teaching that is not ‘high stakes’ focused. Could we dare to move away from the rigid systems we currently impose?

When I deliver such ideas people gasp and clap. But no rotten tomatoes are thrown. Afterwards, delegates email me to share: “high schools are not serving many adolescents well” or “we could focus on learning” type messages. I believe there is gathering momentum and a mood for change. There is unrest in the education ranks.

How could secondary schooling change?

Educators are talking a lot more about the type of skills mentioned in the Australian Curriculum general capabilities, such as critical and creative thinking, ICT capability, personal and social capability, and ethical and intercultural understanding, as well as literacy and numeracy across subject areas. As I see it, these are quite a few of the ‘necessary skills’: the ‘grit skills’, the ‘growth skills’, the ‘public good skills’ that make for ‘a good life’ for young people.

In the early years at some high schools, teachers and whole year groups are doing week-long interdisciplinary assessments. These are not just brief end-of-year tasks but deep learning opportunities that include real-world projects, significant design challenges and creative exercises, enriching schooling and creating a vision of it that is better able to inform, critique and question a ‘post-truth’ society.

Let’s agree, what we are doing is not working

We saw it with the most recent announcement of international maths and science comparisons. Now the 2016 PISA results are out and Australia has fallen further down ‘the global assessment gradient’. All the usual ‘click bait’, ministerial cries, glib talk-back radio, hand-wringing and finger-pointing radiated out across the country. Apparently it is because we have a problem with schools/principals/teachers/parents/teacher educators … so we will need more checks, more frequent tests, new assessments. And now a commercial business is to develop the PISA 2018 Student Assessment 21st Century Frameworks for the OECD.

However, as Australia’s Chief Scientist Alan Finkel says, “… do I take these findings seriously? Yes, I do.”

Well folks, guess what, the current model in Australian secondary schools is not working.

Yes, there are whole industries that support the same old same old (government policies, think-tank reports, the current political climate, boards of study, coaching schools, instruction makers, publishing houses and education research). Changing things would not be easy. Also, the altruistic nature of teaching means that as long as final ‘high stakes’ assessments are valued in secondary schooling, teachers won’t risk compromising their students’ results by considering more student-centred pedagogies. There is a lot more talk needed about all of that.

But, just maybe, as another education year draws to a close, it is time to #rethinkhighschool. Seriously.

Dr Jane Hunter is an education researcher in the School of Education, Faculty of Arts & Social Sciences at the University of Technology Sydney. She is conducting a series of STEM studies in Australian schools; in this work she leads teachers, school principals, students and communities to better understand and support education change. Her book ”Technology Integration and High Possibility Classrooms: Building from TPACK” is advancing new ways of enacting pedagogy in K-12 schools. Jane was a classroom teacher, and she has received national and international teaching awards for outstanding contributions to student learning. She enjoys writing and her research-based presentations at national and international conferences challenge audiences to consider alternate education possibilities. You can follow her on Twitter @janehunter01