ATAR

No honour in the honour roll

This is part three of an ongoing commentary begun in 2021 and continued in 2022.

The current HSC grading model is unfair and well past its use-by date. It was adopted at the turn of the millennium, before recent HSC students were even born.

Now, many schools are gaming the system to maximise their percentage of Band 6 results and the rankings that flow from them. That is to the detriment of the availability of, uptake of and performance in STEM subjects. While I present a lot of doom and gloom stemming from my research (with some silent partners), I also offer some easy solutions that would even save time and money.

The doom and gloom

Every year, the Sydney Morning Herald releases its ‘HSC Honour Roll’, a merit list of every NSW student who achieved a top Band 6 in a subject (‘Distinguished Achievers’). It also includes the top students in each course (‘Top Achievers’) and the NSW Premier’s list of ‘All Rounders’, who achieved Band 6 in five or more subjects.

The Honour Roll figures then feed into the notorious high school ranking league tables. But this marketer’s dream is being manipulated and weaponised in the highly competitive high school education industry, to the detriment of STEM subjects in particular. 

Firstly, students studying vocational education subjects such as Electrotechnology are ineligible for the All Rounders accolade. 

More fundamentally, while the ATAR scales subjects like the sciences favourably, the HSC is stacked against students studying a science.

Before you look at my graph below, please note the following:

There are four unequal quadrants in the graph:

  • Top Left: relatively difficult as a subject, with only a low fraction of students awarded Band 6 = only Chemistry, Physics, Science Extension and Economics
  • Top Right: relatively difficult as a subject, but with a high fraction of students awarded Band 6 = French Extension, plus Maths Extension 1 & 2 and Latin Continuers which are off the scale
  • Bottom Left: relatively easy as a subject, but with only a low fraction of students awarded Band 6 = subjects such as Ancient History, Business Studies, Investigating Science, PDHPE, Community & Family Studies, Food Tech and more
  • Bottom Right: relatively easy as a subject, and with a high fraction of students awarded Band 6 = most of the languages (most of which are off the scale), plus subjects such as Dance, Drama, Music (1, 2 and Extension), Textiles & Design, and Visual Arts 

This graph shows ‘difficulty’ against ‘percentage Band 6’ for every subject that contributed to all Honour Roll awards in 2023. 

As a proxy for difficulty, I used the Universities Admissions Centre (UAC) scaled score corresponding to an HSC score of 90 in each subject. UAC scales subjects, essentially according to difficulty, in order to determine student ATARs. In the HSC, an overall mark of 90 in a subject is the baseline for a Band 6. English Advanced is used as the baseline for comparison, since all students must study English in some form and English must contribute to a student’s ATAR.

English Standard awards little to none

Notice that English Standard awards few to no Band 6s, which is an issue in itself – some schools game the Honour Roll and ranking system by only letting their students study English Advanced, even when they are better suited to English Standard. As UAC states: “since NESA places English Studies, English Standard and English Advanced raw marks on a common scale, these courses are combined and scaled as a single course but are reported as separate courses in order to be consistent with NESA’s reporting practice“.

As can be seen on the graph, if a student wants the best chance of being a Distinguished Achiever or All Rounder, then they should study multiple languages, and creative and performing arts. 

These subjects award an unreasonable proportion of Band 6s, yet are deemed relatively easy by UAC (hence they scale relatively poorly). It begs the question: what is the purpose of an exam that awards 40 per cent or more of its students the top performance band? It also begs the question: why are so many resources expended running so many languages with such small candidatures when the exams aren’t fit for purpose? What do I mean? They do not differentiate within their cohorts, and there is “a cost of assessing so many languages and also the problems of validity when the enrolments in many languages are so small“.

Why bother?

When it comes to the sciences, why bother studying Chemistry, Physics and Science Extension (or Economics, for that matter)? They are so much more difficult, yet the reward of a Band 6 – the metric for making the Honour Roll and the measure by which schools are judged – is so unlikely.

This has been the status quo for many years and is naturally having a devastating effect on the sciences:

  • Numbers in Chemistry have dropped to their lowest in a decade 
  • Many schools can’t attract enough students to run courses such as Physics, or even steer students away from them so they don’t have to offer the subject (an awful, cynical strategy for dealing with the specialist teacher shortage and the low chances of Band 6) 
  • Accordingly, many of the best science students are choosing not to study the sciences or are even being pressured away from them into other subjects
  • Quite often, the Dux of a school is a different student from the All Rounder: the Dux studied, for example, Biology, ‘only’ achieving a high Band 5 – negating their chance of being an All Rounder but earning the highest ATAR in the school – whereas the All Rounder chose subjects with easier access to Band 6 and was awarded a lower ATAR
  • (There has also been a general decline in the number and diversity of Economics enrolments as highlighted by the RBA)

Of course, there are caveats: able students are capable in most subjects. But which subjects should they choose? Teenagers with self-doubt are opting out of subjects they fear they will ‘fail’ in – girls in STEM anyone?

An Easy Solution

The disparity in percentage Band 6 between the subjects is due to the current HSC being nominally a ‘standards-based’ assessment. I say nominally because the standards are different for every subject, and measured differently by respective subject experts. It is for this reason that UAC completely ignores HSC bands when calculating ATARs. 

The current standards-based model is 23 years old and consequently the system has been gamed over time.  It is no longer fit for purpose. Someone in NESA told me that the recent Curriculum Review, and the consequent Curriculum Reform, should really have been a Curriculum AND ASSESSMENT Review and Reform, to address just this issue. That was a missed opportunity, but there is still time, not least with the new syllabuses coming out in the next few years. 

Instead, I propose a norm-based approach for EVERY subject e.g.:

  • Band 6 = the top 15% of students
  • Band 5 = the next 20% (top 15–35%)
  • Band 4 = the next 25% (top 35–60%)
  • Band 3 = the next 20% (top 60–80%)
  • Band 2 = the next 15% (top 80–95%)
  • Band 1 = the bottom 5%
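To make the proposal concrete, the norm-based allocation above can be sketched in a few lines of code. This is purely illustrative – the function name, inputs and cutoffs are my own framing of the proposal, not any NESA system:

```python
# A minimal sketch of the proposed norm-based banding, assuming the
# cutoffs listed above (top 15% = Band 6, next 20% = Band 5, and so on).

def assign_bands(scores):
    """Assign each mark a band from the fraction of the cohort above it."""
    n = len(scores)
    # (cumulative share of cohort, band): top 15% -> 6, top 35% -> 5, ...
    # The final cutoff is 1.01 so the lowest mark always falls into Band 1.
    cutoffs = [(0.15, 6), (0.35, 5), (0.60, 4), (0.80, 3), (0.95, 2), (1.01, 1)]
    bands = []
    for score in scores:
        # Fraction of the cohort scoring strictly above this mark
        frac_above = sum(s > score for s in scores) / n
        band = next(b for cut, b in cutoffs if frac_above < cut)
        bands.append(band)
    return bands

# Ties share a band: students on the same mark sit at the same rank.
assign_bands([95, 80, 80, 60])  # -> [6, 5, 5, 3]
```

Because the bands are defined by cohort position rather than subject-specific standards, exactly the same rule applies to Physics, Music or any other course, which is the point of the proposal.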

Perversely, while Band 1, a fail, usually accounts for only a few per cent of students, it is currently reported at HSC as marks 0–49, thereby concertinaing 95+% of students into marks 50–100. Scraping a Band 2 is reported in the HSC as a mark of 50 – a psychological pass – yet may represent a raw exam mark of only 30%.

A pointless exercise

This is a pointless exercise to appease parents and employers, even though such students would likely have scored very low marks (well below 50%) for two years. Arguably this should be changed too, but it is less important than fixing the bands.

Rather than the unfair yet non-random scattergun that is currently Graph 1 (and which has caused the editor all kinds of headaches), we would instead have a vertical line of dots, since every subject would have an equal 15% of Band 6s. UAC would merely differentiate relative difficulty as it does currently, i.e. only NESA needs to change here.

This simple solution would make for fair comparisons between subjects and greater transparency. Students would have less to worry about when making subject selections. It would remove a lot of the gaming of the current system. It would also be a lot cheaper and quicker to run, since the expensive process of judging against standards would be removed. (Ironically, the current expensive judging process is sometimes disregarded if the statistics don’t match what the powers that be wish to publish – what a demoralising waste of money, time and expertise.) Give it a couple of years and judging and marking by humans will be superfluous anyway, as cheaper, more accurate AI takes over the task.

Precise profiles

Also, the actual standards for every subject could become more meaningful and specific since we would know the precise profiles of, for example, what a Band 6 looks like in a subject, without meaningless generic terms like ‘extensive’. They could be generated accordingly with meaningful subject specific detail. Further, ‘standards packages’ of student samples of work by band, by subject would be more easily compiled. Only minor, less onerous monitoring over time would be required to ensure that standards were being maintained (rather than massaging the figures to maintain the ‘integrity’ of longitudinal data as occurs occasionally).

Ultimately, the Honour Roll would have more honour: every subject would have the same percentage of students in the Distinguished Achievers list; an All Rounder would be in the top 15% in five or more subjects; and there would be a fair go for all at meeting the Premier and receiving awards, whether they chose a science or not.

Modified approaches

Perhaps modified approaches might be needed for small-candidature subjects such as languages. Then again, there are much bigger issues to consider with languages. Equally, in subjects where many students can achieve perfect marks, e.g. Maths and Music, a more nuanced profile may be required. Then again, they could write more difficult exams.

Another improvement which could be adopted by NSW is to follow Victoria’s lead and actively report on ‘most improved’ schools. That removes the focus on the highest achieving (usually the highest socioeconomic) schools, and gives credit for value adding and improvement. (But NSW should ensure it maintains its greater level of exam security compared to Victoria).

There should be a fair go for all in this country. We constantly hear about the need for a STEM-skilled workforce, yet we constantly undermine it at high school level. A simple fix to the HSC would go a long way towards convincing the best young Australians of the viability of studying – and subsequently working in – the high-need STEM fields, which are crucial to our economy and progress.

Postscript

Despite the title, this article is by no means suggesting we abandon the HSC in favour of, for example, the IB; we just need to fix the awarding of bands to be more meaningful and equitable across the subjects. Neither am I suggesting we abandon the HSC mark altogether to rely solely on the ATAR. However, I certainly feel that the ATAR should remain, despite some moves to abandon it, not least to keep the HSC in check. This is not yet another EduResearch Matters rant about NESA (see primary science and the arts); the standards-based HSC model was ahead of its time, but that was a LONG time ago. It is well overdue for an overhaul for the reasons stated. Solutions are proffered for consideration.

Thank you to Graham Wright for collating some of the data.

Dr Simon Crook is director of CrookED Science, a STEM education consultancy, and Honorary Associate at the School of Physics, University of Sydney. He works with primary and high school teachers and students around many aspects of science and STEM education, and assists the Sydney University Physics Education Research (SUPER) group with their work, including liaising with NESA regarding science syllabuses. His PhD research evaluated the impact of technology on student attainment in the sciences. Previously, Simon was a high school physics teacher.

What we now know about the other ways to get into university

How do under-represented individuals use alternative pathways to university, and does this lead to student success?

Participation in higher education is on the rise worldwide. This has meant more support for some who traditionally did not attend higher education. One response has been to introduce new pathways into higher education other than the traditional route from secondary education. These pathways include entering with a VET qualification or transferring from another higher education course. Students can also enter via an enabling program, completing a preparatory study program before their course that covers important academic skills, such as referencing. Another route, often used by international students, is undertaking a Diploma with a pathway provider and then transitioning into the second year of a degree course. There are other ways to enter, such as portfolio pathways, where students access university based on accrued skills, knowledge and experience, or via a professional qualification; these can be popular with mature-age students. There are also access schemes for under-represented groups, such as regional/remote or Indigenous students, and mature-age entry provision, which involves completing the Special Tertiary Admissions Test (STAT).

Not much is known about alternative entry pathways into university. For example, we don’t know how many students have entered through these different routes over time, or how well used the pathways are by under-represented individuals. Nor do we know whether those entering via these different pathways do as well academically as those who enter via secondary education.

The study design

Our study focused on seven groups of under-represented students. These were: i) Indigenous students; ii) students with disability; iii) students from a low socioeconomic (SES) background; iv) students from regional and remote areas; v) students from non-English speaking backgrounds (NESB); vi) women in Science, Technology, Engineering and Mathematics (STEM) courses; and vii) mature age students (those aged 25 years and above).

We used a combination of data sources to look into these: i) administrative data on entry pathways from the Higher Education Student Data Collection, sourced from the Department of Education, Skills and Employment; and ii) linked unit-record data on domestic Bachelor degree students and the Student Experience Survey, provided by the data offices of 16 participating universities in Australia.

Trends in alternative pathways

Secondary education was the dominant pathway at the start of our data series in 2011, with just over half of university students in Australia entering this way. Over the next nine years, however, enrolment via secondary education gradually declined, to 45% in 2019. In contrast, higher education course transfers and VET/TAFE award completions gained popularity, increasing from 22% to 24% (higher education courses) and from 12% to 13% (VET/TAFE courses) between 2011 and 2019. The group of ‘other’ pathways (pathway providers, enabling programs, access schemes, portfolio entry) increased the most over this period, from 9% in 2011 to 14% in 2019. Interestingly, the use of mature-age entry provision halved, from 6% to 3%, while the use of a professional qualification to access university remained stagnant at around half a per cent.

Alternative pathways were important to under-represented groups trying to access university. In fact, over half of all under-represented groups accessed university via alternative pathways. More than three-quarters of Indigenous students and nine out of ten mature-age students came through alternative routes. The use of alternative pathways by most under-represented groups rose between 2011 and 2019. For example, for students from low SES backgrounds it rose from 56% in 2011 to 62% in 2019, although for regional and remote students it fell from 60% to 52% over the same period. We also found that Indigenous students’ use of alternative entry pathways fell from 79% to 75%, perhaps reflecting improvements in school achievement over the period. Alternative entry still remains, however, the most popular way for Indigenous students to enter university.

Student outcomes by university entry pathway

When we compared the academic performance of students from the various alternative pathways with that of students coming directly from secondary education, we found some interesting results. Generally, students from alternative pathways had poorer academic outcomes: they were less likely to stay on after their first full year of university study, or to complete their course, than those entering directly from secondary education. The students who performed worst came through the VET and mature-age entry pathways, being the least likely to complete their course compared with students from secondary education. Interestingly, students from pathway providers or enabling programs actually had stronger rates of retention and completion than secondary school entrants.

We also looked at the marks that students from different pathways achieved, both in their first year and over their whole course. We found results similar to those for retention and course completion: those entering via VET and mature-age entry provisions achieved poorer academic results over their course, although less so in their first year.

What does this all mean?

Our study highlighted how entry pathway can make a difference to how well students do at university, both in terms of the marks they achieve, and whether they complete their course. Some alternative entry pathways – namely enabling programs and pathway providers – did well compared to the traditional way of entering university, via secondary education. Others, notably VET qualification and mature-age entry provision (including STAT), didn’t fare well against secondary education entry. Given alternative entry pathways are on the rise, and this growth is unlikely to slow given the ongoing push for widening participation, higher education institutions need to think seriously about how to better support students coming in through these diverse routes. Furthermore, students from under-represented groups (or ‘equity’ groups) are highly represented in alternative entry pathways, further bolstering the need for support.

Additional academic support may help alternative entrants achieve better marks, particularly mature-age students and those coming from VET, both of whom may have built practical experience while lacking exposure to academic skill development. Strategies to increase alternative entrants’ sense of belonging in higher education may assist with retention, and clearly this needs to extend beyond the first year of study. Given their positive results, expanding enabling programs and upscaling entry by extending current and building new pathway partnerships appear sensible strategies to widen participation.

Ian Li is an economist based at the School of Population and Global Health, The University of Western Australia. He is interested in applied fields of health and labour economics, particularly on research on the determinants of well-being, economic evaluation of healthcare, graduate outcomes and higher education policy equity. Ian is a member of the UWA Academic Board, the Equity and Participation Working Group, and director of the Public Health undergraduate major. He is an editorial board member of the Journal of Higher Education Policy and Management and a co-editor of the Australian Journal of Labour Economics.

Denise Jackson is the director of Work-Integrated Learning (WIL) in the ECU School of Business and Law and researches student employability and career prospects through embedding meaningful work-based learning and industry and community engagement into the curriculum, as well as providing access to a range of employability-related activities. She sits on the National Board for the Australian Collaborative Education Network, the professional association for WIL in Australia, and maintains close links with industry through research projects, the WIL program and networking. She is also a Principal Fellow of the Higher Education Academy.

More Amazing Secrets of Band Six (part two ongoing until they fix the wretched thing)

EDITOR’S NOTE: 

When Simon Crook wrote The Amazing Secrets of Band Six last year for AARE, I had no idea it would become one of the all-time best read posts of EduResearch Matters (now number 15 out of nearly 500, with a spike during the HSC results period). Those of you who read Amazing Secrets last year will be familiar with the important points raised in the last few days in the Sydney Morning Herald regarding Band 6s and measuring HSC success [1], [2], [3], [4]. With any luck, through the SMH, these issues will reach a much wider audience and may provide incentive and leverage for key stakeholders to do something about the current state of play.

A Quick Recap

NSW is obsessed with HSC performance, particularly Band 6s. Every year, the SMH, Telegraph and other media outlets publish school ranks determined by numbers of Band 6s. The SMH also publishes the Honour Roll of those students who achieved Band 6 in each of their subjects. Yet it has already been shown you cannot compare Band 6s between different subjects, so you cannot tally total Band 6s and make a fair comparison between schools or students. In fact, some lower bands in more rigorous subjects actually contribute more to ATAR than Band 6s in less rigorous subjects. 

As previously described, the standards-based ‘Band Description’ model for the HSC was never designed for comparison between subjects. One of the creators and custodians of the HSC, Professor James Tognolini, reiterated last week that: 

“for better or worse there was no attempt to make the standards equivalent when the system was set up … in most subjects there was no attempt to align a band 6 performance in one subject with the band 6 performance in another. The purpose was to report what it is students know and can do, not make comparisons across subjects.” 

To answer Professor Tognolini’s ‘better or worse’: the situation is for the worse. Whatever the original intentions, most of society assumes the bands are equivalent – that a ‘Band 6 is a Band 6’. The whole media, parental choice and school marketing system perpetuates this flawed metric of comparison. It is tempting to blame the media, and the SMH in particular, for their role in this mess, but they are only reporting what they are allowed to report. As I pointed out last year, and as the SMH articles highlighted, more and better comparative and value-add (growth) data should be reported to provide a fairer narrative of both school and student achievement. The CSNSW paper the SMH references makes some good suggestions in this regard, including several alternative measures that could be published:

  • Non-HSC data, such as vocational education completion rates and post-school outcomes 
  • Median ATAR (or a suitable proxy for scaled marks)
  • Growth or ‘value-add’ (as suggested last year)
  • Band distributions, “which better show the range of achievements within schools, and any shifts over time”.

In order for this to happen, someone high up needs to provide the requisite permission. 

But the issue is not solely about which school performance data can be published in the media. 

It is also time to start seriously talking about improving the HSC as a whole. I’m not talking about getting rid of the HSC, or even a massive overhaul of the assessment, but evolving it in line with the education landscape in NSW in 2022 and beyond, rather than continuing with the same model devised last millennium.

A new education landscape of accountability

In the past twenty-odd years, the status of the HSC has evolved from the local NSW matriculation qualification affecting university entry into an incredibly high-stakes commodity that can make or break a school, principal, teacher or student. NSW government high schools are now accountable to the School Success Model, with targets for increased Band 5s and 6s. Some of these school targets, in particularly challenging local contexts, are unlikely to be reached, setting schools and individual subjects up to fail, or unduly influencing their educational offerings (see Detrimental Effects below). Many non-government schools and school systems have similar blanket accountabilities and targets, again setting certain locally challenged schools and subjects up to fail. The HSC was never designed to be used this way, so it must evolve accordingly.

Detrimental Effects 

While the NSW HSC is a strong, established credential of quality assessment for NSW school leavers, over time one particular well-intended design feature has produced counterproductive consequences. These consequences are detrimental to teachers and students, particularly in critically important HSC subjects – subjects key to the Australian economy, for example the sciences and technical and vocational (STEM) subjects. The particular design feature of concern is the inconsistent application of the HSC performance ‘Band Descriptions’ across different subjects.

There is extreme variation in the proportions of students allocated to each of the performance bands in different subjects – the 2021 band distributions, for example, made this stark.

This is NOT a fair go for all: under the current system, the science, technology and vocational subjects are essentially discriminated against. Despite this extreme variation, the band percentages are used as the primary measure of student and school achievement, including in merit lists and strategic targets. Thus bands have become the key driver of detrimental effects on teaching and learning:

  • Warped student subject choice: ‘able’ students are increasingly choosing (or being forced into) subjects with increased access to Band 6s, thereby prioritising access to Band 6s over academic rigour. This in turn negatively impacts future pathways, particularly for diverse cohorts, including  female representation.
  • Reduced school subject offerings: many schools are axing critical subjects and skewing their hiring and investment decisions towards subjects/faculties that yield more Band 6s. This is further exacerbated, and even intrinsically encouraged, by worsening skilled-teacher shortages in, for example, mathematics and the sciences
  • Accountabilities tied to Band 6s (see A new education landscape of accountability above)
  • Teacher performance measurement tied to Band 6s: blanket targets and teacher performance measures can have a devastatingly negative impact on staff teaching subjects with low proportions of Band 6s, contributing to the widely reported teacher shortage and retention problems in critical subjects, poor well-being and depleted morale, particularly alongside the existential threats of ‘dud ministers’

As mentioned, the use of bands in this way was never part of the design remit for the new HSC in 2000. But over the years the performance bands have evolved into high-stakes features. High-stakes indicators must be strong, reliable and valid. The variation in Band Descriptions, and in the proportions of students allocated to each band across subjects, means they are no longer reliable or valid as high-stakes performance indicators. They must be open to scrutiny and reform.

Evolving the HSC

There is one primary way to evolve the HSC: strategic reform of the bands. Reforming the bands needn’t be extensive or expensive, nor need it threaten the HSC standards approach or the ATAR. Bands could still allow for disciplinary differences, but with improved comparability and fairness. A loose band of academics and researchers and I have considered models that could be much simpler and cheaper than the current arrangements, yet strengthen the reliability and validity of bands as educational indicators. As a side benefit, they could also improve the clarity of standards and exemplar material in the ‘Standards Packages’, directly strengthening teaching and learning. We are currently making representations to key stakeholders to outline the details of these reforms.

We have used our collective expertise and have developed possible pathways to reform bands and sustain the HSC into the future. Such reforms would counter the detrimental consequences of current arrangements, mitigate emerging risks and ensure that the HSC remains a strong credential for the next generation of students in NSW. We need a fair go for all; it would be un-Australian to be otherwise. 

Dr Simon Crook is director of CrookED Science, a STEM education consultancy, and Honorary Associate at the School of Physics, University of Sydney. He works with primary and high school teachers and students around many aspects of science and STEM education, and assists the Sydney University Physics Education Research (SUPER) group with their work, including liaising with NESA regarding science syllabuses. His PhD research evaluated the impact of technology on student attainment in the sciences. Previously, Simon was a high school physics teacher.

The amazing secrets of band six (and what you should know)

Part two of this story was published in March 2022.

New South Wales, Day 1, Term 1. A whole-staff meeting to begin the school year. At some point after the Principal’s address, the Leader of Learning (or similar) starts their PowerPoint, going through the previous year’s HSC results subject by subject, particularly the number of Band 6s. In non-selective high schools, invariably the English, Humanities, Visual Arts, Music, PDHPE and even Maths teachers are patting themselves on the back and being lauded by the school leadership. This is in complete contrast to the responses from, and directed towards, the science teachers. Equally, when principals have their HSC performance review meetings with their own superiors, quite often it’s a tricky conversation regarding the ‘performance’ of the sciences.

What is the obsession with Band 6s? Band 6s sound elite – the very best. But the fact is that a Band 4 or 5 in a difficult subject such as Physics or Chemistry may make as big a contribution – or even a bigger one – to the ATAR (Australian Tertiary Admission Rank) (more on that later) than a Band 6 in, say, Music. Also, Band 6s are the only metric made publicly available and shared with the media.

Band 6s and exam results (raw and moderated; see Raw Results and Band 6s, discussed later) from NESA are not to be confused (but are so often confused) with ATAR scores, which are calculated by the Universities Admissions Centre (UAC). The ATAR is a rank, not a mark, indicating a student’s position relative to all the students in their age group in their state. The ATAR is calculated from an aggregate of ‘scaled’ marks across a student’s courses. This is totally different from ‘moderating’ by NESA in NSW (see later). Importantly (and often detrimentally) for teachers, only students are measured by ATARs; teachers are measured by number of Band 6s. So students in subjects that scale well, such as Physics, will receive a good ATAR contribution if they perform reasonably well in that subject. But ‘reasonably well’ – say, high Band 4 to Band 5 – doesn’t cut it for teachers measured by their number of Band 6s. It is also interesting to note that last year a Band 1 (a fail!) in Physics could rank higher than a Band 4 in Visual Arts and a Band 3 in Legal Studies, Ancient History, Business Studies and PDHPE.
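The distinction between a mark and a rank can be sketched in code. This is an illustrative toy, not UAC’s actual algorithm – the function name, the tiny cohorts and the aggregate values are all hypothetical – but it shows why the same aggregate of scaled marks can produce different ranks in different cohorts:

```python
# A rank is relative: it depends on where an aggregate sits within the
# whole cohort, not on the aggregate itself.

def rank_from_aggregate(my_aggregate, cohort_aggregates):
    """Return the percentage of the cohort at or below this aggregate."""
    at_or_below = sum(a <= my_aggregate for a in cohort_aggregates)
    return 100.0 * at_or_below / len(cohort_aggregates)

# Two hypothetical cohorts of five students each. The same aggregate
# of 350 yields very different ranks.
cohort_a = [300, 320, 340, 360, 380]
cohort_b = [340, 360, 380, 400, 420]
rank_from_aggregate(350, cohort_a)  # 60.0 - at or above 3 of 5
rank_from_aggregate(350, cohort_b)  # 20.0 - at or above 1 of 5
```

The real ATAR calculation is of course far more elaborate (scaling each course first, then aggregating, then ranking against the full age cohort), but the principle that a rank is a relative position is the same.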

Last millennium, in the UK, I was fortunate enough to have two-thirds of my A-Level Physics class attain ‘A’ grades in a non-selective government school. In NSW, it took me four years to achieve just one Band 6 across two non-selective schools. For the past few years, in my work supporting schools of all sectors in the sciences, I have regularly spent much of Term 1 advocating for science teachers, coordinators and principals who are feeling the heat over “poor Band 6 results”. I constantly witness (and have suffered first-hand) the negative impacts of these judgements on teacher and principal morale and well-being. Student well-being is also detrimentally affected by similar unfair judgements and subject comparisons in Year 11 and HSC Trial (raw) exam results.

And what is the cause of all of this anxiety? The completely flawed metric of comparison that is the ‘Number of Band 6s’. So why is this overly blunt measure, which appears in all school marketing literature, on school billboards, in the Sydney Morning Herald, and in the NSW vernacular, so flawed as a point of comparison? The answer is that it was never intended to be a point of comparison in the first place, particularly between subjects.

Standards-based Assessment

The NSW HSC is a standards-based assessment. The whole construct of NSW HSC standards-based assessment was devised by Dr John Bennett, former Chief Executive of the Office of the Board of Studies NSW (now NESA), as part of his PhD thesis. Dr Bennett’s supervisor was Professor Jim Tognolini, now Director of the Centre for Educational Measurement and Assessment at The University of Sydney, who has been a senior advisor on educational measurement issues for every state and territory education department and examination board, including NESA. The whole premise of the standards-based model is to maintain the integrity and consistency of measuring standards year on year within an individual subject, i.e. that a Band 6 in Chemistry in 2021 is comparable with a Band 6 in Chemistry in 2020. There is no mandate or mechanism to say that a Band 6 is comparable between subjects.

In this standards-based model, each subject has its own set of ‘Band Descriptors’ (now called Band Descriptions) describing typical student performance for each of Bands 2-6. These Band Descriptors were devised by the respective subject experts. They provide guidance for marking school-based assessments (although this raises an issue discussed later), but most importantly, they provide the standards against which subject-specific ‘Judge Markers’ measure student HSC examination responses. This means that the Band Descriptors for, say, PDHPE were devised by experienced PDHPE teachers and are judged annually by experienced PDHPE teachers; the same can be said for Physics, or any other subject. Consequently, the standards are different for each subject. If the Band Descriptors are different for each and every subject, and they are interpreted and judged differently in each subject, then we cannot use ‘Number of Band 6s’ as a comparison between subjects. It stands to reason: it is simply comparing apples with oranges.

Comparing Subjects

Following the release of the 2020 HSC results, in a quote in the SMH, Professor Tognolini reiterated

“we’ve never convinced the community that a band 6 in physics was not designed to be the same as a band 6 in biology or a band 6 in chemistry”

(This example is somewhat ironic since in the new science syllabuses every science subject has the same Band Descriptions, but the general point is being made by one of the original designers of the HSC itself). In the same article, Dr Timothy Wright, former Headmaster of Shore stated:

“it is really hard to get a Band 6 in say Chemistry and easier in say Business Studies”.

A few days prior, also in the SMH, a NESA spokesperson is quoted as saying:

“[the] number of Band 6s achieved in science courses can’t be compared with the number achieved in other courses”.

Consider the following examples comparing Band Descriptions between subjects:

Band 4

Business Studies

“demonstrates knowledge and some understanding of business functions and operations” 

Biology, Chemistry, Physics, Earth & Environmental Science, Investigating Science:

“demonstrates sound knowledge and understanding of scientific concepts”

How is “some understanding” comparable to “sound understanding”? It is not.

Band 6

Business Studies:

“demonstrates comprehensive knowledge and understanding of business functions and operations”

Biology, Chemistry, Physics, Earth & Environmental Science, Investigating Science:

“demonstrates an extensive knowledge and understanding of scientific concepts, including complex and abstract ideas”

Again, there is much greater rigour in the supposed equivalent Band Description in the sciences compared to the non-science. 

Band 5

Perhaps most telling of all is where the Band Description for a typical Band 5 student in any of the 2 unit science subjects is:

“applies knowledge and information to unfamiliar situations…”.

Applying knowledge to ‘unfamiliar’ situations doesn’t appear in any other subject apart from Mathematics (and there only for Band 6). If a student merely learns all of the content of a science, they cannot get above a Band 4 unless they can also apply their skills to unfamiliar situations, whereas this is not the case in any other subject.

Equity of Access to Band 6s?

In 2017, I attempted to publish an article in The Conversation entitled Battle of the Bands: HSC Physics and Chemistry bottom of the Band 6 charts (co-authored with my PhD supervisor and co-supervisor). The article was refined, approved by editors and ready to go, only to be pulled at the 11th hour for external reasons. The article looked at data for 25% of the State to determine the rate of access to Band 6s among all HSC subjects in NSW high schools. What was important about our analysis was that rather than compare blunt total numbers of Band 6s (which are readily available on the NESA website), we made a ‘common-cohort comparison’: how did individual students’ rate of Band 6s in one subject compare with their own rate across their other subjects?

The findings were staggering. Students in Physics and Chemistry (in non-selective schools) were only 26% and 27% as likely, respectively, to achieve a Band 6 as they were in the average of their other subjects. By way of comparison, students in PDHPE, Community & Family Studies and Society & Culture were twice (200%) as likely, and students in Music 1 and Design & Technology two-and-a-half times (250%) as likely, to achieve a Band 6 as in the average of their other subjects. In extremis, this was a tenfold, one-order-of-magnitude difference! That is hardly equitable access to Band 6s. Our findings confirmed what science teachers have been reporting for years: even though the most able students often studied Physics and Chemistry, the relatively low number of Band 6s awarded across the State in total, combined with the over-representation of selective schools in these subjects, left non-selective schools fighting over scraps in terms of access to Band 6s. Even in a school where, say, Physics performance is well above the State average, it is still destined to be below average compared with the other subjects in the same school (by definition, about half the subjects in any school must be below that school’s average when compared against each other, yet we persist with this type of in-house comparison).
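To make the ‘common-cohort comparison’ concrete, here is a minimal sketch in Python using entirely made-up results (the student names, subjects and bands are hypothetical, not our study’s data). For each subject it compares the Band 6 rate of that subject’s students with the same students’ Band 6 rate across their other subjects; a ratio below 1.0 means harder access, above 1.0 easier.

```python
# Sketch of a 'common-cohort comparison' on MADE-UP data.
# results[student] = {subject: band achieved}
results = {
    "s1": {"Physics": 5, "Maths Ext 1": 6, "English Adv": 6},
    "s2": {"Physics": 6, "Chemistry": 5, "Music 1": 6},
    "s3": {"Chemistry": 4, "PDHPE": 6, "English Adv": 5},
    "s4": {"Physics": 5, "PDHPE": 6, "Music 1": 6},
}

def relative_band6_access(results, subject):
    """Ratio of the Band 6 rate in `subject` to the same students'
    Band 6 rate across their other subjects."""
    in_subject, elsewhere = [], []
    for bands in results.values():
        if subject not in bands:
            continue  # only the common cohort: students who took the subject
        in_subject.append(bands[subject] == 6)
        elsewhere.extend(b == 6 for s, b in bands.items() if s != subject)
    rate_in = sum(in_subject) / len(in_subject)
    rate_out = sum(elsewhere) / len(elsewhere)
    return rate_in / rate_out  # 1.0 = equitable; <1 = harder; >1 = easier

for subj in ["Physics", "PDHPE"]:
    print(subj, round(relative_band6_access(results, subj), 2))
# Physics 0.4   (this toy cohort is 40% as likely to get a Band 6 in Physics)
# PDHPE 4.0     (and 4x as likely in PDHPE)
```

Even this toy data shows how the same students can look ‘weak’ in one subject and ‘strong’ in another purely because of differential access to Band 6s.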

Gaming the System

So if you can’t compare Band 6s, and it is more difficult to get Band 6s in some subjects than others, yet schools are still being measured by their numbers of Band 6s, what can be done?

A genuine, yet morally wrong, way to maximise Band 6s is to guide students away from subjects with a low frequency of Band 6s. We know this happens already with English: even though many students are better suited to English Standard, many schools push them into English Advanced, which has a higher frequency of Band 6s. If this strategy is applied to the sciences, then schools simply stop offering the sciences. This is happening already in some quarters, compounded by the shortage of science teachers, let alone science-trained teachers. In the short term, this could genuinely address some of a school’s shortfall of Band 6s, but only in the short term. If a school stops offering any of the sciences, particularly the traditionally ‘rigorous’ ones such as Chemistry and Physics, then the school will ‘residualise’ as aspirational families reject it and attend elsewhere, where the full complement of sciences is offered.

Raw Results and Band 6s

Further confusion and anxiety reign, with many schools, students and parents misunderstanding raw exam results and their relationship with performance bands. In every HSC exam, a student’s raw exam mark is moderated internally by NESA, subject by subject, based on the Judges’ interpretation of that year’s exam in line with that subject’s Band Descriptors. For example, in one particular subject, a raw HSC exam mark of 76% might be moderated up to a 90, i.e. Band 6, and a raw exam mark of only 18% might be moderated up to a 50, i.e. Band 2. In another subject, a 93% might moderate to 90 (Band 6) and a 52% to 50 (Band 2). However, as mentioned, many people, particularly students, parents and sometimes school leadership, don’t understand this. They think that a raw exam mark translates directly and uniformly to a band, i.e. that a raw exam mark of 90+ is needed for a Band 6 in any subject. Following the example above, a Band 6-performing student in the first subject, with a raw mark of 76 in their Year 11 exams or Year 12 Trials, may incorrectly think they are only operating at Band 4, and the adults around them may think so too. A statistically more commonplace example might be a student achieving only a 46% raw mark in the first subject in a school exam and interpreting that as a fail, whereas the same score may scrape a Band 4 when moderated in the HSC. This misunderstanding can lead to undue anxiety, misplaced self-deprecation, diminished self-efficacy, students dropping the wrong subjects and, yet again, flawed comparisons between subjects.
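One way to picture moderation is as a piecewise-linear mapping from that year’s raw band cut-offs (set by the Judges per subject) onto the fixed reported boundaries of 50, 60, 70, 80 and 90. The sketch below illustrates the idea only; the raw cut-offs are hypothetical numbers chosen to echo the example above, not actual NESA figures, and the real process is more involved.

```python
# Sketch of HSC-style moderation as piecewise-linear interpolation.
# The raw cut-offs below are HYPOTHETICAL, not real NESA figures.

def moderate(raw_mark, raw_cutoffs):
    """Map a raw exam mark onto the fixed moderated scale.

    raw_cutoffs: the raw marks judged to sit at the bottom of
    Bands 2-6, lowest first, e.g. [18, 32, 45, 63, 76].
    """
    moderated = [0, 50, 60, 70, 80, 90, 100]   # fixed band boundaries
    points = [0] + list(raw_cutoffs) + [100]   # matching raw boundaries
    for i in range(len(points) - 1):
        lo, hi = points[i], points[i + 1]
        if lo <= raw_mark <= hi:
            m_lo, m_hi = moderated[i], moderated[i + 1]
            if hi == lo:
                return m_hi
            # linear interpolation within the band
            return m_lo + (raw_mark - lo) / (hi - lo) * (m_hi - m_lo)
    raise ValueError("raw mark outside 0-100")

# A 'hard-marked' subject: 76 raw is already Band 6, 18 raw is Band 2.
hard = [18, 32, 45, 63, 76]
print(round(moderate(76, hard)))  # 90 -> Band 6
print(round(moderate(18, hard)))  # 50 -> Band 2
print(round(moderate(46, hard)))  # 71 -> a raw 'fail' scrapes a Band 4
```

Under these hypothetical cut-offs, the same raw 46% that looks like a fail in a school exam lands in Band 4 once moderated, which is exactly the misunderstanding described above.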

Where to from here with ‘Number of Band 6s’?

So comparing the ‘Number of Band 6s’ between subjects is completely untenable. Does this knowledge help principals? Right now, not in the slightest. They don’t need me telling them the pressure they are under for Band 6s. What we need is for all stakeholders to spell this out publicly and commit to no longer comparing subjects (and ideally schools) by the Number of Band 6s. This has to start with the media outlets responsible for publishing the league tables that feed this statewide obsession and very parochial NSW vernacular of ‘Number of Band 6s’. Along with the media, we need NESA (not just anonymous spokespersons), all school sectors, principals’ associations, parent bodies, teacher associations and universities to formally declare, and abide by, no longer publishing, advertising or comparing between subjects (and schools) using Number of Band 6s. Only by formally deprecating ‘Number of Band 6s’ can we get to a point where we have “convinced the community”. When I discussed this with journalists associated with this blog, the response was: “it’s the only metric we have. We’ve asked many times for this to change. More diverse data would stop the league tables.”

Performance Measurement

However, schools and school systems still need to measure performance. Moving beyond Number of Band 6s should not be a problem. During these COVID times, we have finessed our metrics from the blunt, not-so-useful ‘Number of Cases’ to more pertinent measurements such as ‘Cases in the Community’. With 75,000+ students annually sitting HSC exams, there is more than enough data to measure statistically significant ‘value add’ performance of every school, subject and even teacher (if class sizes are large enough). Mathematically, this is achieved by multiple regression analysis, controlling for all other variables such as gender, socioeconomic status, school type etc. (see an example of multiple regression analysis here). Such in-house data sets are already in use: in the Department of Education with Scout, in Catholic schools with the CSNSW HSC Analysis Project, and in independent schools with various analysts. In Victoria, there has been a long-established effort to celebrate value add through the Schools that Excel lists, though what is suggested above would be far more finessed.
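The core of any ‘value add’ measure is the residual: how much better (or worse) a result is than what the controls predict. As a minimal sketch on made-up data, the example below controls for a single variable, prior attainment, using simple least-squares regression; a real analysis would be a multiple regression over many controls (gender, SES, school type, and so on) with proper error margins.

```python
# Minimal 'value add' sketch on MADE-UP data: regress HSC marks on
# prior attainment, then treat the residual as the value added.
# A real analysis would use multiple regression with many controls.

def ols(xs, ys):
    """Closed-form simple linear regression: returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope

# (prior attainment, HSC mark) pairs for one subject, hypothetical
prior = [55, 60, 65, 70, 75, 80, 85, 90]
hsc   = [58, 61, 68, 71, 77, 79, 88, 92]

a, b = ols(prior, hsc)

# Value add = actual mark minus the mark predicted from prior attainment.
# Positive residual: the result beat expectation given the controls.
for p, h in zip(prior, hsc):
    predicted = a + b * p
    print(p, h, round(h - predicted, 2))
```

Ranking by residuals like these, rather than by raw Band 6 counts, is what lets a school serving a low-attaining intake demonstrate genuinely strong teaching.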

If the expectation is there for public comparison of all schools, the data exists, provided all schools are willing to share it. But there’s the catch. Given privacy laws covering individual student information, and copyright over the data, no detailed whole-of-state comparison of the kind suggested above, i.e. across all three sectors, can be published publicly unless everyone signs up to it, which is unlikely. But does that really matter?

Publishable Performance Metrics

Stakeholders need to decide: are they willing to share all of their data so that true ‘value add’ measurements (with error margins) can be reported fairly? Or is everyone happy with league tables that rank schools only within each subject, taking into account all bands (Number of Band 6s alone is no longer an option)? Or do we even need public ranking and comparison at all? The first option, as mentioned, is unlikely to be agreed to, and would require a level of statistical numeracy across society that doesn’t exist, as was evident in dealing with COVID. The second option is completely achievable: schools could be ranked within individual subjects. This would eradicate the current inaccurate comparison between subjects, but would continue to perpetuate the anxiety induced by comparison between schools. So ultimately, do we even need to publish relative performance metrics publicly at all if they are essentially meaningless and harmful, or can we keep these in-house to help monitor progress and improve the education of our students? Either way, we must stop comparing subjects using the Number of Band 6s.

Dr Simon Crook is director of CrookED Science, a STEM education consultancy, and Honorary Associate at the School of Physics, The University of Sydney. He works with primary and high school teachers and students around many aspects of science and STEM education, and assists the Sydney University Physics Education Research (SUPER) group with their work, including liaising with NESA regarding science syllabuses. His PhD research evaluated the impact of technology on student attainment in the sciences. Previously, Simon was a high school physics teacher for 15 years.

ATAR is a university marketing tool: 4 reasons to stop obsessing about it

The recent ‘revelation’ that Australian universities are not sticking to their advertised course cut-offs has caused a ruckus. Some even see it as a scandal: universities are admitting students with much lower (gasp) than advertised Australian Tertiary Admissions Ranks (ATARs), even into ‘top’ courses.

I think it is time to look at some facts around ATARs. I have four important ones for you. I believe everyone concerned about or discussing ATARs should know these facts.

Fact 1: Most university place offers are not made on the basis of the published ATAR.

Around two-thirds of the university places offered in Australia each year are made to students who do not have an ATAR. Almost 50 per cent of new university students are mature age, international, vocationally qualified or will have come to university through a myriad of alternative entry schemes.

Direct entry to university is growing exponentially at some universities, with the ATAR bypassed altogether. Direct entry, mature-age and international students, and students who come through VET pathways make up the majority of the Australian university cohort.

In my own state, Victoria, most courses that make offers to students through the Victorian Tertiary Admissions Centre (VTAC) do not publish ATARs for those courses. Yes, that’s right, most courses. Of the minority that do publish an ATAR for a course, two-thirds made more than 30 per cent of their offers to students with lower ATARs than the published figure.

All universities award ATAR bonus points. These extra points and how they are determined are not regulated in any way, nor are they usually transparent. Universities can award bonus points as they wish and for whatever they wish. This furtive awarding of points is disguised as recognising “leadership”, “community-mindedness” and other qualities of applicants.

Fact 2: The ATAR is not a score.

The ATAR is a numerical, relative ranking derived from senior high-school performance and a complex series of scaling and other adjustments. In a relative ranking system, students in one year’s cohort are ranked against each other.

An ATAR of 49 does not mean a student has failed; it means the student is ranked at the 49th percentile of that year’s cohort in terms of academic performance, as measured and scaled according to a complex series of mechanisms. In a cohort of, say, 45,000 students, a student with an ATAR of 49 has an academic performance equal to or better than around 22,000 students that same year. Hardly a failure.
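The distinction between a rank and a mark can be shown in a few lines. The toy cohort below uses invented marks and ignores the real scaling machinery entirely; it only illustrates that a rank reports position within the cohort, not quality of work.

```python
# Toy illustration that an ATAR-style number is a percentile RANK,
# not a mark. Marks are MADE-UP; real ATARs involve complex scaling.
cohort_marks = [42, 55, 61, 61, 70, 74, 80, 88, 93, 97]

def percentile_rank(mark, cohort):
    """Percentage of the cohort performing at or below this mark."""
    at_or_below = sum(m <= mark for m in cohort)
    return 100 * at_or_below / len(cohort)

print(percentile_rank(70, cohort_marks))  # 50.0: middle of this cohort
```

A ‘rank’ of 50 here says nothing about whether a mark of 70 is good work, only that half the cohort sits at or below it, which is precisely why a low rank is not a fail.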

And similarly, no matter how bright they are, nor how hard they or their teachers work, no more than ten per cent of students’ ATAR rankings will be in the top ten per cent of rankings. That’s how ranking works.

Fact 3: The ATAR is linked to socioeconomic status.

The evidence indicates that ATAR scores are correlated with socioeconomic status and social capital. To put it simply, the higher the socioeconomic status and capital of the student, the higher the ATAR is likely to be, and vice versa.

For example, poor people in rural areas generally have lower ATARs than rich people from metropolitan areas. But poor people are not stupid and do not compromise educational standards or outcomes. They just have less of the social and cultural capital that counts for school education outcomes (and, therefore, ATARs). No matter how tempting it is to think otherwise: an ATAR rank is not a measure of intelligence, motivation, diligence, aptitude or ability.

Fact 4: The ATAR is now used primarily as a marketing tool aimed at an under-informed public

The ATAR was more important when the supply of university places was limited and demand for these exceeded supply. Cut-offs were a useful strategy for allocating too few places. However, in our current demand-driven system of university places, where there are few limits on the number of students a university can enrol, the ATAR is used primarily as a marketing tool. Universities rely on folk believing that the higher the ATAR, the better the quality of the course and possibly, the better the university. But what is it better at?

Many assume, understandably but incorrectly, that the higher the ATAR needed to get into a course of study, the “better” the quality of the course. But the ATAR has no correlation with objective measures of course quality. The simple truth is that the higher the ATAR for a course, the more popular the course is among school leavers.

The public are currently being misled by what is essentially a clever marketing system using ATARs as proxies for the quality of courses and institutions. It needs to stop, and Peter Shergold, the head of the federal Higher Education Standards Panel, has recently announced that the Panel will begin to increase transparency around this issue.

It is time to stop obsessing about entry standards and start focusing on exit standards

What we should be focused on as a society is what happens to students, regardless of their entry method, during their university study and after graduation. Many students who have very high ATARs come unstuck at university when the intensive support and guidance, to which they had become accustomed, falls away.

As Tim Pitman from the National Centre for Student Equity in Higher Education has recently emphasised, the point of university education is not to validate entry standards but to educate, value-add and ensure high quality outcome standards. We all know that elements of effective university education and high quality learning outcomes go far beyond the supposed standard at which the students enter the university. Teaching quality, the curriculum, learning support and student support are just some of the most obvious.

All universities must put in place proactive support structures, processes and programs to ensure all the students to whom they give access can meet their potential and have the highest chance of success possible.

I often ask: When a university graduate seeks employment, how many sensible employers will ask them to reveal their ATAR from all those years ago? On the other hand, how many will be interested in what the graduate knows, can do, and can contribute?

The main priority should be to focus on exit standards and outcomes: where students end up, not where they started. If we restrict access to university only to those guaranteed to succeed based on previous education scores, we block a life-changing opportunity for tens of thousands of people every year.

It’s important to keep educating a wide range of students

University education is now open to more students than in the past when it was just available to white, upper-class men. This is good for students, their futures, their families, the economy and society. Successive governments of both sides have encouraged and supported increased access to university education for a larger number and broader range of people. The alternative is to have fewer people educated at the highest levels and subsequent reduced capacity to lead and innovate in a rapidly changing world.

Case studies at my own universities show that despite starting with very low ATARs, those who go on to successfully complete courses will graduate as qualified professionals and subsequently contribute to the economy, their communities and society in enhanced ways.

What matters most about university education is the quality of the education offered and the capacity and knowledge of graduates and whether they can do what governments and society expect of them, having had the privilege of access to education at that level.

If the purpose of university education is to contribute to an educated society, that treats its members and members of other societies with dignity, respect and kindness, while simultaneously advancing economic, environmental and other fronts, then we should unburden ourselves of outdated and inaccurate notions about the power of a single number.

I believe we need to focus more closely on how to facilitate success for the many, rather than the few.


Professor Marcia Devlin is a Professor of Learning Enhancement and Deputy Vice-Chancellor (Learning and Quality) at Federation University Australia.  @MarciaDevlin