
Teaching of synthetic phonics in Australia based on flawed evidence

What is phonics for? Where does it fit into an overall pedagogy of literacy? Without clear answers to these questions, the contestants in the phonics debate will continue to circle each other like blindfolded prizefighters.

The aim of literacy teaching is to produce readers who tackle texts on paper or screen with confidence and understanding, so that they can learn, enjoy their reading and, when appropriate, read aloud with fluency and expression. But to beginners the marks on the page are arbitrary, meaningless squiggles. Even those which correspond to words they understand when spoken to them cannot yet be related to meaning. Therefore the overriding aim of phonics is the efficient identification of unfamiliar printed words.

So who needs to be taught phonics and when?

Some children are enabled to bridge that gulf by being read to copiously, and joining in the reciting of the texts they have heard so often they have them off by heart, until they twig the essential insight that what they are saying is represented by what they can see. For them, phonics is not only unnecessary, but may be a hindrance. Therefore phonics has no place in the teaching of reading to young fluent readers, and testing their ‘phonic knowledge’ is irrelevant and risks causing them to regress in their learning.

What of the children who arrive at school not yet reading? Their most important immediate task is to learn to read, so for them the purpose of phonics is to provide a quick start on the identification of regularly-spelt words, alongside the essential (for English, with its complex orthography) learning of some basic high-frequency but irregularly-spelt words as sight words.

The experimental evidence shows clearly that phonics in this context works for both normally-developing children and those who are falling behind. But the same body of evidence also shows that (a) the teaching must be systematic and not incidental; (b) it must be embedded in a broad and rich language and literacy curriculum, because there is much more to reading than just word identification, and therefore phonics alone does not constitute teaching children to read.

The flawed case for teaching only synthetic phonics

The message about embedding phonics in a rich curriculum is there loud and clear in the Rose Report (2006), published in England and later used as evidence to impose a national phonics test there. Advocates of the phonics test, and of the associated teaching of “synthetic phonics”, in Australia regularly cite this report, especially to argue that Australia should impose the teaching of synthetic phonics on all school beginners.

I was present during the presentation of evidence to the Rose enquiry, and I believe Jim Rose overstated the case for synthetic phonics in the subsequent report. Nevertheless, Rose’s very basic message about embedding phonics in a rich curriculum got lost in the controversy his report stirred up around whole-word versus phonics teaching.

Jim Rose mostly stuck to saying phonics teaching must be systematic, but in places elided that into saying that systematic phonics is synthetic phonics, which the experimental evidence did not justify, and still doesn’t. There is as yet no evidence that any one form of phonics teaching produces better progress than any other form of phonics.

Following the Rose Report there was a noticeable increase in the number of phonics-based intervention schemes for struggling readers in England, but it was only after the change of government in 2010 that strong official pressure was put behind synthetic phonics, often using a flawed and partial interpretation of the research evidence, and Rose’s conflation of systematic phonics with synthetic phonics. This was expressed in the misleading slogan-like mantra that ‘Synthetic phonics is the best way to teach children to read’, ignoring all the caveats about embedding phonics in the broader curriculum and the dearth of supporting evidence.

It is ironic that that line is actually at odds with the latest (2013, p.13) version of the national curriculum for English in England, which has this to say:

Skilled word reading involves both the speedy working out of the pronunciation of unfamiliar printed words (decoding) and the speedy recognition of familiar printed words. Underpinning both is the understanding that the letters on the page represent the sounds in spoken words. This is why phonics should be emphasised in the early teaching of reading to beginners (i.e. unskilled readers) when they start school.

This is much more balanced than many public pronouncements. Moreover, it implies that phonics teaching is essentially time-limited. As soon as children ‘get it’ or are seen to have ‘cracked the code’, only occasional reinforcement of sounding-out and blending for unfamiliar words is needed. As Jeanne Chall put it 50 years ago in Learning to Read: the great debate, once children have developed the ability to identify written words, teaching further phonics ‘is sheer madness’.

Teachers don’t need a national test to tell them about their own students

What of those children who don’t ‘get it’ the first or second time? There are a few for whom phonics simply doesn’t work, but they are rare and exceptional. An Australian friend who taught in an Infants school (Years 1-2) in England for over 30 years says that she was unable to unlock the door of initial literacy for just one child in all that time. There are others who struggle and fail to progress well. Observant teachers know perfectly well who they are, and need deep professional knowledge to understand them and work round their difficulties. Teachers don’t need a test to identify children who are struggling. When teachers in England were asked about the phonics test a great many said it didn’t tell them anything they didn’t already know.

What teachers of initial literacy do need is better support for helping the strugglers, which was supposed to be part of the follow-up to the phonics test, but is notable by its absence. Money put into that would be well spent, which the money spent on the phonics test is not.

In the first three years of national operation, the phonics test in England cost £44,000,000 – what a waste! Spend your Australian dollars on good professional development instead!


Greg Brooks is Emeritus Professor of Education at the University of Sheffield, UK. He was a member of the Rose Committee (2005-06). Greg was chairperson of the Federation of European Literacy Associations (2013-16) and has researched and written widely on the initial teaching of reading and spelling, especially through phonics.

Greg is a contributor to the book Reading the Evidence: Synthetic Phonics and Literacy Learning edited by distinguished researcher Margaret Clark (OBE) that will be launched at the AARE 2017 conference in Canberra on Wednesday 29th November. Other contributors to the book are Misty Adoniou, Terry Wrigley, and Henrietta Dombey.

AARE 2017 Conference

The theme of the 2017 AARE conference is ‘Education: What’s politics got to do with it?’ There will be over 600 presentations of current educational research and panel sessions at the conference over the next five days. Journalists who want to attend or arrange interviews should contact Anna Sullivan, Communications Manager of AARE. Follow the conference on Twitter at #AARE2017.

Follow this blog as conference goers blog about their presentations through the week.


How the national phonics test is failing England and why it will fail Australia too

A national test of phonics skills will not improve faltering literacy standards in Australia. The test is being imported from England, where it has been in place since 2011 and has failed to improve national standards in reading. Instead, the phonics frenzy of testing and practising nonsense words that has accompanied the implementation of the test appears to be narrowing classroom practice and damaging literacy standards.

The test itself is ill-conceived and poorly structured. Should we wish to test the phonological awareness of our six-year-olds, this test would be inadequate.

So how did we end up even considering the test for Australian children? The process that led to this test being recommended for all Australian six-year-olds was deeply flawed, and is an unfortunate example of the growing influence of ultra-conservative think tanks on educational policy.

What is the phonics screening check?

The phonics screening check is a test devised in England. It is conducted one-on-one with Year 1 students (typically aged 6). The children are presented with 40 decodable words. Twenty are pseudo words, indicated as such by an accompanying alien icon. The other 20 are real words, but ideally unknown to the students.

The rationale is that this is a test of pure phonic knowledge, not vocabulary or sight word knowledge. Students need to score 32 from 40 to pass the Check. Those who don’t pass are given intervention using a government mandated synthetic phonics program.

Why was the check recommended and who was involved?

Jennifer Buckingham from the conservative think tank, the Centre for Independent Studies, was appointed to chair the panel that was tasked with conducting an independent review of the need for Year 1 Literacy and Numeracy checks.

Dr Buckingham was a public advocate for England’s Phonic Screening Check before she was appointed to head the review and write the report. And she continued to publicly advocate for the Check whilst conducting the review, and before the review’s final report was released. So the report’s findings were not surprising.

What was surprising was the report’s lack of reference to any of the peer reviewed research studies that have been conducted on the Phonics Screening Check since its introduction in England.

A review of that research finds little value in the Phonics Screening Check.

The phonics check is not helping England; in fact England is going backwards

The Check is not improving reading comprehension scores in England. This year’s literacy test results are disturbing.

Scores on the phonics screening check do not correlate with scores in reading comprehension tests as measured by England’s national SAT reading tests in Year 2 and Year 6.

In 2016, 91% of Year 1 students passed the Phonics Screening Check. This was lauded as evidence the Check was working because it was forcing teachers to focus on phonics, and therefore students were passing the Check at higher rates than ever before.

In 2017 these ‘successful’ phonics-ready students sat their Year 2 Key Stage 1 reading comprehension test. To pass this reading comprehension test, children only had to score 25 from 40 questions. However, only 76% passed. And only 61% of low SES students passed the test.

It appears then that being poor has more to do with your reading comprehension achievement than knowing your sounds.

It also seems the phonics check hasn’t solved the gender puzzle in reading achievement, as girls consistently outperform boys on both the phonics check (by 7 percentage points in 2017) and the reading comprehension tests (by 9 percentage points in 2017).

Again in 2017, Year 6 children sat the Key Stage 2 reading comprehension test. These are children who sat the Phonics Screening Check in 2011. Those who didn’t pass were placed in synthetic phonics programs mandated by England’s Department for Education until they passed the Check. Yet this year only 71% reached the minimum benchmark in their Year 6 reading comprehension test.

Thus, in 2017, more than 1 in 4 English children in Year 6 are not able to read with basic comprehension. The phonics inoculation they were given in their early years patently hasn’t worked, and there is trouble ahead as they move into high school. England should feel very nervous about the next round of PISA results.

The test fails to deliver on any of its claims

Buckingham’s report to the Minister describes the check as a ‘light-touch’ assessment. The research indicates that this is a problematic claim on two counts. It is too ‘light’ to identify and diagnose reading difficulties, but its prominence as a mandatory standardised assessment means its influence on literacy instruction has not been ‘light’.

As a short assessment, it assesses a limited range of phoneme/grapheme relationships, which limits its use as a phonics check. Recent research in England, which pointed this out, goes on to question the purpose and validity of the check.

As a partial assessment of only one reading skill it cannot give a diagnosis of a reading difficulty, and it can offer no direction for subsequent interventions.

Indeed the check has been found to be no more accurate than a teacher’s judgement in identifying struggling readers.

In short, the check doesn’t tell teachers anything they didn’t know already. And it doesn’t tell them what kind of instructional intervention their identified strugglers need.

Heavy-handed effect in England

The phonics screening check has had a very heavy-handed effect on literacy instruction in England. The UK Literacy Association claims it has failed a generation of able readers in the UK.

Students who don’t pass the check are required to re-sit the test after a year-long participation in the government-mandated synthetic phonics program. These programs relentlessly drill the children in out-of-context phonic decoding to prepare them to read the unknown or alien words in the check. The deliberate focus on these non-meaningful words has shifted the focus of literacy instruction away from meaning, despite evidence that the ability to read pseudo words is not a good predictor of later reading comprehension.

England now has the farcical situation where literacy time is spent teaching struggling Year 1 and Year 2 readers to decode pseudo words to pass a test.

As a consequence of the over-emphasis on synthetic phonic decoding skills, other reading skills have been sidelined. The very purpose of reading, comprehension, has dropped off the instructional agenda as schools focus on ensuring their students pass the phonics screening check.

Flawed reasoning behind recommending the test 

The report provided to the Minister by the panel headed by Buckingham claims the check is required because early reading assessments currently used in every state and territory in Australia are inadequate. The report provides a table of ‘necessary’ components of a phonics check, although it is not made clear what research has been drawn upon to come up with those components.

Notwithstanding this limitation, analysis of the English Phonics Screening Check shows that it does not even meet the panel’s own requirements for a valid phonics check. Indeed, the existing Northern Territory Foundations of Early Literacy Assessment (FELA) meets more of the panel’s criteria than the proposed Check does.

The Check contains both real and pseudo words. The real words are ideally not in the children’s existing vocabulary. The rationale for the inclusion of pseudo and unfamiliar real words is to ensure the children are relying solely on their phonic knowledge rather than prior familiarity with the word. Thus the check is supposed to be a pure assessment of phonic knowledge.

It does not do that. The test itself is flawed.

Detailed analysis of the flaws in the test

An analysis of 10 of the 40 words in the 2017 English Phonics Screening Check is provided below. The analysis confirms research findings that the Check is neither a pure test of phonic knowledge, nor an accurate assessment of phonic skills.

Scoring real word decoding in the 2017 Check

To achieve a correct answer on the 20 unfamiliar real words in the check, a student must correctly read the ‘real’ word, and not use any other plausible phonic decoding for that word. This makes the Check a vocabulary test rather than a phonics test.

For example, ‘groups’ must be read so the ‘ou’ is pronounced as /oo/. If the children decode this word with the ‘ou’ pronounced as /ow/ as in ‘house’, or /u/ as in ‘tough’, they are marked wrong.

As such, the child is marked on their existing knowledge of the word and its pronunciation. The children who used other accurate phonic possibilities for the letters ‘ou’ are marked incorrect, and we are left with inaccurate information about their phonic knowledge.

Similarly ‘chum’ must be read with the ‘ch’ pronounced as /ch/ in chip, not /k/ as in Chris or /sh/ as in chef.

‘Blot’ must be decoded to rhyme with ‘hot’. If the ‘o’ is pronounced as the ‘o’ in ‘so’ or ‘go’, the student is marked wrong.

The ‘oa’ in ‘goal’ must be pronounced to rhyme with ‘foal’. If the student breaks the word into go – al, using the pattern found in ‘boa’, they are marked as wrong.

These examples show the children are being marked on their vocabulary knowledge, not their ability to use phonic knowledge. They are being marked wrong, despite plausible phonic decoding, and as such we have not gathered accurate information about their phonic strengths and weaknesses.

Scoring pseudo word decoding in the 2017 Check

To achieve a correct score for the pseudo words, the students must decode the word using only the phonemes identified in the marking guidelines.

For example, the pseudo word ‘braits’ is only marked correct when the ‘ai’ is pronounced as /ay/ as in ‘rains’. If the child decodes the word using the /a/ in ‘plaits’, or the /e/ in ‘said’, they are marked incorrect.

Given ‘braits’ is not a real word it is unclear why only one phonological interpretation is allowable. And it is unclear what we have learned about the child’s phonological skills, given they were marked wrong when their decoding was correct.

The pseudo word ‘zued’ is only marked correct if the ‘ue’ is pronounced as /oo/ as in ‘too’, ‘to’ or ‘two’. If the students use any other pronunciation of ‘ue’, as heard in ‘duet’, ‘cruel’, ‘suede’ or ‘cue’, they are marked incorrect.

The ‘ue’ pattern is assessed yet again in the pseudo word ‘splue’. Once again the only decoding effort marked as correct is the /oo/ as in ‘too’, ‘to’ or ‘two’.

The ‘ue’ digraph is being tested twice in 40 words, and with only the one pronunciation marked as correct. It leaves unanswered how the ‘ue’ in ‘cue’, ‘league’, ‘duet’, ‘cruel’, and ‘suede’ might be assessed.

‘Tay’ is designated a pseudo word in the 2017 test, which I’m sure the Scots would be surprised to hear given it is the name of Scotland’s longest river. Another reason for Scottish independence perhaps?

To score correctly on this word the students must rhyme it with ‘pay’.

However it turns out ‘tay’ is also a real word in the vocabulary of a Turkish 6 year old. It is the Turkish word for a baby horse – pronounced like the English word ‘tie’ to rhyme with ‘aye’.

It is also a real word in the vocabulary of a Vietnamese 6 year old. It is the Vietnamese word for hand – also pronounced like the English word ‘tie’ to rhyme with ‘aye’.

What information has been gained, or missed, about these children’s linguistic competence by marking their decoding as incorrect?

Scoring two syllable words in the 2017 Check

There are 36 one-syllable decodable words in the Check, and four two-syllable words. The two-syllable words are particularly problematic when using a synthetic left-to-right decoding method, which is the theoretical basis of the Check and of the accompanying mandated instructional interventions.

‘Model’ was one of those four two-syllable words in 2017.

If the word is decoded left to right using synthetic decoding processes we are likely to read the word as ‘mo’ to rhyme with ‘so’ and ‘del’ to rhyme with ‘hell’. As a consequence we end up with a word that sounds like the way we pronounce the word ‘modal’. If a child decodes the word in this manner, they will be marked wrong in the check.

It is necessary to have the word ‘model’ in your vocabulary to pronounce it correctly: to know which syllable takes the emphasis, and to know that the second vowel is reduced to a schwa sound.

‘Reptiles’ is also on the 2017 test. Using the left-to-right approach taught in synthetic phonics programmes, this word can be plausibly broken up as follows: rep – til – es. This would be marked wrong in the Check. Marking such an attempt as incorrect fails to take account of the phonic knowledge the student has. Consider, in contrast, if the word had been ‘similes’. If the child had broken the word into si – miles, they would have been marked as incorrect. The only way a child would know to break the words into rep – tiles, or sim – il – es, is if they already have the word in their vocabulary.

So the check fails at even what it purports to do, that is, measure phonological processing.

England should pull the plug on this test

The Phonics Check has failed to deliver the desired improvements in reading comprehension in England. It was worth a shot, but it is time to pull the plug.

It has failed because it attends to only one early reading skill, and thus distorts reading instruction in the early years to the detriment of reading comprehension in the later years.

It has failed because the Check is faulty, and ill constructed. It is unable to successfully assess the one skill it seeks to assess, phonological processing, and as such cannot even provide accurate diagnostic information to teachers.

Facing our literacy challenges in Australia

Australia can avoid falling into the same trap. Like England, we clearly have literacy challenges in the upper years of primary and secondary school. Our NAPLAN results for Year 7 and 9 make this very evident. But these are not challenges with the basic skills of phonological decoding of simple words and nonsense stories of Pip and Nip. These are challenges with depth of vocabulary and the capacity to deal with the complex syntactic structures of written texts across the disciplines.

It is crucial the State and Territory Ministers of Education are not distracted from these real challenges by placing false hope in a Phonics Screening Check. It is time to dump the idea.


Misty Adoniou PhD is Associate Professor in Language, Literacy and TESL at the University of Canberra. She was a primary school teacher for 10 years before moving to Greece and teaching and consulting in the area of English Language Teaching for 7 years. She has received numerous Teaching Awards including the Vice-Chancellor’s Award for Teaching Excellence, and was the Lead Writer of the Federal Government’s Teachers’ Resource for English Additional Language/Dialect (EAL/D) learners. She sits on a number of national and international advisory boards as a literacy expert.


Misty Adoniou, with UK Professors Greg Brooks (a member of the Rose Report panel), Terry Wrigley, and Henrietta Dombey, has contributed to a book edited by distinguished researcher Margaret Clark (OBE), published this month, outlining how England came to adopt the Phonics Screening Check and providing more detail of its impact on teaching and learning as well as its costs.

Margaret Clark (Ed.) (2017) Reading the Evidence: Synthetic Phonics and Literacy Learning. Glendale Education, Birmingham. Available on Amazon.