This post was written before Alan Tudge took leave from his position as the Minister for Education. But he’s not the only one to bang on about ‘evidence’ without really being clear what he means.
There can be little argument against wanting university Schools of Education to impart to their students knowledge premised on systematically acquired evidence. It is irrefutable that teacher educators want their students to leave university and enter the classroom confident in the delivery of best practice. However, the requirement for ‘evidence-based practice’ is in danger of becoming a political polemic in which knowledge is obfuscated by ideology, rather than being the outcome of systematic investigation.
Writing in The Australian, Paul Kelly ‘reflects’ on the then Federal Education Minister Alan Tudge’s ‘drive’ to ensure universities impart ‘…evidence-based practices such as phonics and explicit teaching instruction methodologies.’ The former Minister issues a warning that he will use ‘the full leverage’ of the Federal Government’s $760m budget to insist ‘…evidence-based practices are taught…’ in universities. Yet the threat is based more on an assumption that evidence-based practices are not being taught in our universities than on any substantial evidence that they are not.
It is ironic that the former Minister should argue for something on the basis of a lack of evidence. Aside from this point, questions arise around the nature of the evidence the former Minister considers bona fide in relation to practice. This is an issue around which there is a distinct lack of clarity. The former Minister clearly states what he does not want, which includes sociology and socio-cultural theory. His wish to see the demise of whole branches of thinking is questionable, given that it is usually dictatorial regimes that close down thought in their nations’ academies. He wants a tightly prescriptive curriculum for teacher education. In this respect, he appears to be following the Conservative administration of Boris Johnson in Britain, where a similar proposal has been tabled for English universities, resulting in some of the world’s top universities describing the plan as deeply flawed and likely to have damaging consequences. If Boris Johnson wants something and Oxford and Cambridge consider it foolhardy, the weight of opinion surely tilts in favour of the academies.
The point remains as to the kind of ‘evidence’ upon which evidence-based practice is premised. What may pass as ‘evidence’ is not necessarily bona fide ‘knowledge’. All research, including educational research, involves knowledge that is acquired by means of rigorous, systematic investigation within clearly defined parameters. Even so, the outcomes of an investigation may be influenced by a number of factors, including: ontological perspective; the framing of the research questions; methodological approaches; analytical methods; researcher interpretation; and the degree to which any funding body remains impartial. Ultimately, before it can take its place in the pantheon of evidence, research must be interrogated by means of independent peer review and subsequently published in a highly respected, discipline-relevant journal. Even then, what appears to be good evidence can sometimes prove disastrous in its outcomes. We do not know if the ‘evidence’ to which the former Minister refers satisfies these requirements. What is certain is that the ‘evidence’ used by Paul Kelly to suggest universities are ‘failing’ their students and the nation’s schools does not meet most of these standards of respected research.
It was an Australian doctor, William McBride, who in 1961 published a letter in The Lancet suggesting that thalidomide had negative consequences, drawing attention to the possible fallacy of evidence. Randomised controlled trials (RCTs) in rats had shown the drug to be effective in controlling morning sickness, but it took the observation of multiple cases to prove the drug was not fit for purpose.
So, what kind of ‘evidence’ is being referred to by the former Minister when he rightly insists we need to ensure that pedagogy is ‘evidence-based’? Is he referring to evidence derived from primary research, such as randomised controlled trials (RCTs) and observational studies, or secondary research, including systematic reviews of the research literature? The fact is there is no single type of evidence. It is generally recognised that different evidence types have different methodological strengths. At the pinnacle of the ‘hierarchy of evidence’ are systematic reviews, followed by RCTs, cohort studies, and then case-control studies, case reports and, finally, expert opinion. Without identifying the type of evidence to which he refers, the former Minister appears to resort to lay opinion disguised as evidence.
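The hierarchy described above is, at heart, a simple ranking. Purely as an illustrative sketch (the names and function below are my own, not anything the former Minister or Kelly has specified), it can be expressed like this:

```python
# Illustrative only: the 'hierarchy of evidence' described above,
# ranked from strongest (index 0) to weakest methodological strength.
HIERARCHY_OF_EVIDENCE = [
    "systematic review",
    "randomised controlled trial",
    "cohort study",
    "case-control study",
    "case report",
    "expert opinion",
]

def strength_rank(evidence_type: str) -> int:
    """Return the rank of an evidence type (0 = strongest).

    Raises ValueError for anything outside the hierarchy --
    e.g. lay opinion disguised as evidence.
    """
    return HIERARCHY_OF_EVIDENCE.index(evidence_type.lower())

# A systematic review outranks a lone expert opinion:
assert strength_rank("systematic review") < strength_rank("expert opinion")
```

The point the sketch makes is the article’s own: ‘evidence’ is not one thing, and a claim that does not appear anywhere in this ranking has no place in the hierarchy at all.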
Without clarity of thought, political policy based on vague supposition could lead to prescriptive measures that result in ‘damaging consequences’. As the thalidomide example cited above demonstrates, a single type of evidence is not always sufficient proof, and multiple types of evidence may be necessary to triangulate knowledge. Rather than denouncing certain disciplines of thought and prescribing others, perhaps the way forward is to systematically interrogate different types of evidence in order to evaluate their efficacy as bona fide knowledge. The best way to do this is by means of teacher-academics and teacher-practitioners working collaboratively across multiple settings, engaging in systematic research and cross-referencing results. For this to happen, there needs to be a commitment by government to fund, not cut, educational research. Australia has some of the finest Schools of Education in the world; they are staffed by dedicated academics who want the best for their students and the best for the nation’s school children. What universities need is a knowledge-rich government, not political polemic that does not even reach the baseline of the ‘hierarchy of evidence’.
Paul Gardner is a Senior Lecturer in the School of Education at Curtin University. He is also the United Kingdom Literacy Association’s Country Ambassador to Australia. Paul specialises in research around writer identity, compositional processes and pedagogy. He has written about process drama, EAL/D, multicultural education and collaborative learning. His work is premised upon inclusion and social justice. Twitter: @Paugardner
It’s all Humpty Dumpty… it means whatever he says it means. This recent post by danah boyd maps it well: https://zephoria.medium.com/statistical-imaginaries-an-ode-to-responsible-data-science-ac6ffcd8f246
Hi Chris,
Thank you for the link to Danah Boyd’s speech. I think the take on ‘uncertainty’ and political sensitivities is a useful idea, as well as agnotology, which I can now include in my lexicon. Best wishes, Paul
In a political context “evidence-based” means “anything, including anecdotes, which supports our position”. What universities need are academics who understand this reality, and are ready to respond in a timely, politically relevant way. I was elected a Fellow of the Australian Computer Society for doing this to help formulate the public Internet policy for Australia. It was not a neat and tidy process, being described as a “cabal”. http://www.austlii.edu.au/au/journals/CLCCommsUpd/1995/56.pdf
Hi Tom,
I agree, it is not enough for academics to do research and be published. Research linked to praxis for social justice must be at the heart of what we do. But that positioning will always be subject to Establishment criticism, and we need to be prepared for that too.
If Ed research is so conclusive, then why do the major Ed research organisations differ and contradict each other? E.g., Hattie’s summaries dominate Ed Depts in Australia (the Vic 10 HITs and NSW’s ‘What Works Best’). But Hattie’s top strategies, from which he claims 3+ years’ gain – Collective Teacher Efficacy, Self-Reported Grades and Piagetian Programs – are not even mentioned in the USA’s largest evidence base, the What Works Clearinghouse, or the largest in the UK, the Education Endowment Foundation. How can this be??? In Vic we’ve been using Hattie’s top strategies for 5 years; 3+ years’ growth in that time would put us at the top of the PISA rankings (if they matter to you), but it has not. Other biases are at play here. In Oz at least, Scott Eacott (2017), in School Leadership and the cult of the guru: the neo-Taylorism of Hattie, provides a simple answer,
“Hattie’s work has provided school leaders with data that APPEAL to their administrative pursuits.” If you want to go down the rabbit hole, start here – https://visablelearning.blogspot.com/p/other-researchers.html
Hi George,
One of the problems with meta-analysis (and this may be the case with Hattie’s work, as I understand it) is that it is not necessarily nuanced in its differentiation of studies: their research questions, methods, sample sizes, etc. It’s a bit like presenting us with a fruit bowl and saying all the fruit is the same.
Thanks Paul, yes that’s definitely true of Hattie’s work. Prof Terry Wrigley (2015), in Bullying by Numbers, gives a humorously English critique of Hattie’s and the EEF’s method,
“Its method is based on stirring together hundreds of meta-analyses reporting on many thousands of pieces of research to measure the effectiveness of interventions.
This is like claiming that a hammer is the best way to crack a nut, but without distinguishing between coconuts and peanuts, or saying whether the experiment used a sledgehammer or the inflatable plastic one that you won at the fair.”
Totally agree, George.
Evidence-based
What a lot of hooey about ‘evidence-based’. As an author of teachers’ books, a Literacy Consultant in South Australia, London, and New York/Jersey City, and a sessional tutor at Flinders University for 12 years, I have met many children and education people, and all have different ways of gathering knowledge and acting on that knowledge.
There can be no single evidence base with such an array of beliefs and individual patterns of acquiring and processing learning. Some children like to have models shown to them, and consequently some teachers believe in and are inclined towards supportive pedagogy. Some children prefer launching out and finding out for themselves, and teachers respond by setting challenges. Then there are the in-betweeners.
The ‘evidence base’ that politicians propose will inevitably be one-size-fits-all assessment (for example, NAPLAN and PISA, which do not seem to have improved learning outcomes). Teachers vary their types of assessment: for example, formative assessment, where they work with individual children from their current level of development and build on that foundation. Summative assessment comes when children complete a learning period and the teacher knows what the assessment should entail. The evidence is ‘this is what I have taught’ and the criterion for achievement is ‘this is what the children should know’.
Politicians must leave it to the qualified teacher to establish their own evidence-based progression: child-centred approaches depending on teachers’ data and in-the-moment analysis of students’ needs… This happens when experienced teachers [are] trusted to make their own instructional decisions (Rachel Gabriel, May 2021, in the journal From Research to Practice).
Liz, if you want a politician to pay attention, then the evidence base you have to cite is voters, as in: “Minister, if you mess with our teaching, the evidence says that the parents of our students will not vote for your party”. 😉
I totally agree with you Liz. We need more research involving teachers and educational researchers working collaboratively. The outcome, as you imply, is likely to be a pluralistic (not one-size-fits-all) repertoire of methods and strategies. The problem is that governments want simple messages and cost-reduced policies to sell to the electorate. Superimposed on this perspective is an unequally funded education system and preconceived ideas of what students from different backgrounds are capable of achieving.
Agreed, George.