To understand AI today, we need both why and how

We know AI is such a big deal that just this week the President of the United States, Joe Biden, signed an executive order to try to address the risks of a technology he described as “the most consequential technology of our time”.

So it is no wonder that the proliferation of AI tools and conferences during 2023 continues unabated.

And how seriously are we taking the challenge of AI in Australia? Our attention is disproportionately fixed on “how”, while larger questions of “why” remain opaque.

Now is a good time to reflect on where we are with AI. We might now have much greater capacity to generate data, but whether this is leading to knowledge, let alone wisdom, is up for serious debate.

A time to reflect

The number of AI tools and their applications to education is overwhelming, and certainly way beyond initial angst about ChatGPT and cheating that set the tone for the start of the 2023 academic year. 

But, as Maslow once wisely mused, only having a hammer makes us see every problem as a nail. If we have these powerful technologies, knowing how to use them can’t be the only issue. We need to talk more about why and when we use them. This goes to the heart of what we hold as the purposes of education. 

The case of the smartphone provides a useful comparison. First launched in 1992, it took until 2007 for the iPhone to disrupt the technology conversation. Some dreamed of, and seized, the opportunities in education such a device enabled. Others exercised caution, waiting to follow the early adopters only once the path was cleared.

UNESCO advice

Sixteen years later, though, responses have sharpened. UNESCO recently advised that smartphones should only be used where they benefit learning, advice that admittedly seems self-evident. That it has taken so long for such a statement to emerge, though, suggests the “tool” is having ongoing impacts well beyond learning. Sadly, too many examples from schools attest to the harnessing of smartphone power for abusive and manipulative purposes, particularly involving sexual violence. The rise of AI has only exacerbated some of these concerns.

The potent combination of learning disengagement and social dysfunction continues to create challenges for how technology is used in schools. There is a rising chorus in support of more handwriting. Some jurisdictions have moved to wholesale banning of mobile phones at school.

How we’ve dealt with smartphones should give us pause for reflection, particularly when some early warning signs about AI are clearly evident. 

When AI whistleblower Timnit Gebru first started in AI research, she lamented the lack of cultural and gender diversity amongst developers. Things have improved, no doubt, but cultural and social bias remain significant problems to be addressed.

Flat-footed prose

The much-lauded creative possibilities of generative AI still need development, and they come with serious ethical questions. Margaret Atwood recently lamented the lack of creative artistry in outputs based on her own works, concluding that their “flat-footed prose was the opposite of effective storytelling”.

Worse, she argued, was that the texts used to train these models were not even purchased by the company, instead relying on versions scraped – stolen – from the internet. That, in turn, meant any royalty payments she might otherwise have earned were withheld. Australian authors have similarly expressed their frustration. Eking out an existence as an author is challenging enough without pirated works further stealing from these vital cultural voices.

We seem to have a larger challenge, too, buried deep in little-discussed PISA data. Much of the focus on PISA is about test results.

Sobering results

But here’s what is in Volume III: students’ perceptions about bigger existential questions on the meaning of life, purpose, and satisfaction. The results, all of which are below the OECD average, are sobering:

  • 37% of students disagreed or strongly disagreed that “my life has meaning and purpose”;
  • 42% of students disagreed or strongly disagreed that “I have discovered a satisfactory meaning in life”;
  • 36% of students disagreed or strongly disagreed that “I have a clear sense of what gives meaning to my life”.

And this data was collected before the traumas of Black Summer in 2019 and COVID-19. There is much anticipation about what story the more recent round of PISA data collection will tell.

Based on this data, we clearly have much more work to do on our second national educational goal to develop confident and creative individuals who “have a sense of self-worth, self-awareness and personal identity that enables them to manage their emotional, mental, cultural, spiritual and physical wellbeing”. 

What can AI do in pursuit of these goals?

Much of the conversation about AI has been focused on the first part of the first national educational goal – excellence. How can AI be used to improve student learning? How can AI reshape teaching and assessment? More remains to be done on how AI can address the second part – equity.

These concerns are echoed by UNESCO in its recent Global Education Monitoring Report. The opportunities afforded by AI raise new questions about what it means to be educated. Technology is the tool, not the goal, argues the report. AI is to be in the service of developing “learners’ responsibility, empathy, moral compass, creativity and collaboration”.

AI will no doubt bring new possibilities and efficiencies into education, and to that end should be embraced. At the same time, a better test for its value might be that posed recently by Gert Biesta, that we must not:

lose sight of the fact that children and young people are human beings who face the challenge of living their own life, and of trying to live it well.

Attraction to the new, the shiny, the ephemeral, the how, is to be tempered by more fundamental questions of why. Keeping this central to the conversation might prevent us from realising Arendt’s prophecy that our age may exhibit “the deadliest, most sterile passivity history has ever known”.

Dr Paul Kidson is a senior lecturer in Educational Leadership at the Australian Catholic University. Prior to becoming an academic in 2017, he was a school principal for over 11 years. His teaching and research explore how systems and policies govern the work of school leaders, as well as how school leaders develop and sustain their personal leadership story. He previously wrote about artificial intelligence for EduResearch Matters with Sarah Jefferson and Leon Furze here.

Artificial Intelligence (AI) in schools: are you ready for it? Let’s talk

Interest in the use of Artificial Intelligence (AI) in schools is growing. More educators are participating in important conversations about it as understanding develops around how AI will impact the work of teachers and schools.

In this post I want to add to the conversation by raising some issues and putting forward some questions that I believe are critical. To begin I want to suggest a definition of the term ‘Artificial Intelligence’ or AI as it is commonly known.

What do we mean by ‘Artificial Intelligence’?

Definitions are tricky because the field is so interdisciplinary; it relates to many different branches of knowledge, including computer science, education, game design and psychology, to name a few.

I like the definition offered by Swedish-American physicist and cosmologist Max Tegmark. He describes Artificial Intelligence systems as being ‘narrowly intelligent because while they are able to accomplish complex goals, each AI system is only able to accomplish goals that are very specific.’

I like this definition because it mentions how complex AI can be but makes us focus on the reality that AI is narrowly focused to fulfill specific goals.

We already live in a world full of AI systems including Siri, Alexa, GPS navigators, self-driving cars and so on. In the world of education big international companies are currently working on or already marketing AI systems that develop “intelligent instruction design and digital platforms that use AI to provide learning, testing and feedback to students”.

We need to begin to pay attention to how AI will impact pedagogy, curriculum and assessment in schools, that is, how it will impact end users (teachers and students). There is a lot to think about and talk about here already.

Artificial Intelligence in Education

Conversations about Artificial Intelligence in Education (AIEd) have been going on for many years in the world of education. This year the London Festival of Learning organised by Professor Rose Luckin and her team brought together scholars from around the world in the fields of AIEd, Learning at Scale (large-scale online learning platforms) and the Learning Sciences.

Closer to home the NSW Department of Education has been on the front foot in raising awareness of AIEd in a series of papers in its Future Frontiers agenda. This is a compilation of essays that canvas “perspectives from thought leaders, technology experts and futurists from Australia and around the world.” These are helpful articles and thought pieces. They are worth checking out and can serve to inform nascent conversations you might want to have about AIEd.

Questions for schools and teachers

It is important for researchers and teacher educators like myself to explore how AIEd will supplement and change the nature of teachers’ work in schools. We need to understand how this can be done in education so that human intelligence and the relational roles of teachers remain central.

How will schools be involved? And how could the changing education landscape be managed as the subject of AIEd attracts more attention?

Leading research scientist and world expert in AIEd at University College London, Professor Rose Luckin (who incidentally is a former teacher, school governor, and AI developer/computer scientist), captures the core argument when it comes to school education. She says it is more about how teachers and students will develop sufficient understanding of AIEd, so that it can be augmented by human intelligence when determining what AIEd should and should not be designed to do. For example, Luckin suggests that if purely technological solutions dominate the agenda, then what AIEd can offer for change and transformation in teaching and learning will be limited.

The Australian Government’s Innovation and Science Australia (2017) report, Australia 2030, recommends prioritisation of the “development of advanced capability in artificial intelligence and machine learning in the medium- to long-term to ensure growth of the cyber–physical economy”.

It also lists Education as one of its “five imperatives for the Australian innovation, science and research system” that will equip Australians with skills relevant to 2030, thus highlighting the need to understand the implications of AIEd for schools.

Critical moment for school education

There is mounting international evidence that we are at a critical moment for setting clearer directions for AIEd in school education.

With crucial questions being asked internationally about AIEd and local reports like Australia 2030 published we must start to probe Australian policy makers, politicians, school principals, students and parents, as well as the teaching profession more broadly about such vital issues. Indeed the NSW Department of Education held a forum to this end in 2017 and I understand more are planned.

Schools are one focus of the agenda, but how are teacher education programs in universities preparing preservice teachers for this future? Are we considering questions of AI in our preparation programs? If we need to lift the skill levels of all school students to work in an AI world then what changes might we need to make to accommodate AI in school curriculum, assessment, pedagogy, workload and teacher professional learning?

The debate about robots replacing teachers is not the main event. There will be assistants, in the form of dashboards for instance, but humans will still do all the things that machines cannot do.

Moreover, there is a great need for deeper understanding of learning analytics. There are also questions of opaque systems, bias in algorithms, and policy and governance questions around data ethics. Such topics could form foundational programs in teacher education courses.

More hard questions

What implications do AIEd and automated worlds have for school infrastructure? How can higher education and industry support schools to be responsive and supportive to this rapidly changing world of AI?

Leaping back to the London Festival of Learning for one moment, Professor Paulo Blikstein, from Stanford University, painted a grim picture of the dangers that lie ahead in his keynote address, telling his audience that it is time to ‘make hard choices for AIEd.’

He explained a phenomenon he calls We Will Take It From Here (WWTIFH) that happens to researchers. It is when tech businesses tell researchers to ‘go away and play with their toys’ and that they will take over and develop the work technologically … taking over things “in the most horrible way”. Blikstein outlined how most tech companies use opaque algorithms and don’t consult with the field. There are few policy or ethical guidelines in the US that oversee decision making in these areas. It is a “dangerous cocktail”, described by Blikstein’s formula of:

WWTIFH + Going Mainstream + Silicon Valley Culture + Huge Economic Potential = DANGER.

I agree with his caution: people in positions of power in teaching and learning need to be aware of the limitations of AI. It can help decision makers, but not make decisions for them. This awareness becomes increasingly important as educational leaders interact and work more frequently with tech companies.

In teacher education in Australian universities we must begin to talk more about AIEd with those whom we teach and research. We should keep asking what AI really is, and not naïvely privilege AI over humans.

As you might sense, I believe this is a serious and necessary dialogue. There are many participants in the AIEd conversation and those involved in education at all levels in Australian schools have an important voice.


Dr Jane Hunter is an early career researcher in the STEM Education Futures Research Centre, Faculty of Arts and Social Sciences at the University of Technology Sydney. She was a classroom teacher and head teacher in schools both in Australia and the UK. Jane is conducting a series of STEM studies focused on building teacher capacity; in this work she leads teachers, school principals, students and communities to better understand and support education change. Her work in initial teacher education has received national and international recognition with a series of teaching awards for outstanding contributions to student learning. She enjoys writing and her research-based presentations at national and international conferences challenge audiences to consider new and alternate education possibilities. A recent podcast with Jane on AIEd can be heard here. Follow her on Twitter @janehunter01
