November 2, 2023

To understand AI today, we need both why and how

By Paul Kidson

We know AI is such a big deal that just this week the President of the United States, Joe Biden, signed an executive order seeking to address the risks of what he described as “the most consequential technology of our time”.

So it is no wonder that the proliferation of both AI tools and conferences during 2023 continues unabated.

And how seriously are we taking the challenge of AI in Australia? Our attention is disproportionately focused on “how”, while larger questions of “why” remain opaque.

Now is a good time to reflect on where we are with AI. We might now have much greater capacity to generate data, but whether this is leading to knowledge, let alone wisdom, is up for serious debate.

A time to reflect

The number of AI tools and their applications to education is overwhelming, and certainly way beyond the initial angst about ChatGPT and cheating that set the tone for the start of the 2023 academic year.

But, as Maslow once wisely mused, only having a hammer makes us see every problem as a nail. If we have these powerful technologies, knowing how to use them can’t be the only issue. We need to talk more about why and when we use them. This goes to the heart of what we hold as the purposes of education. 

The case of the smartphone provides a useful comparison. The smartphone was first launched in 1992, but it took until 2007 for the iPhone to disrupt the technology conversation. Some dreamed of, and seized, the opportunities in education such a device enabled. Others exercised caution, waiting to follow the early adopters only once the path was cleared.

UNESCO advice

Sixteen years later, though, responses have sharpened. UNESCO recently advised that smartphones should only be used where they benefit learning, advice that admittedly seems self-evident. That it has taken so long for such a statement to emerge, though, suggests the “tool” is having ongoing impacts well beyond learning. Sadly, too many examples from schools attest to the harnessing of smartphone power for abusive and manipulative purposes, particularly sexual violence. The rise of AI has only exacerbated some of these concerns.

The potent combination of learning disengagement and social dysfunction continues to create challenges for how technology is used in schools. There is a rising chorus in support of more handwriting. Some jurisdictions have moved to wholesale banning of mobile phones at school.

How we’ve dealt with smartphones should give us pause for reflection, particularly when some early warning signs about AI are clearly evident. 

When AI whistleblower Timnit Gebru first started in AI research, she lamented the lack of cultural and gender diversity amongst developers. Things have no doubt improved, but cultural and social biases remain significant problems to be addressed.

Flat-footed prose

The much-lauded creative possibilities of generative AI still need development, and they come with serious ethical questions. Margaret Atwood recently lamented the lack of creative artistry in outputs based on her own works, concluding that its “flat-footed prose was the opposite of effective storytelling”.

Worse, she argued, the texts used to train these models were not even purchased by the company, which instead relied on versions scraped – stolen – from the internet. That, in turn, meant any royalty payments she might otherwise have earned were withheld. Australian authors have similarly expressed their frustration. Eking out an existence as an author is challenging enough without pirated works further stealing from these vital cultural voices.

We seem to have a larger challenge, too, buried deep in little-discussed PISA data. Much of the focus on PISA is on test results.

Sobering results

But here’s what is in Volume III: students’ perceptions about bigger existential questions on the meaning of life, purpose, and satisfaction. The results, all of which are below the OECD average, are sobering:

  • 37% of students disagreed or strongly disagreed that “my life has meaning and purpose”;
  • 42% of students disagreed or strongly disagreed that “I have discovered a satisfactory meaning in life”;
  • 36% of students disagreed or strongly disagreed that “I have a clear sense of what gives meaning to my life”.

And this data was collected before the traumas of the 2019–20 Black Summer and COVID-19. There is much anticipation about what story the more recent round of PISA data collection will tell.

Based on this data, we clearly have much more work to do on our second national educational goal to develop confident and creative individuals who “have a sense of self-worth, self-awareness and personal identity that enables them to manage their emotional, mental, cultural, spiritual and physical wellbeing”. 

What can AI do in pursuit of these goals?

Much of the conversation about AI has been focused on the first part of the first national educational goal – excellence. How can AI be used to improve student learning? How can AI reshape teaching and assessment? More remains to be done on how AI can address the second part – equity.

These concerns are echoed by UNESCO in its recent Global Education Monitoring Report. The opportunities afforded by AI raise new questions about what it means to be educated. Technology is the tool, not the goal, argues the report. AI is to be in the service of developing “learners’ responsibility, empathy, moral compass, creativity and collaboration”.

AI will no doubt bring new possibilities and efficiencies into education, and to that end should be embraced. At the same time, a better test for its value might be that posed recently by Gert Biesta, that we must not:

lose sight of the fact that children and young people are human beings who face the challenge of living their own life, and of trying to live it well.

Attraction to the new, the shiny, the ephemeral, the how, is to be tempered by more fundamental questions of why. Keeping this central to the conversation might prevent us from realising Arendt’s prophecy that our age may exhibit “the deadliest, most sterile passivity history has ever known”.

Dr Paul Kidson is a senior lecturer in Educational Leadership at the Australian Catholic University. Prior to becoming an academic in 2017, he was a school principal for over 11 years. His teaching and research explore how systems and policies govern the work of school leaders, as well as how school leaders develop and sustain their personal leadership story. He previously wrote about artificial intelligence for EduResearch Matters with Sarah Jefferson and Leon Furze.

