AI

Can we preserve human agency in a world of AI?

That’s a question we can all ask ourselves as we mark the UN International Day of Education. This year’s theme is AI and education. What does the teaching profession gain and lose with Gen AI?

Two hundred million people use ChatGPT each month, a figure that doubled in one year. A recent article from Harvard Business Review correctly identifies that generative AI (Gen AI) is a prediction machine that can summarise, synthesise, code, and draw, based on its training on the corpus of knowledge from the internet and custom data sets. The article points out that: “The efficacy of predictions is contingent on the underlying data. The quality and quantity of data significantly impact the accuracy of AI predictions… (T)he successful implementation of AI requires good judgment… It involves knowing which predictions to make, the costs associated with different types of mistakes, and what to do with the AI’s outcome… Judgment over what matters in a particular situation is fundamental to the successful use of generative AI.”
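To make the “prediction machine” idea concrete, here is a toy sketch of my own (an illustration only, not anything from the HBR article): a bigram model that predicts the next word purely from frequency counts in its training data. Real Gen AI models are vastly more sophisticated, but the principle – predict what is most likely given the underlying data – is the same, which is also why data quality and quantity matter so much.

```python
# Toy "prediction machine": a bigram model that predicts the next word
# from raw frequency counts in a tiny training corpus. Purely
# illustrative; real LLMs predict over learned representations.
from collections import Counter, defaultdict

corpus = (
    "the lesson plan includes outcomes . "
    "the lesson plan includes resources . "
    "the unit plan includes outcomes ."
).split()

# Count which word follows which in the training data.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict(word: str) -> str:
    """Return the most frequent continuation seen in training."""
    return follows[word].most_common(1)[0][0]

print(predict("lesson"))    # -> "plan"
print(predict("includes"))  # -> "outcomes" (seen twice, vs "resources" once)
```

The model can only ever reproduce patterns that exist in its data, which is why the quality and coverage of that data bound the quality of its predictions.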

Time to ask those questions about AI again

Anecdote and research suggest that students in schools and universities increasingly use Gen AI tools in various ways to undertake learning and assessment. There has been a flurry of activity by governments, state departments and regulators in providing policy, guidelines and resources for educators and students on the technology. Discourse seems to have turned a corner, from treating Gen AI use as “cheating” to either adjusting assessment by having students apply or adapt AI text to the real world, or embracing AI outputs by critiquing or improving on them.

Two years after the widespread uptake of ChatGPT, the most popular Gen AI tool, I think it is time to pause and re-ask ourselves as educators: what exactly do my students need to know and be able to do to demonstrate competency in learning? This question is the true core of curriculum. And it goes directly to thinking creatively or innovatively in education.

There is a lot of talk about Gen AI augmenting human intelligence with its efficient summary outputs. As the Harvard Business Review article points out, such outputs require “good judgment” to assess their quality for a “particular situation”.

Still a place for old-fashioned exams

Educators can certainly set tasks where students generate such outputs and develop skills to assess output quality. Of course, this means explicitly teaching those quality-assessment skills (research, information literacy, critical thinking) and then having some way of knowing whether students are using and developing the skills without turning to Gen AI to produce a fake trail of skill development. This might involve seeing drafts of work with commentary on how the skills were used, coupled with real-time presentations and teacher and peer questioning. There is still a place for old-fashioned exams too, as one way to assess knowledge acquisition and transfer, unpopular as this may be in some AI-evangelical circles.

If we want students to be more than adept prompt jockeys, then we have to really think about how we want them to demonstrate learning.

Software that purports to detect AI-generated text is pretty hopeless, and its vendors warn as much, so teachers shouldn’t rely on it; they should rely instead on carefully developed dialogue and iterative processes with students. In other words: carefully crafted learning and assessment activities, and knowing their students well. This is easier in schools than in universities, where large cohorts, online learning, intensified academic workloads and a highly casualised workforce act as barriers to developing genuine, long-term educator-student connections.

On standards

Now, let me unpack the issue from my context as a teacher educator. Australian teachers have a set of standards they need to meet at various career stages. The curriculum of teacher education needs to respond directly to these standards, and teacher education is commonly structured according to: (1) content (discipline) knowledge; (2) method, that is, curriculum, pedagogy and assessment; (3) understanding learning and the learning context of students (educational psychology and sociology); and (4) how areas 1–3 translate into practice through professional experience, known as practicums and internships. Summarising and synthesising from the corpus of the internet, Gen AI can easily produce outputs for assessment related to areas 1–3.

Teachers are great sharers. It is a human-centred, collaborative profession after all. For a long time teachers everywhere have shared curriculum scope and sequence documents, unit and lesson plans, assessments, teaching resources, and student work samples online. Student teachers, usually referred to as pre-service teachers, have a vast repository of exemplars and resources to draw on, modify and use for assessment at university and at practicum. Plagiarism checkers could and can still identify if a student has directly copied something from the internet and not cited the source or tried to pass it off as their own.

Questioning the quality of the AI output

Gen AI, drawing on all the teaching curriculum resources available online, can almost instantly produce scope and sequence documents, units of work, lesson plans and resources such as worksheets, by predicting what the user wants from the prompt and synthesising or summarising what is available online. It is then up to the pre-service teacher or teacher to make judgements about the quality of the AI output in relation to the task, or its appropriateness for the learning context.

Teachers who have gone through traditional method courses at university – learning first to read a syllabus for structure and meaning, and then translating this into a lesson plan, a unit of work and a scope and sequence through a carefully scaffolded, developmental arrangement of courses across a degree – are mostly well equipped to make professional judgements about automated outputs from Gen AI. However, we are entering a new era in which it may be possible to produce work for discipline, method and learning courses without having to think critically or authentically about what is submitted for assessment.

A sizeable proportion of students graduating from universities will have relied on Gen AI outputs in an expedient or shallow way to get through their degree, having been exposed to limited opportunities that “test” depth of understanding, application and transfer, and creative or innovative thinking. Universities won’t want to talk about this for a long time – just as they were slow to address the impact of essay mills. But it will be a phenomenon that shapes trust in higher education institutions and, ultimately, professions.

In teacher education this could mean a heavier burden for teachers supervising students on practicum. In the world of Gen AI these supervising teachers are well placed to evaluate whether a student has developed competency through their application of discipline, curriculum and pedagogy, and learner knowledge.

There are many, many teachers using Gen AI to generate curriculum material, school reports, newsletters and other artefacts considered ripe for an efficiency overhaul in their time-poor day. If Gen AI were to cease tomorrow, I would hazard a guess that the vast majority could still create these texts, as they have gone through sequenced training before and during their careers and have experience to draw on, including the experiences of other teachers.

However, we may be entering an era in which the first cohorts of teachers have come to rely on Gen AI to the point that they have not developed these skills, or the judgement vital to designing curriculum to suit context. Gen AI raises a lot of questions related to professional knowledge and standards.

Will pre-service and practising teachers develop AI dependency? Will this erode the unique combination of professional skills teachers have? Does this matter? Should we augment our competencies and intelligence and redefine the fundamentals of professional knowledge?

AI: it’s about what exists, not what’s possible

Finally, what will happen to innovation in curriculum design if pre-service and in-service teachers slowly stop drawing on their vast cognitive resources to create and share new unit plans or teaching resources, relying instead on the quick Gen AI fix? We need to remember that Gen AI is a summarising and synthesising tool, predicting a response from a prompt to communicate what already exists, not what is possible.

Let’s start having a more serious and sustained conversation in teacher education and the teaching profession about what we gain as educators in using Gen AI, what we potentially erode, lose or irrevocably change, and whether it will matter for our students.

To return to my original question but orienting it towards the training of pre-service teachers – what exactly do pre-service teachers need to know and be able to do to demonstrate competency with and without Gen AI? This question surely goes to the heart of teaching standards.

Erica Southgate is an associate professor in the School of Education, University of Newcastle. She makes computer games for literacy and is an education technology ethicist and an immersive learning researcher. 

Teachers truly know students and how they learn. Does AI?

Time-strapped teachers are turning to advanced AI models like ChatGPT and Perplexity to streamline lesson planning. Simply by entering prompts like “Generate a comprehensive three-lesson sequence on geographical landforms,” they can quickly receive a detailed teaching program tailored to the lesson content, complete with learning outcomes, suggested resources, classroom management tips and more.
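For readers curious about the mechanics, that workflow amounts to a single call to a chat model. Below is a minimal sketch in Python, assuming the openai package and an API key; the model name and prompt wording are illustrative only, and in practice teachers do this through a chat interface rather than code.

```python
# Minimal sketch: a lesson-planning prompt as one API call.
# Assumes the openai package is installed and OPENAI_API_KEY is set;
# the model name is an assumption for illustration.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

prompt = (
    "Generate a comprehensive three-lesson sequence on geographical "
    "landforms, with learning outcomes, suggested resources and "
    "classroom management tips."
)

response = client.chat.completions.create(
    model="gpt-4o",  # illustrative; any capable chat model would do
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)
```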

What’s not to like? This approach represents a pragmatic solution to educators’ overwhelming workloads. It also explains the rapid adoption of AI-driven planning tools by both schoolteachers and the universities that train them.  

And what do we say to the naysayers? Don’t waste your time raging against the machine. AI is here! AI is the future! 

Can AI know students and how they learn?

But what does wide-scale AI adoption mean for the fundamental skills and knowledge that lie at the heart of teaching – those that inform the Australian Professional Standards for Teachers? Take, for example, Standard 1, “Know students and how they learn”, and its focus area 1.3. It requires teachers to show that they understand teaching strategies that respond to the learning strengths and needs of students from diverse linguistic, cultural, religious, and socioeconomic backgrounds. Can AI handle this type of differentiation effectively?

Of course! Teachers simply need to add the following phrase to the original prompt: “The lesson sequence should include strategies that differentiate for students from culturally and linguistically diverse backgrounds”. Hey presto! The revised lesson sequence now incorporates strategies such as getting students to write a list of definitions for key terms, using scaffolding techniques, implementing explicit teaching, and allowing students to use their home languages from time to time.
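In the terms of the hypothetical sketch above, that “fix” is nothing more than one sentence appended to the same prompt (again, the package, model name and wording are assumptions for illustration):

```python
# The "differentiation" step: the same call with one sentence appended.
from openai import OpenAI

client = OpenAI()

differentiated_prompt = (
    "Generate a comprehensive three-lesson sequence on geographical "
    "landforms, with learning outcomes, suggested resources and "
    "classroom management tips. The lesson sequence should include "
    "strategies that differentiate for students from culturally and "
    "linguistically diverse backgrounds."
)

response = client.chat.completions.create(
    model="gpt-4o",  # illustrative model choice, as before
    messages=[{"role": "user", "content": differentiated_prompt}],
)
print(response.choices[0].message.content)
```

The entire adjustment lives in that one appended sentence: the model predicts a plausible-looking response to the new wording, nothing more.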

Even better, AI can create a worksheet that includes thoughtful questions such as, “What are some important landforms in your home country?”, “What do you call this type of landform in your home language?” and so on. With these modifications, we have effectively achieved differentiation for a culturally and linguistically diverse classroom. Problem solved! 

Can AI deal with the mix?

Or have we? Can AI truly comprehend the complexities of diversity within a single classroom? Consider this scenario: you are a teacher in western Sydney, where 95 per cent of your class comes from a Language Background other than English (LBOTE). This is not uncommon in NSW, where one in three students belongs to this category. 

Your class comprises a mix of high-achieving, gifted and talented individuals—some of whom are expert English users, while others are new arrivals who have been assessed as “Emerging” on the EALD Learning Progression. These students need targeted language support to comprehend the curriculum. 

Your students come from various backgrounds. Some are Aboriginal Australian students, while others come from Sudan, China, Afghanistan, Nepal, and Bangladesh. Some have spent over three years in refugee camps before arriving in Australia, with no access to formal education. Others live in Sydney without their families. Some are highly literate, while others have yet to master basic academic literacy skills in English.

Going beyond

In this context, simply handing out a worksheet and expecting students to write about landforms in their “home country” can be an overwhelming and confusing task. For some students, being asked to write or speak in their “home language” while the rest of the class communicates in English may trigger discomfort or even traumatic memories related to the conflicts they have escaped. Recognising these nuances is essential for effective differentiation and raises important questions about whether AI can sufficiently navigate the complexities of such diverse classrooms. 

Teachers must go beyond merely knowing their students’ countries of origin; they need to delve into their background stories. This includes appreciating and encouraging the language and cultural resources that students bring to the classroom—often referred to as their virtual schoolbag. Additionally, educators must recognise that access to material resources, such as technology and reading materials, can vary significantly among students. Understanding how students’ religious backgrounds may influence their perspectives and engagement with the content is equally important. Only by taking these factors into account can teachers create a truly inclusive and responsive learning environment.

Then there’s the content itself. Teachers need to critically evaluate the content they plan to teach by asking themselves several important questions. That includes: What are my own biases and blind spots related to this subject matter? What insights might my students have that I am unaware of? What sensitivities could arise in discussions about this content concerning values, knowledge, and language? Most importantly, how can I teach this material in a culturally and linguistically responsive manner that promotes my students’ well-being and achievement?

One overarching concern

All of these questions point to one overarching concern: Can AI truly address all of these considerations, or are they essential to the inherently human and relational nature of teaching?

Joseph Lo Bianco, Australian linguist and emeritus professor of language and literacy education at the Melbourne Graduate School of Education, says the benefits of AI have been significantly overstated when it comes to addressing language and culture effectively in classroom teaching.

Although AI excels at transmitting and synthesising information, it cannot replace the essential interpersonal connections and subjectivity necessary for authentic intercultural understanding. The emotions, creativity, and personalised approaches essential for meaningful teaching and learning are inherently human qualities. 

AI, an aid not a replacement

While AI tools like ChatGPT and Perplexity offer impressive efficiencies for lesson planning, they cannot replace the nuanced understanding and relational dynamics that define effective teaching in culturally and linguistically diverse classrooms. Teachers need to recognise that AI can aid in differentiation but lacks the capacity to fully comprehend students’ individual experiences, histories, and emotional landscapes. The complexities of student backgrounds, the significance of personal narratives, and the critical need for empathetic engagement cannot be reduced to algorithms. 

As we embrace AI in education, we must remain vigilant in advocating for a pedagogical approach that prioritises human connection and cultural responsiveness. Ultimately, teacher AI literacy should encompass not just the technical skills to integrate AI into classrooms but also the profound understanding of students as whole individuals, fostering an inclusive environment that values each learner’s unique contributions. In this way, we can harness the power of technology while ensuring it complements the irreplaceable art of teaching.

Sue Ollerhead is a senior lecturer in Languages and Literacy Education and the Director of the Secondary Education Program at Macquarie University. Her expertise lies in English language and literacy learning and teaching in multicultural and multilingual education contexts. Her research interests include translanguaging, multilingual pedagogies, literacy across the curriculum and oracy development in schools. 

To understand AI today, we need both why and how

We know AI is such a big deal that just this week the President of the United States, Joe Biden, signed an executive order to try to address the risks of a technology he described as “the most consequential technology of our time”.

So it is no wonder that the proliferation of both AI tools and conferences during 2023 continues unabated.

And how seriously are we taking the challenge of AI in Australia? Our attention is disproportionately focused on “how”, while larger questions of “why” remain opaque.

Now is a good time to reflect on where we are with AI. We might now have much greater capacity to generate data, but whether this is leading to knowledge, let alone wisdom, is up for serious debate.

A time to reflect

The number of AI tools and their applications to education is overwhelming, and certainly goes way beyond the initial angst about ChatGPT and cheating that set the tone for the start of the 2023 academic year.

But, as Maslow once wisely mused, only having a hammer makes us see every problem as a nail. If we have these powerful technologies, knowing how to use them can’t be the only issue. We need to talk more about why and when we use them. This goes to the heart of what we hold as the purposes of education. 

The case of the smartphone provides a useful comparison. Smartphones first launched in 1992, but it took the iPhone, in 2007, to disrupt the technology conversation. Some dreamed of, and seized, the opportunities in education such a device enabled. Others exercised caution, waiting to follow the early adopters only once the path was cleared.

UNESCO advice

Sixteen years later, though, responses have sharpened. UNESCO recently advised that smartphones should only be used where they benefit learning – advice that admittedly seems self-evident. That it has taken so long for such a statement to emerge suggests the “tool” is having ongoing impacts well beyond learning. Sadly, too many examples from schools attest to the harnessing of smartphone power for abusive and manipulative purposes, particularly sexual violence. The rise of AI has only exacerbated some of these concerns.

The potent combination of learning disengagement and social dysfunction continues to create challenges for how technology is used in schools. There is a rising chorus in support of more handwriting. Some jurisdictions have moved to wholesale bans on mobile phones at school.

How we’ve dealt with smartphones should give us pause for reflection, particularly when some early warning signs about AI are clearly evident. 

When AI whistleblower Timnit Gebru first started in AI research, she lamented the lack of cultural and gender diversity amongst developers. Things have improved, no doubt, but cultural and social bias remain significant problems to be addressed.

Flat-footed prose

The much-lauded creative possibilities of generative AI still need development, and they come with serious ethical questions. Margaret Atwood recently lamented the lack of creative artistry in outputs based on her own works, concluding that their “flat-footed prose was the opposite of effective storytelling”.

Worse, she argued, the texts used to train these models were not even purchased by the company, which relied instead on versions scraped – stolen – from the internet. That, in turn, meant any royalty payments she might otherwise have earned were withheld. Australian authors have similarly expressed their frustration. Eking out an existence as an author is challenging enough without pirated works further stealing from these vital cultural voices.

We seem to have a larger challenge, too, buried deep in little-discussed PISA data. Much of the focus on PISA is about test results.

Sobering results

But here’s what is in Volume III: students’ perceptions about bigger existential questions on the meaning of life, purpose, and satisfaction. The results, all of which are below the OECD average, are sobering:

  • 37% of students disagreed or strongly disagreed that “my life has meaning and purpose”;
  • 42% of students disagreed or strongly disagreed that “I have discovered a satisfactory meaning in life”;
  • 36% of students disagreed or strongly disagreed that “I have a clear sense of what gives meaning to my life”.

And this data was collected before the traumas of the 2019–20 Black Summer bushfires and COVID-19. There is much anticipation about what story the more recent round of PISA data collection will tell.

Based on this data, we clearly have much more work to do on our second national educational goal to develop confident and creative individuals who “have a sense of self-worth, self-awareness and personal identity that enables them to manage their emotional, mental, cultural, spiritual and physical wellbeing”. 

What can AI do in pursuit of these goals?

Much of the conversation about AI has been focused on the first part of the first national educational goal – excellence. How can AI be used to improve student learning? How can AI reshape teaching and assessment? More remains to be done on how AI can address the second part – equity.

These concerns are echoed by UNESCO in its recent Global Education Monitoring Report. The opportunities afforded by AI raise new questions about what it means to be educated. Technology is the tool, not the goal, argues the report. AI is to be in the service of developing “learners’ responsibility, empathy, moral compass, creativity and collaboration”.

AI will no doubt bring new possibilities and efficiencies into education, and to that end should be embraced. At the same time, a better test for its value might be that posed recently by Gert Biesta, that we must not:

lose sight of the fact that children and young people are human beings who face the challenge of living their own life, and of trying to live it well.

Attraction to the new, the shiny, the ephemeral, the how, is to be tempered by more fundamental questions of why. Keeping this central to the conversation might prevent us from realising Arendt’s prophecy that our age may exhibit “the deadliest, most sterile passivity history has ever known”.

Dr Paul Kidson is a senior lecturer in Educational Leadership at the Australian Catholic University. Prior to becoming an academic in 2017, he was a school principal for over 11 years. His teaching and research explore how systems and policies govern the work of school leaders, as well as how school leaders develop and sustain their personal leadership story. He previously wrote about artificial intelligence for EduResearch Matters with Sarah Jefferson and Leon Furze here.

Artificial Intelligence (AI) in schools: are you ready for it? Let’s talk

Interest in the use of Artificial Intelligence (AI) in schools is growing. More educators are participating in important conversations about it as understanding develops around how AI will impact the work of teachers and schools.

In this post I want to add to the conversation by raising some issues and putting forward some questions that I believe are critical. To begin I want to suggest a definition of the term ‘Artificial Intelligence’ or AI as it is commonly known.

What do we mean by ‘Artificial Intelligence’?

Definitions are tricky because the field is so interdisciplinary; it relates to many different branches of knowledge, including computer science, education, game design and psychology, to name a few.

I like the definition offered by Swedish-American physicist and cosmologist Max Tegmark. He describes Artificial Intelligence systems as being ‘narrowly intelligent because while they are able to accomplish complex goals, each AI system is only able to accomplish goals that are very specific.’

I like this definition because it acknowledges how complex AI can be, but makes us focus on the reality that AI is narrowly designed to fulfil specific goals.

We already live in a world full of AI systems including Siri, Alexa, GPS navigators, self-driving cars and so on. In the world of education big international companies are currently working on or already marketing AI systems that develop “intelligent instruction design and digital platforms that use AI to provide learning, testing and feedback to students”.

We need to begin to pay attention to how AI will impact pedagogy, curriculum and assessment in schools, that is, how it will impact end users (teachers and students). There is a lot to think about and talk about here already.

Artificial Intelligence in Education

Conversations about Artificial Intelligence in Education (AIEd) have been going on for many years in the world of education. This year the London Festival of Learning, organised by Professor Rose Luckin and her team, brought together scholars from around the world in the fields of AIEd, Learning at Scale (large-scale online learning platforms) and the Learning Sciences.

Closer to home, the NSW Department of Education has been on the front foot in raising awareness of AIEd in a series of papers in its Future Frontiers agenda. This is a compilation of essays that canvass “perspectives from thought leaders, technology experts and futurists from Australia and around the world.” These are helpful articles and thought pieces. They are worth checking out and can serve to inform nascent conversations you might want to have about AIEd.

Questions for schools and teachers

It is important for researchers and teacher educators like myself to explore how AIEd will supplement and change the nature of teachers’ work in schools. We need to understand how this can be done so that human intelligence and the relational roles of teachers dominate.

How will schools be involved? And how could the changing education landscape be managed as the subject of AIEd attracts more attention?

Professor Rose Luckin, leading research scientist and world expert in AIEd at University College London (and, incidentally, a former teacher, school governor and AI developer/computer scientist), captures the core argument when it comes to school education. For her, the question is how teachers and students will develop sufficient understanding of AIEd so that it can be augmented by human intelligence when determining what AIEd should and should not be designed to do. If purely technological solutions dominate the agenda, Luckin suggests, then what AIEd can offer for change and transformation in teaching and learning will be limited.

The Australian Government’s Innovation and Science Australia (2017) report, Australia 2030, recommends prioritisation of the “development of advanced capability in artificial intelligence and machine learning in the medium- to long-term to ensure growth of the cyber–physical economy”.

It also lists Education as one of its “five imperatives for the Australian innovation, science and research system” that will equip Australians with skills relevant to 2030, thus highlighting the need to understand the implications of AIEd for schools.

Critical moment for school education

There is strong international evidence that we are at a critical moment for setting clearer directions for AIEd in school education.

With crucial questions being asked internationally about AIEd, and local reports like Australia 2030 published, we must start to probe Australian policy makers, politicians, school principals, students and parents, as well as the teaching profession more broadly, on such vital issues. Indeed, the NSW Department of Education held a forum to this end in 2017, and I understand more are planned.

Schools are one focus of the agenda, but how are teacher education programs in universities preparing pre-service teachers for this future? Are we considering questions of AI in our preparation programs? If we need to lift the skill levels of all school students to work in an AI world, then what changes might we need to make to accommodate AI in school curriculum, assessment, pedagogy, workload and teacher professional learning?

The debate about robots replacing teachers is not the main event. There will be assistants, in the form of dashboards for instance, but humans will still do all the things that machines cannot do.

Moreover, there is a great need for deeper understanding of learning analytics. There are also questions of opaque systems, bias in algorithms, and policy and governance questions around data ethics. Such topics could form foundational programs in teacher education courses.

More hard questions

What implications do AIEd and automated worlds have for school infrastructure? How can higher education and industry support schools to be responsive and supportive to this rapidly changing world of AI?

Leaping back to the London Festival of Learning for one moment: in his keynote address, Professor Paulo Blikstein, from Stanford University, painted a grim picture of the dangers that lie ahead and told his audience that it is time to ‘make hard choices for AIEd’.

He described a phenomenon that happens to researchers, which he calls We Will Take It From Here (WWTIFH): tech businesses tell researchers to ‘go away and play with their toys’ while they take over and develop the work technologically … taking over things “in the most horrible way”. Blikstein outlined how most tech companies use impenetrable algorithms and don’t consult the field; with few policy or ethical guidelines in the US overseeing decision making in these areas, it is a “dangerous cocktail”, described by Blikstein’s formula:

WWTIFH + Going Mainstream + Silicon Valley Culture + Huge Economic Potential = DANGER.

I agree with his caution: people in positions of power in teaching and learning need to be aware of the limitations of AI. It can help decision makers but not make decisions for them. This awareness becomes increasingly important as educational leaders interact and work more frequently with tech companies.

In teacher education in Australian universities we must begin to talk more about AIEd with those whom we teach and research. We should be thinking all the time about what AI really is, and not naïvely privilege AI over humans.

As you might sense, I believe this is a serious and necessary dialogue. There are many participants in the AIEd conversation and those involved in education at all levels in Australian schools have an important voice.

Dr Jane Hunter is an early career researcher in the STEM Education Futures Research Centre, Faculty of Arts and Social Sciences at the University of Technology Sydney. She was a classroom teacher and head teacher in schools both in Australia and the UK. Jane is conducting a series of STEM studies focused on building teacher capacity; in this work she leads teachers, school principals, students and communities to better understand and support education change. Her work in initial teacher education has received national and international recognition with a series of teaching awards for outstanding contributions to student learning. She enjoys writing and her research-based presentations at national and international conferences challenge audiences to consider new and alternate education possibilities. A recent podcast with Jane on AIEd can be heard here. Follow her on Twitter @janehunter01
