November.18.2024

Teachers truly know students and how they learn. Does AI?

By Sue Ollerhead

Time-strapped teachers are turning to advanced AI tools like ChatGPT and Perplexity to streamline lesson planning. Simply by entering prompts like “Generate a comprehensive three-lesson sequence on geographical landforms,” they can quickly receive a detailed teaching program tailored to the lesson content, complete with learning outcomes, suggested resources, classroom management tips and more.

What’s not to like? This approach represents a pragmatic solution to educators’ overwhelming workloads. It also explains the rapid adoption of AI-driven planning tools by both schoolteachers and the universities that train them.  

And what do we say to the naysayers? Don’t waste your time raging against the machine. AI is here! AI is the future! 

Can AI know students and how they learn?

But what does wide-scale AI adoption mean for the fundamental skills and knowledge that lie at the heart of teaching – those that inform the Australian Professional Standards for Teachers? Take Standard 1, “Know students and how they learn”, and in particular focus area 1.3. It requires teachers to show that they understand teaching strategies that respond to the learning strengths and needs of students from diverse linguistic, cultural, religious, and socioeconomic backgrounds. Can AI handle this type of differentiation effectively?

Of course! Teachers simply need to add the following phrase to the original prompt: “The lesson sequence should include strategies that differentiate for students from culturally and linguistically diverse backgrounds”. Hey presto! The revised lesson sequence now incorporates strategies such as getting students to write a list of definitions for key terms, using scaffolding techniques, implementing explicit teaching, and allowing students to use their home languages from time to time.

Even better, AI can create a worksheet that includes thoughtful questions such as, “What are some important landforms in your home country?”, “What do you call this type of landform in your home language?” and so on. With these modifications, we have effectively achieved differentiation for a culturally and linguistically diverse classroom. Problem solved! 

Can AI deal with the mix?

Or have we? Can AI truly comprehend the complexities of diversity within a single classroom? Consider this scenario: you are a teacher in western Sydney, where 95 per cent of your class comes from a language background other than English (LBOTE). This is not uncommon in NSW, where one in three students belongs to this category.

Your class comprises a mix of high-achieving, gifted and talented individuals. Some are expert English users, while others are new arrivals who have been assessed as “Emerging” on the EAL/D Learning Progression. These students need targeted language support to comprehend the curriculum.

Your students come from various backgrounds. Some are Aboriginal Australian students, while others come from Sudan, China, Afghanistan, Nepal, and Bangladesh. Some have spent over three years in refugee camps before arriving in Australia, with no access to formal education. Others live in Sydney without their families. Some are highly literate, while others have yet to master basic academic literacy skills in English.

Going beyond

In this context, simply handing out a worksheet and expecting students to write about landforms in their “home country” can be an overwhelming and confusing task. For some students, being asked to write or speak in their “home language” while the rest of the class communicates in English may trigger discomfort or even traumatic memories related to the conflicts they have escaped. Recognising these nuances is essential for effective differentiation and raises important questions about whether AI can sufficiently navigate the complexities of such diverse classrooms. 

Teachers must go beyond merely knowing their students’ countries of origin; they need to delve into their background stories. This includes appreciating and encouraging the language and cultural resources that students bring to the classroom—often referred to as their virtual schoolbag. Additionally, educators must recognise that access to material resources, such as technology and reading materials, can vary significantly among students. Understanding how students’ religious backgrounds may influence their perspectives and engagement with the content is equally important. Only by taking these factors into account can teachers create a truly inclusive and responsive learning environment.

Then there’s the content itself. Teachers need to critically evaluate the content they plan to teach by asking themselves several important questions. These include: What are my own biases and blind spots related to this subject matter? What insights might my students have that I am unaware of? What sensitivities could arise in discussions about this content concerning values, knowledge, and language? Most importantly, how can I teach this material in a culturally and linguistically responsive manner that promotes my students’ well-being and achievement?

One overarching concern

All of these questions point to one overarching concern: Can AI truly address all of these considerations, or are they essential to the inherently human and relational nature of teaching?

Joseph Lo Bianco, an Australian linguist and emeritus professor of language and literacy education at the Melbourne Graduate School of Education, says the benefits of AI have been significantly overstated when it comes to addressing language and culture effectively in classroom teaching.

Although AI excels at transmitting and synthesising information, it cannot replace the essential interpersonal connections and subjectivity necessary for authentic intercultural understanding. The emotions, creativity, and personalised approaches essential for meaningful teaching and learning are inherently human qualities. 

AI, an aid, not a replacement

While AI tools like ChatGPT and Perplexity offer impressive efficiencies for lesson planning, they cannot replace the nuanced understanding and relational dynamics that define effective teaching in culturally and linguistically diverse classrooms. Teachers need to recognise that AI can aid in differentiation but lacks the capacity to fully comprehend students’ individual experiences, histories, and emotional landscapes. The complexities of student backgrounds, the significance of personal narratives, and the critical need for empathetic engagement cannot be reduced to algorithms. 

As we embrace AI in education, we must remain vigilant in advocating for a pedagogical approach that prioritises human connection and cultural responsiveness. Ultimately, teacher AI literacy should encompass not just the technical skills to integrate AI into classrooms but also the profound understanding of students as whole individuals, fostering an inclusive environment that values each learner’s unique contributions. In this way, we can harness the power of technology while ensuring it complements the irreplaceable art of teaching.



Sue Ollerhead is a senior lecturer in Languages and Literacy Education and the Director of the Secondary Education Program at Macquarie University. Her expertise lies in English language and literacy learning and teaching in multicultural and multilingual education contexts. Her research interests include translanguaging, multilingual pedagogies, literacy across the curriculum and oracy development in schools. 


8 thoughts on “Teachers truly know students and how they learn. Does AI?”

  1. Rachael Nicholson says:

    Thanks for your article. Just wondering what research you are referring to that indicates teachers are using AI? Dan Meyer (maths educator from the US) frequently posts about the very low uptake of AI by teachers in the US.

  2. Sue Ollerhead says:

    Thanks Rachael
    An AHISA (Association of Heads of Independent Schools of Australia) survey of its teachers’ use of AI reported that over 70 per cent of primary teachers and 80 per cent of secondary teachers were using generative AI tools in their work. Lesson planning or learning design was rated as the top AI-assisted task. This survey dates back to August 2023, so one could assume the uptake is even greater by now. In my work as a secondary teacher educator, my observations of AI use amongst teachers across government, independent and Catholic sectors certainly support these findings.

  3. Ania Lian says:

    Thanks. The quality of AI’s responses will depend on the quality of the information the AI is given. That teachers use AI is interesting because it is an opportunity to examine critically what we mean by the things we say. For example, if teachers indeed “Know Students and how they learn”, they can tell it to ChatGPT. If they know what differentiation means, they can tell it to ChatGPT. And so on. It would be a good start to explore what educators, both teachers and researchers, feed AI with and what difference the quality of that input makes.
    Ania Lian

  4. Sue Ollerhead says:

    Thanks Ania
    I completely agree. This is why I think we need to think long and hard about what AI-literacy means in teacher education. AI can only provide a quality lesson plan for a culturally and linguistically diverse classroom insofar as it is given a nuanced diversity profile to work with. This profile can only come from the teacher, and depends on their ability to get to know the students and what their strengths and needs are.

  5. Tom says:

    We set teachers tests to decide if they have the skills and knowledge required. If AI can pass the same tests, that should be enough to give us confidence that the tools are useful.

  6. Sue Ollerhead says:

    Thanks Tom
    As far as I know, there aren’t any tests designed yet to assess whether AI can ‘know’ the diversity profile of a specific class without meeting the students, i.e. their cultural backgrounds, histories, language proficiency, strengths and learning needs. That work needs to be done by the teacher.

  7. Szymon says:

    Standard 1.3, for example, “Know Students and how they learn”

    This is a standard, a measurement tool, or in other words a goal. It does not mean all humans meet it. Human bias is well documented.

    I would submit that we asked AI to drive cars, diagnose diseases, write essays … and it learned to do it. I hope AI can be more objective and intellectually present more often than humans because it is not limited by the human condition.

  8. Sue Ollerhead says:

    Thanks for your comment Szymon. But I’m not sure I agree that we’re in a zero-sum game here, where humans have to compete with AI or perish. My feeling is that there are some things that AI does really well (like mining endless quantities of information and expressing it in writing, images, etc.) and some things that humans (teachers) do better (like empathy, judiciousness, dialogue, etc.). Could the way forward possibly be for humans to work together with AI by bringing what we do best (i.e. our humanity) to bear on a phenomenon that is already far better than us at generating any number of pat answers to big questions? At the end of the day, we still need to use our human judgement to decide which answer is best. Just a thought 🙂
