Smart glasses are the latest shiny object in the edtech world. Sleek, AI-powered, and promoted as the next evolution in learning, they promise to transform the classroom. Real-time feedback. Immersive experiences. Personalised accessibility support. But here’s the thing: they also record. They see. They store data. And they’re being quietly rolled out in schools, because anyone can go to the Ray-Ban website, pay $500 and have them delivered to their door – even if they live across the road from a school. Without a serious national conversation about what’s at stake, there are some critical questions we believe need your attention.
Some may enthusiastically praise these devices, painting a picture of tech-enhanced chemistry labs and accessible support for neurodiversity. Exciting. Useful, even. But check whether they mention ethics. (Scroll through the Ray-Ban website…)
Does it mention rights?
Does it mention harm?
What they do and don’t mention matters
Once a device can record, it can surveil. It can be used to monitor behaviour, capture images without consent, and stream content live to platforms beyond the classroom. In the hands of the wrong user, smart glasses aren’t just learning tools – they’re tools of manipulation, misuse, and control. And remember – anyone can buy smart glasses. This is a very different context from CCTV footage in schools. To explain, let’s not be naive about who’s influencing young minds right now.
The same students, staff, parents and onlookers who might be wearing smart glasses may also be influenced by Andrew Tate on TikTok, YouTube, and Instagram. Imagine this: you’re teaching Maths. A student parrots Andrew Tate’s misogynistic views and live streams your response through the smart glasses they were wearing. You had no idea. It’s not like they held up their phone. They were just looking at you, through their glasses. Within minutes, it’s online, weaponised, and fed to the Tate army. They didn’t mean to destroy you or your capacity to feel safe teaching. But they did.
You leave teaching.
Not because you wanted to.
But because the damage was done.
Are you being recorded?
Now, even walking the streets feels uneasy – you are left wondering if the glasses people are wearing are quietly recording you while you buy groceries or cross the road. Other teachers start to wonder. What if parent-teacher interviews had smart glasses? Or the swimming carnival? What if someone is just sitting outside the school with a pair of glasses on while kids are playing? They don’t have their phone out, so their activity doesn’t trigger concern. This isn’t speculation. We’ve already seen how images can quickly be used to make AI-generated deepfake nudes of girls in schools. And teachers aren’t exempt. What smart glasses do is lower the barrier between thought and action.
They offer immediacy. Stealth. Power.
We’ve already seen smart glasses banned from ATAR exams in WA. But what about banning them from parent-teacher interviews? PE lessons? Swimming carnivals? Where are the boundaries? And while we are asking questions – who is collecting the data? And where is it all going? How does it align with current and emerging legislation? All of this is being marketed under the guise of innovation. But innovation without ethical frameworks can be weaponised. Smart glasses do not exist in a vacuum.
They exist in a world shaped by misogyny, online abuse, discrimination and algorithmic amplification of harm. If we ignore that – if we look only at the marketing promises and not at the sociocultural context – we are putting not only students, but teachers, parents, and society at risk of harm. We need to stop treating “real-time feedback” as neutral. We need to stop pretending “immersive” means safe. And we need to seriously question who benefits from “innovation” when surveillance is embedded in the hardware and marketed by people with millions of followers, like Chris Hemsworth.
Let’s be clear: this is not just about a gadget.
Outside of schools, smart glasses are marketed as sleek, cutting-edge tools designed to enhance everyday life, work, and productivity. In the consumer market, they’re promoted as lifestyle wearables that offer hands-free access to navigation, messaging, music, and AI assistance – all wrapped in fashionable, discreet frames. In education, smart glasses are being marketed as inclusive and dynamic. But in practice, they are building out a surveillance infrastructure inside schools. Anyone with smart glasses (students, parents, teachers, the person sitting outside the school) may soon have access to real-time facial recognition, eye tracking, emotion analysis, and real-time data sharing. That’s not innovation. That’s infrastructure. Infrastructure of the kind we legislate around precisely to ensure our rights are upheld.
Which Brings Us to Chris Hemsworth
You’ve probably seen the ads. Chris Hemsworth – superhero, Aussie icon, father of school-aged children – promoting AI-integrated smart glasses with enthusiasm and charm. He’s partnered with Ray-Ban to showcase how wearable AI is the future. But here’s the thing: when a celebrity of his influence endorses surveillance tech, especially without reference to consent or harm, it’s not just a missed opportunity. It’s reckless. Now, to be clear – this isn’t about criticising Chris Hemsworth – it’s a call to anyone with the power to shape public perception. If you have the platform, the reach, or the resources, you also have the responsibility to bring the potential harms of emergent technologies in education into the conversation. Because ignoring those risks, especially when kids, parents, and teachers are watching, can’t be excused as naïveté. You have a social responsibility to consider whether it is reckless.
And it is reckless
Reckless means acting without thinking about the potential consequences – especially when those actions could cause harm. That’s why we need an awareness campaign. Kids look up to Chris Hemsworth. So do parents. So do teachers. If smart glasses are going to be marketed to schools and families, there must be transparency about what they do and what they risk. That’s why we need to have a conversation with Chris. Not about banning the tech. But about being responsible with his platform.
Technology will continue to be marketed aggressively, but those with the power to influence and implement it must take far greater responsibility for its impact. Imagine if Chris Hemsworth read this and considered the perspective of a teacher. In the middle of a teaching crisis, a teacher is trying to deliver a science lesson on a sweltering Friday afternoon in a 35-degree classroom packed with 30 students, only to come home and discover that a slip of the tongue – saying “orgasm” instead of “organism” – has been turned into viral content in the manosphere. One more teacher doesn’t return to the classroom.
Where Do We Go from Here?
We are not saying “ban it all.”
We are saying: Pause. Reflect. Regulate.
We are also not blaming anyone. Because this isn’t about blame. It’s about responsibility. It’s our collective responsibility to adopt new technologies in ways that respect commonly held expectations of technology, especially in semi-private spaces. We already know not to film at swimming carnivals, in toilets, or in change rooms, and we wouldn’t wear our phones on our faces during a parent-teacher interview – but smart glasses would do exactly that. Imagine if you just ‘forgot to take the glasses off’… Further, teachers and schools shouldn’t be expected to manage the risks of tech like smart glasses alone. Meta, Ray-Ban, and others must embed safeguards and transparency through safety by design, privacy by design, and similar approaches. And those with huge platforms, like Chris Hemsworth, could use them not just to promote, but to help spark conversations about where this tech belongs.
What if Chris Hemsworth posted to his 52M followers that “Some devices are made for skydiving – not for schools” – would the conversation shift? We would love to hear your thoughts.
Bios

Janine Arantes is a researcher and educator at Victoria University and advocate exploring the social, ethical, and psychological impacts of emerging technologies in education. Andrew Welsman is a researcher and educator at Victoria University with expertise in STEM education, digital technologies, and initial teacher education.
Smart glasses have a bundle of features. Each feature needs to be addressed separately. In some cases no special action may be needed. As an example, the glasses can record audio and video. There are laws and educational institution rules about what you can record, when, and what you can do with the recordings. That a recording may be made by smart glasses, rather than a smartphone or a microcassette recorder, is not relevant.
Thank you for your comment. I appreciate your perspective and would like to offer an alternative view that I believe warrants closer consideration.
Today, for as little as $500, it is possible for individuals to livestream, record, translate, and interact with AI, often without the knowledge or consent of those being observed, including children. In light of this, I wonder whether it is sufficient to frame such capabilities as simply “a bundle of features,” comparable to those of a microcassette recorder, with the assumption that existing laws and rules will be appropriately followed.
Might we instead consider a perspective grounded in greater responsibility and awareness, particularly given the potential for harm and the evolving nature of these technologies?