Largely missing from the ban-phones-in-schools debate are the opinions of important regulatory bodies such as the Australian Competition and Consumer Commission (ACCC) and the Australian Human Rights Commission (AHRC). Although these bodies may seem far removed from the debate, I believe their views on data and data collection deserve to be heard.
Significantly, their views may support a ban on smartphones in schools: not because of the disruption phones might cause in classrooms, but because of the extensive amounts of data being collected about children and young people as they go about their school day.
Advances in technology, such as Artificial Intelligence (AI), are partially enabled by access to data collected from smartphones. A smartphone is a data-collection device. On a global scale, people are largely unaware of the size and extent of the data being collected on such devices and how it is subsequently used.
This data collection and use carries harms that have yet to receive sufficient public debate.
The ACCC has recently released its final Digital Platforms Inquiry report, which raises important points that I believe should have led the recent debate when Victorian education minister James Merlino announced a ban on smartphones in Victorian schools.
This report, coupled with a major project on human rights and technology by the Australian Human Rights Commission and the Artificial Intelligence: Australia’s Ethics Framework (A Discussion Paper) by the CSIRO, provides a collective warning about the vast amounts of personal data being collected and its implications.
The ban on phones in schools will effectively limit the data collected on children by not allowing them to use their personal devices during school hours. I see this as a far more important outcome, and a real need for schools.
Informed consent
The Consumer Policy Research Centre recently published an issues paper highlighting that current technology and technology being developed “can infer everything from personality, health status, and political affiliations, through to even our mood” (p. 16).
Children and young people are unable to give informed consent regarding their data and how it is used. We have recently seen this phenomenon discussed in relation to a viral ‘game’ where people can ‘age’ their photos, through FaceApp. The app’s terms of use give its Russian parent company, Wireless Lab, “a very broad, global and lifelong licence to use the images” it collects from users.
The ACCC suggests that “we are at a critical point in considering the impact of digital platforms on society” and the Australian Human Rights Commission states that “new technologies are already radically disrupting our social, governmental and economic systems”.
Parents are often dependent on the professional opinion of the school. The school often refers back to state and federal guidelines. Those guidelines consistently struggle to keep pace with changes in technology. How, then, can young people be expected to give informed consent?
Therefore, any physical barrier limiting the collection of children and young people’s data at school should be welcomed. It is much easier to police a physical item than the data it collects, or to trace how that data may be used.
The potential for harm is almost impossible to understand
There has been significant research into the harms associated with data. Many people critically investigating digital and commercial platforms are now calling for greater ethical debate surrounding commercialization, data use and the use of such technologies. However, understanding the link between data and harm is almost impossible. As Paul Ohm, professor of law and specialist in information privacy at the University of Colorado, states:
We are embarking on the age of the impossible-to-understand reason, when marketers will know which style of shoe to advertise to us online based on the type of fruit we most often eat for breakfast, or when the police know which group in a public park is most likely to do mischief based on the way they do their hair or how far from one another they walk.
Data collected from smartphones can be aggregated and used in other contexts. Harms have been shown to be associated with data-driven political and marketing campaigns, predictive policing, and decisions made in the judiciary. Such topics have also been raised by the Australian Council of Learned Academies (ACOLA) in its report on The Effective and Ethical Development of Artificial Intelligence.
Ignorance is no excuse
It could be argued that schools cannot be expected to know the harms associated with data collection via smart phones, as society at large is unaware.
The Artificial Intelligence: Australia’s Ethics Framework (A Discussion Paper) states:
Australians are largely unaware of the scale and degree to which their data is being collected, sold, shared, collated and used.
However, someone may well have to take responsibility in the future for allowing such vast amounts of personal data to be collected and arguably monetized during school hours. Should that happen, a lack of awareness or understanding may not be a valid excuse.
There is increasing coverage of the potential for harm across various forms of media. We have already had such discussions right here on the AARE blog, with Vast amounts of data about our children are being harvested and stored via apps used by schools and, just recently, Education shaped by big data and Silicon Valley. Is this what we want for Australia? Ignorance is no excuse.
This idea is supported by the CSIRO in Artificial Intelligence: Australia’s Ethics Framework (A Discussion Paper), which clearly states that ignorance is “unlikely” to be accepted as a defense:
Know the trade-offs in the system you are using. Make active choices about them that could be justified in the court of public opinion. If a poorly-designed AI system causes harm to the public, ignorance is unlikely to be an acceptable defense.
Banning phones does not mean removing technology
I am not arguing for the removal of technology from schools. I am arguing for the removal of personal devices in contexts where their removal will not cause other personal or social harms. I stress that the individual context must be considered.
By evaluating risks and minimizing harms holistically, schools could limit the use of personal data collection devices (smart phones) during school hours, without discounting individual needs.
What does the ACCC say about data collection?
Teachers can help young people navigate digital platforms, and I support calls for increased data infrastructure literacies, digital literacies and media literacies in K-12 settings. But navigating data once it is de-identified and aggregated for other contexts is like trying to teach young people how to navigate an unknown space, with unknown tools, for unknown outcomes.
This may be why the ACCC recommends significant amendments to the Privacy Act.
The ACCC produced the Digital Platforms Inquiry report to explore potentially adverse implications of digital platforms, including their impact on consumers in relation to their information.
Its findings show that Google and Facebook rely on a business model that depends on consumer engagement with the platform. The model collects data in order to sell advertising opportunities. Increased engagement means increased data, and the more data the platforms collect, the more revenue they can generate.
But data is not only used for advertising. It is also used to develop other apps and platforms, and further advance developments in AI.
The report highlights that consumers are unable to make informed choices regarding the amount of data that digital platforms collect and how the collected data is used. It also highlights that consumers cannot readily opt out of targeted advertising.
This is occurring at a time when data about how people learn is increasing in commercial value. The global ‘Artificial Intelligence in Education’ market is expected to grow by 40% in the next five years, with the Asia Pacific region expected to experience the largest growth.
Commercial innovation in AI should not come at the expense of informed consent in education.
The ACCC recommends that the Privacy Act provide students (and all consumers) with greater control over their personal information, by calling for greater “protection against misuse of data and empowering consumers to make informed choices” (p. 35).
Strengthen consent requirements
Only half of the top 50 apps used in Australian primary schools in September 2017 highlighted compliance with various approaches to consent. The ACCC suggests that requirements for informed consent need to be mandated within the Privacy Act. It recommends that data collection be pre-set to ‘off’, so that individuals must ‘opt in’ should they wish to have their data collected. However, teachers cannot be expected to police personal smartphones.
Part of the larger picture
When considering the ACCC report, schools should also be aware of the Consumer Data Right Bill currently making its way through the Australian Parliament. This bill, if it passes, would provide individuals with the right to access data relating to them, as well as to authorize access to their data. The bill would pave the way for Australia’s future data economy.
What schools and school leaders could do
With schools becoming more and more like businesses in Australia, the opinion of regulatory bodies such as the ACCC should matter, as should the opinions of the AHRC and the CSIRO.
I believe schools and school leadership could, as a priority, develop data stewardship strategies and awareness campaigns while the associated legislation and policies are being developed. The task of protecting children from harms associated with data collection is challenging.
Banning phones in public schools, at least in part, makes this task easier.
Janine Aldous Arantes is a PhD student at the University of Newcastle. She is researching how Australian K-12 teachers are negotiating apps and platforms as part of their educational practice. She can be contacted via janine.arantes@uon.edu.au or found on Twitter: @Aldous2018.
Janine uses an avatar as her headshot because when talking about data collection and concerns about the use of metadata, Janine wanted to highlight the normalization and ubiquity of data collection. Using an avatar demonstrates how you can have some control over your data online.
If students, or their parents, are concerned about smartphones being an invasion of privacy, they could take measures against this, or stop using the phones. It is not for schools to impose these requirements on students, or their parents, unless this is adversely impacting education.
Rather than wasting their time trying to stop phones in schools, teachers could sponsor research to minimize the harmful effects and maximize the benefits. For example, a school mode could be developed which would switch off distractions on the phone and suspend tracking.
University students now learn more outside the classroom, aided by their communication devices, than they do in class. This trend will trickle down to schools. Teachers can either learn how to incorporate this approach in their teaching, or be marginalized by the technology.
Thanks Tom, this topic certainly has multiple perspectives. It’s a great idea to suspend tracking during school hours – I’d love you to expand on that. What would this look like, and how could students be prevented from gaming the ‘mode’?
This blog shifts the perspective away from the ‘individual student / parent / teacher being accountable for data’ debate. It presents what Australia’s consumer watchdog has recommended. The ‘trickle down’ to schools has already occurred and there are many approaches such as digital literacies, managing digital footprints and so on that are well established. The ACCC perspective provides an alternative view to the current discourse for schools to consider. For example, they suggest that the potential future definition of ‘personal information’ may include IP addresses and device identifiers. Therefore, school managed devices vs personal smart phones may pose slightly different risks when considering the ACCC recommendations.
Given that they recommended some pretty big changes to consumer protection laws, privacy laws and unfair trading practices, if schools know this, they have an opportunity to join the current discussion regarding the myriad of commercial apps in the K-12 space. That is, the Australian government acknowledges that reforms are needed and there is a 12-week consultation period occurring now.
This blog presents an opportunity for teachers / schools to have their voices heard regarding data collection, use etc – as long as schools know about it…
Janine, it should be possible to design an app which would switch the student’s phone into “study mode” when on campus. This would block most tracking and communication (except with parents) and limit the use of non-educational apps.
If you wanted such an app you could get the computing students at the nearest tertiary institution to build it. I teach teams of students doing such projects in the ANU TechLauncher program.
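The “study mode” Tom describes could be sketched as a simple policy check. A minimal illustration follows; everything in it (the app names, the allowlist, the parent number) is an invented assumption for demonstration, not a real implementation of any school’s policy:

```python
# Hypothetical sketch of a "study mode" policy: while a student is on
# campus, only allowlisted educational apps run and only parent contacts
# get through. All names and numbers below are invented examples.

ALLOWED_APPS = {"calculator", "ebook_reader", "school_lms"}  # assumed allowlist
PARENT_CONTACTS = {"+61400000001"}  # assumed registered parent number

def study_mode_allows(on_campus: bool, app: str = None, caller: str = None) -> bool:
    """Return True if the app launch or incoming call should be permitted."""
    if not on_campus:
        return True  # study mode only applies during school hours on campus
    if caller is not None:
        return caller in PARENT_CONTACTS  # calls: parents only
    return app in ALLOWED_APPS  # apps: educational allowlist only
```

A real version would hinge on exactly the questions raised in the thread: who decides the allowlist, and how the on-campus state is detected and protected from being “gamed”.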
The ANU TechLauncher looks great. Thank you.
One challenge I could see with blocking or limiting the use of non-educational apps is the definition of ‘educational’ and ‘non-educational’. Likewise, the term ‘communication’. For example, there is a social learning app called Edmodo that once referred to itself as the Facebook for Education. It fundamentally provides a communication platform in K-12. It’s a free commercial app used by millions of teachers and students around the world, and the data collected on the platform appears to be provided to others through a ‘partners and developers program’.
So, is Edmodo ‘educational’? Yes.
Would it function if the data collection was blocked and communication on it ceased? Probably not.
What would be an interesting app, would be something that could capture past data to visualize where data is likely to go. For example, when a teacher considers trialing an educational app, they are asked the standard ‘Have you read the privacy / cookies policy?’ What if they were also provided a visualization that infers how their data may be used, based on the past data collected on other teacher engagement with various apps? Do you think this might be possible or is tracking how de-identified data is being used quite challenging?
I’d be really keen to see something like that myself. I’m off now to look more into the Techlauncher!
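The inference Janine describes above could be sketched, at its simplest, as counting where data from similar apps has gone in the past. The flow records below are entirely invented placeholders; a real tool would face the hard problem she names of tracing de-identified data at all:

```python
# Hypothetical sketch: given past records of where data from similar apps
# ended up, rank where a new app's data is likely to go. The apps and
# "recipients" here are invented examples, not real data flows.
from collections import defaultdict

PAST_FLOWS = [
    ("quiz_app_a", "ad_network_x"),
    ("quiz_app_b", "ad_network_x"),
    ("quiz_app_b", "analytics_y"),
    ("lms_app_c", "analytics_y"),
]

def likely_recipients(category_apps):
    """Count how often data from apps in a category reached each recipient."""
    counts = defaultdict(int)
    for app, recipient in PAST_FLOWS:
        if app in category_apps:
            counts[recipient] += 1
    # most frequent recipients first - the raw material for a visualisation
    return sorted(counts.items(), key=lambda kv: -kv[1])
```

For the two quiz apps above, this would rank `ad_network_x` ahead of `analytics_y` – the kind of ranked list a teacher-facing visualisation could draw from.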
Janine, it would be necessary to balance the educational value of a particular app against the privacy and other risks in deciding if students can use it. If none of the available applications are suitable, you could have a team of university students write you a new one. Keep in mind that many universities which teach computing have teams of students designing apps, not just ANU. Universities also run entrepreneurial programs which help design such products. One example of an educational application developed with the help of students is “OKRDY”, a mentorship and skilled-volunteer matching platform. I mentored the team of students who won the Innovation ACT competition with OKRDY.
Thanks again Tom. Agreed, to balance the educational value of a particular app, against the data collection, use and sharing requires teachers knowing the multiple perspectives out there. OKRDY looks like a great innovation to encourage cross collaborations.
Hi Janine,
thanks for the article, I was hoping to see something on this from you. You bring a valued perspective to an important and complex issue for contemporary education.
I suspect schools are generally playing catch-up (from a long-way back) when it comes to policy and practice in this space. However, as you argue, it is important that education systems and the schools within them think carefully about the potential harms that come with using digital platforms and other data hungry products.
Hi Jamie,
Thanks for the feedback. Hopefully the timing of this blog provides practitioners a space to be involved in the development of policy and practice in this space.
If we take this idea and run with it a number of problematic issues arise. In poor schools the funding is dedicated to minimising disruption to learning by increasing the number of adults in classrooms. This means technologies get neglected. In wealthier schools there is an opportunity to BYOD and these devices connect to a filtered wifi during school hours. Great, except that poorer students do not have the capacity to buy their own tablet/laptop devices for school. Quite often the way these students connect is through cheaper personal devices. This is particularly true in Aboriginal Communities.

Schools that ban phone use typically ask students to hand their phones in at the front office or to teachers, making the school responsible if there is any theft. Even when phones are locked away for the day, students typically bypass this by having the phone they hand in and another which they use in secret. When teachers notice phone use they are then immediately set at odds with students and parents, because young people feel their rights are infringed by not being able to use their devices. The only response is confiscation, making the relationship combative and win/lose. Even if sites state not to bring devices to school at all, this still needs enforcing and policing.

And what of students with long bus rides home whose parents rely on their phones to be able to track them and make sure they are safe? Have you ever left your phone at home? How disabled do you feel without it? In an age of big data and ubiquity that is moving faster than the typically behemoth rates of school and government, students are left in a limbo where they rely on their devices for personal safety and to fill the funding gaps in sites that do not have the community capacity to create access to reliable technology. A ban may seem logical but it merely transfers the problem elsewhere and creates a new set of other more complicated ones.
We need smart technical solutions to this complex issue.
Thanks Karen, I appreciate your feedback here. The discrepancies between hardware in schools, student-teacher relationships and safety in relation to the phone ban have (rightly so) received significant debate. The aim of the blog was to highlight other perspectives that had not received as much, if any, attention. That is, there are both ‘front end’ and ‘back end’ issues to be considered, and the ‘back end’ was largely not being discussed. Your points are both valid and significant, and I thank you for reminding people of the other perspectives to be considered. I hope our combined perspectives highlight the complexity of personal devices in schools, from both the front and back end.
Does the absence create an unintended risk of no teaching of self-agency? I am thinking about wearables and other ‘smart’ tech. Is the data trail already significant outside of a school use context, so that a ban would have minimal benefit? Inside and outside school, data trails would be highly variable. Are data stewardship strategies the responsibility of educators? What would they base this on? Schools have complete visibility into what students do in class or at home on school devices: web-browsing history, all files, downloads, research notes, and emails, with timestamps and often location attached to every activity. I am glad you wrote this article because it is going to require significant, diverse dialogue to consider the risks, benefits and bias associated with this area. I wonder what young people would think about data collection and what they would consider no-go zones? Many are savvy in their choices and many more are not. It raises the issue of ethics on behalf of tech giants and start-ups, legislation as
Great question PKCC1, re: Does the absence create an unintended risk of no teaching of self-agency? I would think that banning the phones encourages greater discussion about the ubiquitous nature of data collection, i.e., why would we want to ban phones when there are so many conflicting opinions? So, yes, as the data trails from wearables and other ‘smart’ tech are already significant outside of a school use context, a ban may illuminate any obfuscation that is occurring in the normalization of data collection across multiple personal devices within school. In regard to data stewardship strategies for students, I think the various state departments are all working on this, as you have highlighted. Data stewardship, however, would also include teacher data. Teachers are trialing many apps and platforms on their personal devices before they decide on the ‘right one’. They are also receiving advertising on their personal devices, arguably based on their previous searches for work purposes. I’m sure teachers don’t want schools to have complete visibility into what they are doing, but guidelines on when to clear cookies and how to document which apps they have trialed, and the time spent trialing them, would be worth considering in any future discussions.
Thanks for the feedback, I agree multiple perspectives are needed here.