Commercialisation of private data collected in schools

Ban smart phones in schools. Not because they’re disruptive but because of this

Largely missing from the ban-phones-in-schools debate are the opinions of important regulatory bodies such as the Australian Competition and Consumer Commission (ACCC) and the Australian Human Rights Commission (AHRC). Although these bodies may seem far removed from the debate, I believe their views on data and data collection deserve to be heard.

Significantly, their views may support a ban of smart phones in schools. Not because of the disruption phones might cause in classrooms, but because of the extensive amounts of data being collected about children and young people as they go about their school day.

Advances in technology, such as Artificial Intelligence (AI), are partially enabled by access to data collected from smart phones. A smart phone is a data collection device. Globally, people are largely unaware of the scale and extent to which their data is being collected on such devices and subsequently used.

There are associated harms with this data collection and use that have yet to receive sufficient public debate.

The ACCC recently released its final Digital Platforms Inquiry report, which raises important points that I believe should have led the recent debate when the Victorian Education Minister, James Merlino, announced a ban on smart phones in Victorian schools.

This report, coupled with a major project on human rights and technology by the Australian Human Rights Commission and the Artificial Intelligence: Australia’s Ethics Framework (A Discussion Paper) by the CSIRO, provides a collective warning about the vast amounts of personal data being collected and the implications of that collection.

A ban on phones in schools will effectively limit the data collected on children by preventing them from using their personal devices during school hours. I see this as a far more important outcome, and a genuine need for schools.

Informed consent

The Consumer Policy Research Centre recently published an issues paper highlighting that current technology and technology being developed “can infer everything from personality, health status, and political affiliations, through to even our mood.” (p 16)

Children and young people are unable to give informed consent regarding their data and how it is used. We have recently seen this phenomenon discussed in relation to a viral ‘game’ where people can ‘age’ their photos, through FaceApp. The app’s terms of use give its Russian parent company, Wireless Lab, “a very broad, global and lifelong licence to use the images” it collects from users.

The ACCC suggests that “we are at a critical point in considering the impact of digital platforms on society” and the Australian Human Rights Commission (AHRC) states that “new technologies are already radically disrupting our social, governmental and economic systems”.

Parents are often dependent on the professional opinion of the school. The school often refers back to state and federal guidelines. Those guidelines, in turn, struggle to keep pace with changes in technology. How, then, can young people be expected to give informed consent?

Therefore, any physical barrier that limits the collection of children and young people’s data at school should be welcomed. It is much easier to police a physical item than to police the data it collects, or to trace how that data may be used.

The potential for harm is almost impossible to understand

There has been significant research into the harms associated with data. Many people critically investigating digital and commercial platforms are now calling for greater ethical debate surrounding commercialisation, data use and the use of such technologies. However, understanding the link between data and harm is almost impossible. As Paul Ohm, Professor of Law and specialist in information privacy at the University of Colorado, states:

We are embarking on the age of the impossible-to-understand reason, when marketers will know which style of shoe to advertise to us online based on the type of fruit we most often eat for breakfast, or when the police know which group in a public park is most likely to do mischief based on the way they do their hair or how far from one another they walk.

Data collected from smart phones can be aggregated and used in other contexts. Harms have been shown to be associated with politics and data-driven marketing campaigns, predictive policing, and decisions made in the judiciary. Such topics have also been raised by the Australian Council of Learned Academies (ACOLA) in its report on The Effective and Ethical Development of Artificial Intelligence.

Ignorance is no excuse

It could be argued that schools cannot be expected to know the harms associated with data collection via smart phones, as society at large is unaware.

The Artificial Intelligence: Australia’s Ethics Framework (A Discussion Paper) states:

Australians are largely unaware of the scale and degree to which their data is being collected, sold, shared, collated and used.

However, someone may well have to take responsibility in the future for allowing such vast amounts of personal data to be collected and arguably monetized during school hours. Should that happen, a lack of awareness or understanding may not be a valid excuse.

There is increasing coverage of the potential for harm across various forms of media. We have already had such discussions right here on the AARE blog, with Vast amounts of data about our children are being harvested and stored via apps used by schools and, just recently, Education shaped by big data and Silicon Valley. Is this what we want for Australia? Ignorance is no excuse.

This idea is supported by the CSIRO in its paper, Artificial Intelligence: Australia’s Ethics Framework (A Discussion Paper), which clearly states that ignorance is “unlikely” to be accepted as a defence:

Know the trade-offs in the system you are using. Make active choices about them that could be justified in the court of public opinion. If a poorly-designed AI system causes harm to the public, ignorance is unlikely to be an acceptable defense.

Banning phones does not mean removing technology

I am not arguing for the removal of technology from schools. What I am arguing for is the removal of personal devices in contexts where no other personal or social harms will result from their removal. I stress that each individual context must be considered.

By evaluating risks and minimizing harms holistically, schools could limit the use of personal data collection devices (smart phones) during school hours, without discounting individual needs.

What does the ACCC say about data collection?

Teachers can help young people navigate digital platforms, and I support calls for increased data infrastructure literacies, digital literacies and media literacies in K-12 settings. But navigating data once it has been de-identified and aggregated for other contexts is like trying to teach young people to navigate an unknown space, with unknown tools, for unknown outcomes.

This may be why the ACCC recommends significant amendments to the Privacy Act.

The ACCC produced the Digital Platforms Inquiry report to explore potentially adverse implications, including the impact of platforms on consumers in relation to their information.

Their findings show that Google and Facebook rely on a business model dependent on consumer engagement with the platform. The model collects data in order to sell advertising opportunities. Increased engagement means increased data, and the more data the platforms collect, the more revenue they can generate.

But data is not only used for advertising. It is also used to develop other apps and platforms, and further advance developments in AI.

The report highlights that consumers are unable to make informed choices regarding the amount of data that digital platforms collect and how the collected data is used. It also highlights that consumers cannot readily opt out of targeted advertising.

This is occurring at a time when data about how people learn is increasing in commercial value. The global ‘Artificial Intelligence in Education’ market is expected to grow by 40% in the next 5 years, with the Asia Pacific region expected to experience the largest growth.

Commercial innovation in AI should not come at the expense of informed consent in education.

The ACCC recommends that the Privacy Act provide students (and all consumers) with greater control over their personal information, calling for greater “protection against misuse of data and empowering consumers to make informed choices” (p. 35).

Strengthen consent requirements

Only half of the top 50 apps used in Australian primary schools in September 2017 highlighted compliance with various approaches to consent. The ACCC suggests that informed consent needs to be mandated within the Privacy Act. It recommends that data collection be pre-set to ‘off’, so that individuals must actively ‘opt in’ should they wish to have their data collected. However, teachers cannot be expected to police personal smart phones.
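The ACCC’s ‘off by default, opt in’ recommendation maps onto a familiar software pattern: every collection setting defaults to disabled, and nothing is gathered until the user actively changes it. A minimal sketch of what that might look like (the class and field names here are illustrative assumptions, not any real app’s settings):

```python
from dataclasses import dataclass

@dataclass
class ConsentSettings:
    """Hypothetical per-user consent flags, following the ACCC's
    recommended pattern: every form of data collection defaults to
    'off', and the user must explicitly opt in."""
    analytics: bool = False
    targeted_advertising: bool = False
    third_party_sharing: bool = False

# A freshly created user has opted in to nothing.
settings = ConsentSettings()
print(settings.targeted_advertising)  # False until the user opts in

# Opting in is an explicit, per-category action.
settings.analytics = True
```

The design choice the ACCC is pointing at is exactly this default: the burden of action sits with the data collector’s interface, not with a user hunting for an opt-out.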

Part of the larger picture

When considering the ACCC report, schools should also be aware of the Consumer Data Right Bill that is currently making its way through the Australian Parliament. This bill, if it passes, will provide individuals with the right to access data relating to them, as well as to authorize access to their data. The bill will pave the way for Australia’s future data economy.

What schools and school leaders could do

With schools becoming more and more like businesses in Australia, the opinion of regulatory bodies such as the ACCC should matter, as should the opinions of the AHRC and the CSIRO.

I believe schools and school leaders could, as a priority, develop data stewardship strategies and awareness campaigns while the associated legislation and policies are being developed. The task of protecting children from harms associated with data collection is challenging.

Banning phones in public schools, at least in part, makes this task easier. 

Janine Aldous Arantes is a PhD student at the University of Newcastle. She is researching how Australian K-12 teachers are negotiating apps and platforms as part of their educational practice. She can be found on Twitter: @Aldous2018.

Janine uses an avatar as her headshot because when talking about data collection and concerns about the use of metadata, Janine wanted to highlight the normalization and ubiquity of data collection. Using an avatar demonstrates how you can have some control over your data online.

Vast amounts of data about our children are being harvested and stored via apps used by schools

Electronic data is increasingly being collected in our schools without people being fully aware of what is happening.

We should be concerned about the amount of data being collected via apps and commercial software used by schools and teachers for varying reasons. We need to ask questions such as:

  • How is that data being stored and used?
  • How might the data be used in the future, particularly sensitive data about the behaviour of children?

We also need to ask about data being collected on teachers and schools.

  • Is the collection of data on individual students in fact allowing data to be collected on teachers and schools?
  • How might that electronic collation of data be used in the future?

Potential misuse and consequences for children

Recent times have brought issues about data and privacy into the public eye. A number of ‘data controversies’ have emerged, including breaches involving global giants like Facebook, Google and Amazon, as well as a security slip-up by the huge education platform Schoolzilla that exposed the test scores of up to 1.3 million students. These incidents reveal the risks of collecting human data and its potential misuse by the companies that store and use it.

A recent report published by the UK Children’s Commissioner also highlights the potential consequences for children. It reported that,

‘we do not fully understand yet what all the implications of this is going to be when they are adults. Sensitive information about a child could find its way into their data profile and used to make highly significant decisions about them, e.g. whether they are offered a job, insurance or credit’.

Many companies already use psychological profiling data to make decisions about who they employ. In the future they might find it valuable to view a behaviour profile developed through schooling to help assess an employee’s suitability.

An example: ClassDojo is accumulating sensitive data profiles on students, teachers and schools

ClassDojo is an extremely popular classroom management app designed to help teachers with school discipline and communication. What isn’t clear to many users is its voracious appetite for student data, or what happens to that data. Nor is it clear that data on teachers and schools is also being collected.

New research examining ClassDojo is raising concerns about how student data about behaviour may be collected, accumulated and then used.

Much like a traditional behaviour chart, ClassDojo is designed to give students feedback about their behaviour. Students are awarded positive and negative points to reinforce or discourage particular pre-selected behaviours.

However, unlike traditional behaviour charts, ClassDojo creates a long-lasting record of the data it collects. Because a behavioural report can be generated with the click of a button, busy teachers can easily create a permanent electronic or printed behaviour record.

As teachers monitor student behaviour by keeping electronic records, they are also creating a data set on their own behaviour over time. Collectively such student and teacher data records could be compiled for a school.

What data does ClassDojo collect?

Student behaviour in the classroom

The data gathered by ClassDojo to shape student behaviour includes:

  • behaviour performed (default behaviours are psychological character traits, e.g. grit)
  • how many times a particular behaviour has been performed
  • the date when the behaviour feedback was awarded
  • the point value that comes with the behaviour
  • who gave the feedback
  • how many ‘positive’ points a student has
  • how many ‘needs work’ points a student has, and
  • a calculated percentage score representing the per cent of positive points compared to total points received.

All this data is compiled and analysed to create behaviour reports on individual students and the whole class. Reports contain red and green colour-coded donut charts comparing the ‘positive’ versus ‘needs work’ behaviours. They also provide numerical statistics based on the data listed above, the main one being the percentage score designed to represent the behaviour quality of a student or class.
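The percentage score described above is, at its core, a simple ratio of ‘positive’ points to all points received. A rough sketch of that calculation (the function and parameter names are illustrative assumptions, not ClassDojo’s actual data model or code):

```python
def behaviour_percentage(positive_points: int, needs_work_points: int) -> float:
    """Per cent of 'positive' points out of all points a student
    (or class) has received -- a sketch of the kind of headline
    statistic a ClassDojo-style report presents."""
    total = positive_points + needs_work_points
    if total == 0:
        return 0.0  # no feedback recorded yet
    return round(100 * positive_points / total, 1)

# A student awarded 18 'positive' and 6 'needs work' points
# would be summarised as 75.0% positive.
print(behaviour_percentage(18, 6))
```

The simplicity of the arithmetic is part of the concern: a single headline number, stripped of context, is easy to compile into a report and easy to mistake for a measure of a child’s character.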

The big problem with ClassDojo reports on students

A major problem with creating reports like this is that they judge students on only a small number of behaviours that ‘count’. They ignore, and even deter, diversity. For example, teachers have to identify the behaviours they want students to exhibit so they can monitor them using ClassDojo. Default options include working hard, being on-task, and displaying grit. The list has to be limited to a number of behaviours the teacher can manageably track. The selected behaviours end up being the ones that count; others are ignored, thus promoting conformity.

Because these reports resemble psychometric reports, there is a concern that they may be collected by schools to create student behaviour profiles that follow students throughout their schooling.

Such reports could be used to make highly significant decisions about students, e.g. whether their ‘character’ profile is suitable for leadership roles, or whether they should take certain subjects.

Ultimately there is the potential that profiling in this way could influence decisions that limit or enhance future educational opportunities. We know from decades of research on the power of teacher expectations that this is an important consideration.

The vast amount of data collected by the company is a concern for all caught in the net

ClassDojo also collects a vast amount of personal data about its users including students, teachers, parents and school leaders. This data includes:

  • First and last names
  • Student usernames
  • Passwords
  • Students’ age
  • School names
  • School addresses
  • Photographs, videos, documents, drawings, or audio files
  • Student class attendance data
  • Feedback points
  • IP addresses
  • Browser details
  • Clicks
  • Referring URLs
  • Time spent on site
  • Page views
  • Teacher-parent messages

Moreover, ClassDojo says it ‘may also obtain information, including personal information, from third-party sources to update or supplement the information you provided or we collected automatically’.

The ClassDojo messaging function

ClassDojo also has a messaging function, which the company describes as a ‘safe way for a teacher and a parent to privately communicate’. This function raises further concerns for us about data privacy and profiling. ClassDojo Messaging enables teachers to send text, photos, stickers, or voice notes to parents, who can respond using text.

Adding to our concerns over the messaging function, ClassDojo states: ‘The content of all messages (including photos, stickers and voice notes) are stored. [and] … cannot be deleted by either the teacher or the parent.’

It remains unclear just how private such communication really is. While ClassDojo says it does not read these messages, it declares that school ‘district administrators can request [access to] messaging histories (plus Class/School/Student Story posts) by emailing [the company]’.

How safe is all of this?

So where does all this data collected by ClassDojo go?

Two of the third-party service providers involved are Amazon Web Services and MLab, companies ClassDojo uses to store data about its users. Amazon Web Services has a less than ideal record of keeping data stored on its servers secure; data breaches within Amazon Web Services have exposed sensitive information about thousands of GoDaddy and Accenture customers.

Because ClassDojo stores the data it collects outside of Australia, that data is not subject to Australian privacy law. One key difference is that under US law, companies can be forced to hand over hosted data to the government, and to do so secretly.

It’s time to take stock of the electronic data that is being collected in schools

So whilst apps like ClassDojo might be easy to use and friendly, schools need to carefully consider the potential consequences.

Too much sensitive data is being collected about our students and we need to stop and critically reflect on what is happening in schools.

We also need to be aware that by collecting data on students we are also creating data sets on teachers and schools. We do not know how such data sets could be used in the future.

For those interested in our research:  Jamie Manolev, Anna Sullivan & Roger Slee (2019) The datafication of discipline: ClassDojo, surveillance and a performative classroom culture, Learning, Media and Technology

Jamie Manolev currently studies and works at the School of Education, University of South Australia. Jamie does research in School Discipline, Digital Technologies and Primary Education. His current PhD research is investigating ClassDojo as a school discipline system. Jamie also works on the ‘School Exclusions Study’ and as a Research Assistant on the ARC Linkage funded ‘Refugee Student Resilience Study’.

Dr Anna Sullivan is an Associate Professor of Education at the University of South Australia. A/Professor Anna Sullivan is a leading expert in the fields of teachers’ work and school discipline. She is committed to investigating ways in which schools can be better places. She has extensive teaching experience having taught in Australia and England and across all levels of schooling. A/Professor Sullivan has been a chief investigator on numerous Australian Research Council Linkage grants.

Roger Slee is Professor of Inclusive Education at the University of South Australia. He is the former Deputy Director-General of Queensland Department of Education, Founding Editor of the International Journal of Inclusive Education and Journal of Disability Studies in Education, and held the Chair of Inclusive Education at the Institute of Education University of London.