
Student engagement data: what does it actually mean?

With the many distractions facing students today – laptops, smartphones and social media, to name just a few – it is not surprising that teachers want to keep track of what their students are doing in and outside of class. EdTech companies promise to track students’ engagement automatically and in real time, often through visually appealing dashboards with easily interpretable graphs. But what do these tools actually measure? And what can we read from those dashboards?

Engagement data, as I call them in a recently published article, can be found in Learning Management Systems (LMS) such as Moodle and Canvas. They can also be found in learning platforms commonly used in primary and secondary education, such as Education Perfect and MathSpace. Even the online library Epic! offers engagement data. Engagement data can be understood as any metric that claims to say something about students’ on-task behaviour, their interaction with a platform, or any predictive analysis based on those interactions. Engagement data differ from performance data, which are all about results; however, engagement data are sometimes correlated with performance data to ‘assess’ risk.

There are, roughly, five types of engagement data: time-on-task data, task completion data, contribution data, technical data and biometric data.

Time on task

Time-on-task data is based on the time students have spent on the platform. It is a common element in most LMS. Whether this is a good indicator of students’ engagement is questionable. Having a browser window or application open without being involved in learning activities would not be seen by many teachers as a sign of deep engagement. But on the platforms’ dashboards it is presented as such.
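To make the limitation concrete, here is a minimal, hypothetical sketch of how such a metric might be computed from session logs. The event names and log format are assumptions for illustration, not any real platform’s code; the point is that the calculation only sees open and close events, so an idle open tab counts the same as active work:

```python
from datetime import datetime

# Hypothetical session log, as a platform might record it.
# Nothing here distinguishes active work from an idle open browser tab.
session_log = [
    ("page_open", "2024-05-01 09:00:00"),
    ("page_close", "2024-05-01 09:45:00"),
]

def time_on_task(log):
    """Sum minutes between open and close events -- the typical dashboard metric."""
    fmt = "%Y-%m-%d %H:%M:%S"
    total = 0.0
    opened = None
    for event, ts in log:
        t = datetime.strptime(ts, fmt)
        if event == "page_open":
            opened = t
        elif event == "page_close" and opened is not None:
            total += (t - opened).total_seconds() / 60
            opened = None
    return total

print(time_on_task(session_log))  # 45.0 'minutes engaged', even if the tab sat idle
```

A dashboard built on this number would report 45 minutes of ‘engagement’ regardless of what the student actually did in that window.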

Engagement data on Education Perfect (source: help.educationperfect.com)

What got done

Task completion data seems a more straightforward metric: it gives teachers an indication of how many tasks their students have completed. However, much depends on how task completion is framed. Epic!, for instance, gives an overview of how many books students have ‘read’, yet students could simply have clicked through the pages without actually reading anything. Another problem with task completion data is that the sheer number of tasks a student has completed comes to stand for that student’s level of engagement. This is especially the case on platforms where the number of tasks is potentially unlimited, such as Education Perfect, where the system automatically generates new tasks based on students’ performance. Following this logic, the most engaged student is the one who has completed the most tasks.
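The same critique can be sketched in a few lines. The record fields below are invented for illustration (no real platform’s schema is implied); the metric simply counts completed tasks, so rapid clicking through counts exactly the same as careful work:

```python
# Hypothetical task records; 'seconds_spent' is invented for illustration
# and, tellingly, plays no role in the metric itself.
records = [
    {"student": "A", "task_done": True, "seconds_spent": 4},    # clicked straight through
    {"student": "A", "task_done": True, "seconds_spent": 6},
    {"student": "B", "task_done": True, "seconds_spent": 300},  # worked carefully
]

def completion_counts(rows):
    """Count completed tasks per student -- the quality of the work is invisible."""
    counts = {}
    for r in rows:
        if r["task_done"]:
            counts[r["student"]] = counts.get(r["student"], 0) + 1
    return counts

# By this logic the 'most engaged' student is A, who merely clicked through.
print(completion_counts(records))  # {'A': 2, 'B': 1}
```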


How reading engagement is measured on Epic (source: getepic.com)

Both contribution data and technical data are more common in the world of social media and online forums than in educational contexts. The first measures the number of posts by an individual student. Discussing the learning content, rather than merely completing learning activities, is then seen as an additional indicator of engagement. The LMS Brightspace is a particularly interesting case, as it puts contribution data in a sociogram, linking the most popular contributors with each other. It is an idea of ‘social learning’ that has more in common with chasing likes on Instagram and TikTok.


Contribution data on Brightspace/D2L (source: community.d2l.com) 

Are students users?

Technical data are rather peculiar metrics. They merely describe technical interactions with the platform, such as the number of times specific content has been viewed. These data seem to have very little to do with what educators and scholars generally understand as engagement. In the world of social media, however, they are very important: they determine the extent to which content can be monetised. This is called user engagement. But are students users? Increasingly so, as more education is delivered through commercial platforms that profit from licences sold to schools or individual students.

Biometric data is not common on the platforms currently used in Australian classrooms. But several experimental studies have looked at tracking students’ engagement by measuring brainwaves or tracking their eye movements. Influential organisations such as the OECD even promote these techniques to keep students engaged in the digital world. It seems to be becoming increasingly normalised to monitor students’ bodies. I have been a teacher myself, and I understand that teaching involves some surveillance and control. But as a colleague of mine once said, can we ask students to be engaged all the time? And do we always need to know when they are not?

The biggest problem

Perhaps the biggest problem with all these types of engagement data is that they frame engagement as something that is by definition measurable. Indeed, anything that cannot be measured cannot be put on a dashboard or used by algorithms. This ‘technological’ idea of engagement excludes other dimensions of engagement put forward by scholars, such as the emotional and cognitive ones. Engagement data do not show whether a student enjoys school, for instance. Nor do they show whether a student is motivated to do another task, or resilient enough to deal with setbacks.

Educational platforms, then, present a very narrow idea of what engagement is about. The question is: does this affect teachers? Does it change their perception of what engagement is? So far, little research has been conducted on this topic. With this article I hope to raise awareness of the issue. Let us all reflect on what kind of engagement we want to see in our students, and critically examine the metrics of engagement put forward by digital learning platforms.

Chris Zomer is a research fellow for the ARC Centre of Excellence for the Digital Child. His research interests include the datafication of learning, gamified learning platforms and the use of technology in education more broadly. For his doctoral thesis, he investigated how gamified learning applications reshape ideas, understandings and enactments of student engagement in a private girls’ school with an ‘academic’ student population.