To combat high failure and student drop-out rates, universities have developed strategies to monitor online student engagement through measurable activities. We explored if and how these monitoring activities accurately measure online engagement.
Perhaps our most surprising finding was that the teacher-education students in our study did not see their set online tasks as being valuable to their learning. The students complained about being given ‘busy work’ – tasks that appeared to be aimed at simply keeping them busy or at monitoring engagement through a metrics-based tool.
The students reported a number of other activities that did prompt their engagement in learning, but many of these would not be picked up by the usual ways of measuring engagement.
We believe our study and its findings will be particularly useful at the moment to teachers, in any sector, who are creating online learning activities for their students.
Our study
Our research study involved interviewing nine online third-year students (8 female, 1 male) from a four-year teacher-education degree at a regional university in Australia. Each student had been reported as being ‘highly engaged’ by their course coordinator. With their consent, they participated in fortnightly interviews throughout a 13-week semester. The aim was to find out more about what engagement meant for them, how they enacted engagement in the online space, both visibly and invisibly, and the factors that influenced their degree of engagement at different points in time. Interviews were held in the week prior to the start of semester, fortnightly during the semester and within two to three weeks of the semester’s end – eight interviews with each student in all.
We described a ‘highly engaged’ student as someone who consistently and reliably participated in discussion boards or other learning activities, collaborated with other online students, and engaged with the lectures/readings. Reflecting the typical online student profile, all were mature-age students in paid employment and with family/caring responsibilities.
Simplistic measures perceived by students as not useful to learning
From analysis of the interview data, we found that most students were critical of practices that were clearly designed to measure engagement in simplistic ways. These included
- being required to make a specific number of posts each week
- giving feedback to a certain number of other students
- completing ungraded online activities, such as quizzes, that did not add to learning.
While these conscientious participants diligently met these requirements, seven of the nine reported that such mandated posts and activities did not encourage true engagement and deep learning. They were described as being ‘a means to an end’, and ‘busy work’ designed simply to ‘try to make you fill the expected ten hours of study per week.’ The mandating of posts to prompt engagement was described as ‘ridiculous’ and as taking ‘a huge amount of time’, which they believed could have been spent differently to promote deeper learning.
Students experienced profound disappointment, and an even greater sense of having wasted their time, when their diligently crafted mandatory posts received no commentary or replies from either the teacher or other students. In addition, such mandatory posting tended to clog the online learning platform with discussion threads that lacked coherence and structure.
Activities reported as being valuable to learning
The students reported a number of other activities that did engage them in their studies but which, unfortunately, would not be captured by standard systems for measuring online engagement. These included
- engaging in learning with their peers on platforms other than those offered by the university, such as Facebook, Messenger or other social media, where they could meet other students and study
- following suggestions by lecturers or other students to do additional, relevant activities such as listen to TED talks, watch a YouTube video, or check out a curriculum resource
- learning activities that prompted their creativity and ultimately contributed to their final assessment task
- lecturers who used a diversity of approaches to learning in the online space
- well-designed, engaging assessment tasks
This study has unearthed some of the complexities that emerge when online engagement is measured in mechanistic ways. It also unveils alternative measures of engagement that might be more meaningful for promoting student learning. As such, this research contributes to a broader conversation about measuring engagement in the online space and can frame the direction for future research, practices, and policy on these matters.
Perhaps there is another way of understanding student engagement, one that is not tied up with metrics and monitoring. Engagement for online university students happens in many ways, both visible and more hidden. What if we changed our way of thinking about what engagement is? What if we listened to what students have to say about their own engagement?
We invite educators to move away from fixed ideas about where, how and when online students should be engaging, and we offer a critique of the superficial, descriptive, tick-the-box exercises that are usually designed to monitor engagement by computer rather than through human interaction. We hope educators will take this opportunity, when so many of us are moving to online teaching, to explore other ways of understanding student engagement in the online space.
For those who want more
Beyond busy work: rethinking the measurement of online student engagement
Cathy Stone, DSW (Research), is a Conjoint Associate Professor in Social Work at the University of Newcastle. Cathy is an Adjunct Fellow with the National Centre for Student Equity in Higher Education, where she undertook research into improving outcomes in online learning as an inaugural 2016 Equity Fellow. Cathy is currently an Independent Consultant and Researcher on the support, engagement and success of diverse student cohorts in higher education. She can be contacted for any questions or further discussion at cathy.stone@newcastle.edu.au. Cathy is on Twitter @copacathy
Naomi Milthorpe is Senior Lecturer in English in the School of Humanities at the University of Tasmania. Her research interests centre on modernist, interwar and mid-century British literary culture. Naomi is the author of Evelyn Waugh’s Satire: Texts and Contexts (Fairleigh Dickinson University Press, 2016) and the editor of The Poetics and Politics of Gardening in Hard Times (Lexington, 2019). Naomi is on Twitter @drmilthorpe
Dr. Janet Dyment is the Director of the School of Education at Acadia University in Nova Scotia, Canada. Prior to her move to Acadia, she spent 20 years at the University of Tasmania in the Faculty of Education. Janet’s research interests include online teacher education, student engagement, environmental education and education for sustainability. With the recent COVID pandemic, Janet is leading her new teacher-education team to reimagine on-campus offerings as remote delivery options and encouraging her staff to ensure student engagement remains high in these new modes of delivery.
As a former online student for seven years, I don’t find it surprising that students considered set online tasks ‘busy work’. It was frustrating to be told to do stuff for no apparent reason. So what I now do with course design is explain to the students why they are being asked to do a task, and how it will contribute to their learning (and their grade). The way I do this is with top-down design of the assessment, linked to tasks for the students, so it is clear to the student at each step how the learning task relates to the skills and knowledge they must be able to demonstrate in the assessment.
Requiring students to use tools outside the institution’s learning system creates a moral hazard for teachers. By doing so they place students at risk. The approach I suggest is to offer tools within the environment, which students can use if they wish.
Thanks Tom for adding this comment. Receiving this type of clear explanation about how tasks were relevant to the learning outcomes was exactly the sort of thing that the students in this study appreciated! And yes, good point about offering what they need, in terms of technology tools, within the learning environment. Apart from the possible risks you mention, it is so much easier and less time-consuming for them if they don’t have to go looking. Our students in the study appreciated suggestions about extra things they could do if they had time, as long as these were not compulsory.
This was really interesting, thank you. Just pointing out a central contradiction here: students are upset when no one responds to their posts, yet regard having to respond to a certain number of other students’ posts as a waste of time? Peer learning can’t be all take and no give.
Did you call your participants out on this?
As a former online student myself, I am always interested, now as a lecturer, to see how some students engage generously, meaningfully and enthusiastically with others’ posts, way beyond minimum requirements, while some don’t bother at all. Maybe, as with many things in life, you get out what you put in. I’ll look forward to discussing your post with my students – great for debate, thank you!
Thanks for your comment, Lucinda! We didn’t query our students on this point – we were trying to get a sense of what did or didn’t engage them, as a first step towards a more holistic picture of online learning. I think the point you’ve raised here is definitely important, but less of a contradiction than a link: they saw posting/responding as arbitrary in part because there seemed to be no response from their lecturers, and in part because it seemed more mechanistic (x number of posts required) than truly meaningful.
You’re right that you get out what you put in – motivation & self-direction are so important in online learning, and they’re really bolstered when teachers make those opportunities for interaction online (w/ peers and materials) really dynamic & engaging. What the students in our study underscored was how much effort they were putting in, even though sometimes that effort wasn’t visible in the online system!
We’d be keen to hear what your students make of our study – thanks for taking it to them.