
Decisions, decisions: Why do teachers feel time poor?

The first school term for 2025 is ramping up, and many teachers are returning to complex and tiring – if extremely important and fulfilling – work. A key part of this work is making decisions: from long-term, considered decisions to those that occur ‘in the moment’, consciously or subconsciously, during classroom interactions. Indeed, there’s a common understanding that teachers make a lot of decisions. In the 1960s, sociologist Philip W. Jackson estimated the number to be about 1500 in a single day.

But while Jackson was interested in documenting ‘life in classrooms’, he was not really focused on how decision making is experienced by teachers, or how it might be a factor in understanding recent, concerning reports of work overload and intensification. Indeed, most scholarly work on teacher decision making so far has positioned it as part of what makes teaching effective; as something that changes over time with growth in professional knowledge; and/or as a resource – a source of professional control and autonomy.

In our research, we asked whether decision making might be part of the subjective intensity of teaching work. To do this we used an app developed for the Teachers and Time Poverty project. The app asked teachers to report on the number of decisions made within a time-sampled 30-minute period, and the stakes and time pressure associated with those decisions. In a recent chapter for a book two of us edited on time poverty, we present the decision making data from a trial of the app, in which 138 teachers reported on 280 30-minute timeslots.

How many decisions?

In our trial, most responses (189 of the 280 timeslot reports, or 68%) estimated that 30 decisions or fewer had been made within the assigned 30-minute period (with the most common response being 11-20 decisions, and the average falling in the 21-30 band). This is somewhat low, if we consider Jackson’s estimation of 1500 a day, which would equate to at least 130 decisions in half an hour. This result may be because decisions that become automatic are harder to recall, and/or because stressful or complex situations may make it harder to recall the process of making a decision. Importantly, Jackson was observing teachers – not doing the teaching himself and trying to self-report his decision making.
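As a rough sense-check of that conversion (the length of the teaching day here is an assumption for illustration, not part of Jackson’s estimate): spreading 1500 decisions across roughly five and a half hours of class time, or eleven half-hour blocks, gives

$$\frac{1500\ \text{decisions per day}}{11\ \text{half-hour blocks}} \approx 136\ \text{decisions per half hour}$$

which is well above the 30 or fewer that most of our respondents reported.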

How pressured were these decisions?

Pressure to make decisions quickly, and to make high-stakes decisions, was measured using a scale from 1 to 7, where 1 was ‘not at all’ and 7 was ‘to a great extent’. In terms of pressure to make decisions quickly, most responses fell in the range of 4-7 out of 7. Leaders reported more pressure (83% of responses in the range of 4-7) than teachers (71%).

In terms of pressure to make high-stakes decisions, responses were more evenly distributed. Leaders tended to report greater pressure here (67% in the range of 4-7), compared to teachers (48%).

These findings around decision making pressure suggest that it’s not just the number of decisions, but the nature of those decisions, that contributes to teachers’ and school leaders’ experiences of working time.

Do teachers have enough time?

A further question we consider in our chapter is whether participants felt they had enough time to complete everything they intended to do within the 30 minutes they were reporting upon. Responses from over half the group (58%) tended toward ‘not at all’, with 19% selecting 3, 23% selecting 2, and 16% selecting 1 out of 7.

Is this unusual?

We also asked teachers how typical their day was overall. The majority of responses confirmed that theirs had been more or less a typical day, with a median response of 5 on the 7-point scale. This indicates that not having enough time to do all they need to do, and needing to make decisions quickly – some of which are high-stakes – is a commonplace experience in teaching. Teachers also reported undertaking a very wide range of activities during their allocated 30-minute time slots, including face-to-face teaching, preparation and administration, student wellbeing responsibilities, and other activities outside the classroom – and often more than one of these categories within the same 30-minute block. We wonder if this ‘typical’ kind of variability, including as it relates to decision making, may be a further dimension of the intensity of teachers’ working time.

Decision making and time poverty

Our work sees decision making not in terms of how teaching works and how to make it work better, but instead as part of how it is experienced: a window into understanding the texture of teachers’ time at work. The data we gathered indicate a clear sense of participants feeling rushed and not having ‘enough’ time, with decision making experienced as consistently, if not evenly, pressurised (both in time and stakes), and conducted across a wide range of activities.

Our analysis therefore contributes to our broader argument in the Teachers and Time Poverty project that time poverty for teachers is not simply about a lack of available ‘clock time’, but rather about how the time teachers currently spend at work is constituted, and the considerable variability of this.

Complexity

This highlights the complexity of what teachers do: the wide range of tasks they undertake, the kinds of decision making these demand, and the ‘typical’ unevenness and lack of predictability within which teachers must make these decisions. We think this might be a key part of what makes teaching such an exhausting (albeit worthwhile and fulfilling) job. It also points to why ‘quick fixes’ like a little less playground duty, or fewer after-school meetings, cannot, on their own, solve the enduring problem of teacher time poverty.

Meghan Stacey is associate professor and ARC DECRA Fellow in the UNSW School of Education, where she researches the critical policy sociology of teachers’ work. Sue Creagh has most recently worked as a senior research fellow at QUT. Sue’s research interests are in education policy, national testing, and English as an Additional Language/TESOL.  Nicole Mockler is professor of education at the University of Sydney. Her research interests are in education policy and politics, professional learning and curriculum and pedagogy. Anna Hogan is associate professor and ARC DECRA research fellow in the School of Teacher Education & Leadership at QUT. Anna’s research interests are in education policy and practice, and in particular the privatisation and commercialisation of schooling. Greg Thompson is professor in the School of Teacher Education & Leadership at QUT. His research focuses on the philosophy of education and educational theory.

Distorted: this feeble report misses the boat on classroom behaviour

At an event at Parliament House earlier this year I heard that 2024 is going to be the year of education. That is excellent news given that we haven’t heard much about education from the Albanese government, but, to be honest, that has been something of a blessed reprieve given the hyperventilation of the previous Morrison LNP government.

I have mixed feelings about what might be coming, but I wouldn’t if education policy were informed by evidence rather than politics. It isn’t. The impact of that politicisation is never openly acknowledged, and the policy decisions that are made (or not made) by governments are never the focus of inquiries or reviews. Instead, the “problem” is always framed by alleged deficiencies in students, parents, teachers, and/or universities.

Disagreement among panel members

Take, for example, the Senate Inquiry into the issue of increasing disruption in Australian classrooms. The interim report has just landed, and, like the final report of the Disability Royal Commission, there was disagreement among panel members. Labor and Greens senators have made additional comments that acknowledge the complexity of behaviour in schools and the Greens have only one recommendation: to fully fund public schools at the beginning of the next National School Reform Agreement in 2025. 

I was called to give evidence at the Senate Inquiry. At the time, I expressed concern that the Inquiry based its case for ‘increasing disruption’ on PISA data, noting, first, that there are cultural and other differences between countries and, second, that there are problems with the rankings. I will have more to say about the report and its recommendations in time, but for now I want to take readers through points I made in the new first chapter of Inclusive Education for the 21st Century, which extend my comments from the evidence I gave to the Inquiry.

Since that hearing, I have looked more closely at the data on which these claims are based and I’m frankly astonished that the Inquiry team did not do this themselves. Even a cursory glance should have been enough to signal to the Senate that these rankings were not a rigorous enough premise on which to base an Inquiry. 

Let us wade through this numerical sewage together

The claim of ‘increasing disruption in Australian classrooms’ is based on the difference in results from two surveys of 15-year-olds who participated in the OECD’s Programme for International Student Assessment (PISA). 

The first survey occurred in 2009 and the second in 2018. The disciplinary climate data is based on five survey items:  

1. Students don’t listen to what the teacher says.

2. There is noise and disorder.

3. The teacher has to wait a long time for students to quiet down.

4. Students cannot work well.

5. Students don’t start working for a long time after the lesson begins.

Here’s where things get interesting! Here are relevant findings from the two reports.

PISA 2009

• Participating countries were ranked on the percentage of 15-year-old students who selected ‘never or hardly ever’ and ‘in some lessons’ for Item 1 ‘Students don’t listen to what the teacher says’, and Item 3 ‘The teacher has to wait a long time for students to quiet down’.
• Australia was ranked 28th for the first item and 25th for the second.
• Differences between PISA 2000 and PISA 2009 were calculated.
• Australia was deemed to have an average disciplinary climate that had not significantly changed between the two timepoints.

PISA 2018

• 79 countries participated and 76 were ranked; this time, the OECD developed a disciplinary climate index that encompasses all five items, with some minor changes in wording.
• Countries were ranked using their respective Index scores.
• Australia was ranked 69th.
• Differences between PISA 2009 and PISA 2018 were calculated.
• There was a significant difference between timepoints in the responses of Australian students for only two of the five items: Item 3 ‘The teacher has to wait a long time for students to quiet down’, and Item 4 ‘Students cannot work well’.
• Item 5 also declined (-1.8%) but not significantly, while Items 1 and 2 improved (both +0.8%), but again not significantly.

What does all this mean?

First, Australia has not fallen from 28th or 25th in the ranking to 69th. Rather, the number of participating countries has changed over time, and so too have the rankings. To be clear, the number of participating countries has grown from 43 (2000) to 65 (2009) to 79 (2018). And, because comparisons can only be made between countries that participated in each assessment, the number of countries in the rankings has changed from 38 in 2009 to 76 in 2018. This is not to dispute that Australia is ranked lower than anyone would like, but there are problems with the rankings that render them meaningless. 

Here’s why

1) The types of countries participating in PISA 2009 and PISA 2018 changed substantively due to the entrance of Asian countries. Unlike Australia, these jurisdictions/systems are grounded in Confucian culture, which has a profound effect on teacher-student relationships, classroom interactions, and climate. 

2) There was a significant difference between timepoints in the responses of Australian students for only two of the five items. The case for increasing disruption in Australian classrooms therefore rests on a 3.7% decrease in the number of students saying their teacher ‘never or hardly ever’ has to wait a long time for students to quiet down, and a 2.8% decrease in the number saying students cannot work well ‘never or hardly ever’. Given that there was no difference in students’ responses between PISA 2000 and 2009, this suggests that there has been no change in more than 20 years for at least two of the five items.

3) Countries with almost identical disciplinary climate index scores are ranked above and below each other. For example, Australia and Belgium received Index scores of 0.20 and 0.21 respectively, yet Australia is ranked 69th and Belgium 70th. There is a snowball’s chance in hell that these scores are statistically different to each other, so why is one being ranked above the other? Doing this simply expands the number of places in the ranking, which makes the distance between countries look larger than it really is.

4) No tests of significance between countries or ranks were conducted, so we do not know whether there is a statistically significant difference between Australian students’ responses and the OECD average, or how much of a difference there is between Australia and the countries at the top of the ranking. Similar points have been made numerous times over the years in relation to the rankings for student achievement in reading, mathematics, and science, but at least in those cases, countries with statistically indistinguishable performances are grouped together and given the same rank (the short sketch after this list illustrates the difference that grouping makes). 

5) Recent research by Sally Larsen from the University of New England has indicated no decline in the TIMSS, PIRLS or NAPLAN results of Australian students. Any observed correlations between declines in PISA’s disciplinary climate survey and student academic outcomes should not be causally interpreted.
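To make the grouping point in (4) concrete, here is a minimal sketch in Python. The country names, index scores and margin of indistinguishability are entirely hypothetical, and the OECD’s actual method relies on standard errors rather than a fixed margin; the sketch simply contrasts a naive one-place-per-country ranking with one where statistically indistinguishable scores share a rank, as is done for the achievement rankings.

```python
# Hypothetical disciplinary climate index scores (NOT OECD data).
scores = {
    "Country A": 0.21,
    "Country B": 0.20,   # a 0.01 gap, like Australia vs Belgium
    "Country C": 0.05,
    "Country D": 0.04,
    "Country E": -0.30,
}
MARGIN = 0.05  # assumed margin within which scores are indistinguishable

ordered = sorted(scores.items(), key=lambda kv: -kv[1])  # highest score first

# Naive ranking: every country gets its own place, however small the gap.
naive = {country: place + 1 for place, (country, _) in enumerate(ordered)}

# Grouped ranking: countries within MARGIN of the current group's leading
# score share that group's rank.
grouped, rank, leader = {}, 0, None
for country, score in ordered:
    if leader is None or leader - score > MARGIN:
        rank += 1          # start a new rank group
        leader = score     # this country leads the new group
    grouped[country] = rank

print(naive)    # five distinct places: A=1, B=2, C=3, D=4, E=5
print(grouped)  # three places: A and B share 1, C and D share 2, E is 3
```

On these made-up numbers, the naive approach is what separates Country A (1st) from Country B (2nd) despite a trivial 0.01 gap, just as Australia (69th) and Belgium (70th) are separated; grouping collapses them into the same place and stops the ranking from exaggerating the distance between countries.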

My view

If politicians are going to look at rankings, then look at them all. Let’s consider, for example, that: 

1. Australia is sitting at the top of ranked countries in terms of the hours that teachers spend in face-to-face teaching. 

2. Australian teachers spend more hours teaching than the OECD average (838.28 hours per year vs 800.45 hours).

3. Korea is ranked first in classroom disciplinary climate and Australia is ranked 69th. However, Australian teachers spend 321.30 more hours per year in face-to-face teaching than their Korean counterparts, who teach just 516.98 hours per year.

4. In disciplinary climate, the difference between advantaged students and disadvantaged students in Australia (0.34) is double that of Korea (0.17). 

These are just some of the gaps and anomalies that arise when the PISA data is subjected to close reading, which is the absolute minimum amount of analysis that should have been conducted (if not prior to, then at least during) an Inquiry that used these data for its rationale.

The questions education ministers must ask

Readers of the Interim Report, especially Education Ministers, should regard it very critically and start asking serious questions:

  • Who stands to benefit from such simple representations of these data?
  • Might there be financial benefits for non-university providers from the ‘deregulation’ of initial teacher education?
  • Are there other data that have been ignored and, if so, what does their omission suggest about rigour and bias?
  • Might Australian students tell a different story if asked by expert researchers using both open- and closed-ended questions? 

Are we brave enough to ask them?

Linda Graham is professor and director of The Centre for Inclusive Education at Queensland University of Technology (QUT). She has led multiple externally funded research projects and has published more than 100 books, chapters and articles. Her international bestseller, Inclusive Education for the 21st Century: Theory, Policy and Practice, is now in its second edition. In 2020, Linda chaired the Inquiry into Suspension, Exclusion and Expulsion processes in South Australian government schools. She also gave evidence to the Royal Commission into Violence, Abuse, Neglect and Exploitation of People with Disability on the use of exclusionary school discipline and its effects.