As researchers, we care that our educational systems improve, support all learners, and are grounded solidly in research evidence. But how do we work with stakeholders like educational technology startups to support effective use of that evidence? Researchers and practitioners worry about this, because we care about evaluating and scaling good ideas. By ‘scaling’ we mean adjusting and improving good ideas as they are rolled out and used.
Some common ways of thinking about how we build evidence and scale innovations include:
- taking approaches tested in controlled settings and implementing them;
- looking for ‘success stories’ and trying to copy lessons from them; and
- taking a systematic approach: analysing a context for places to change and evaluating those changes (the improvement method).
Research on how we use evidence in policy and practice can help inform us when we work with startups and other stakeholders on education projects. Paul Cairney, Professor of Politics and Public Policy at the University of Stirling in the UK, compares the three approaches in the table below.
Three approaches to evidence-based policy-making
Emulation Approach
In much work in education, we look to implement programmes or technologies in new contexts using an emulation approach: copying tested interventions. In our teaching, this can also mean coming at research from a top-down perspective, using key studies and methods but with a disconnect from local needs and context.
But this approach is critiqued as simplistic in education because it implies that interventions occur in a vacuum, rather than in a complex context where many interventions are already under way. We may be evaluating a programme that has already been implemented, and our implementation processes often don’t follow this linear model.
Storytelling Approach
The pushback against the emulation approach is sometimes to focus, instead, very heavily on local context and storytelling. This approach respects the expertise of professionals – which is important – but can result in key lessons not being distilled and shared, idiosyncratic ‘hit or miss’ practices, and ad hoc improvement cycles that may be driven by particular interests.
In the EdTech space, much of the evaluation conducted by providers is based on testimonials. Although these can be useful, they typically don’t get at deeper issues of learning or help us evaluate our work.
Improvement Methods
Improvement methods have been adopted in education systems – explicitly by the Carnegie Foundation, an independent research and policy centre in the US, and arguably in other forms such as Research-Practice Partnerships (collaborative, long-term relationships between researchers and practitioners, designed to address problems of practice in education) and other design-based research approaches. Because these approaches work closely with practitioners to connect theory and real-world problems, they attempt to avoid ‘transmissive’ (one-way) communication of research.
Our UCL EDUCATE project
At UCL (University College London) – which Simon recently visited while on sabbatical – the EDUCATE project was created to help build a stronger evidence base in the EdTech sector, using this kind of improvement approach. The approach is visualised as a ‘golden triangle’ connecting EdTech companies, entrepreneurs and start-ups with first-class business trainers, experts and mentors.
The Golden Triangle of evidence-informed educational technology
The UCL EDUCATE project worked with 252 small to medium-sized enterprises (up to 250 employees, under £5m annual turnover) across 12 cohorts between 2017 and 2019. The idea was to get EdTech creators, educators, investors and policy makers working together to understand what “works for learners and how to use technology to serve its users effectively.” As the programme developed, it shifted from more general introductions to research methods and established research knowledge, towards greater recognition that evidence is varied in nature and serves different purposes for enterprises at different stages of development.
The EDUCATE programme avoided transmissive, emulation-based approaches to research by building capacity in educational technology enterprises to conduct their own research, using theories of change to generate practical, robust research. The aim, then, isn’t just to translate research into practice, or to implement outcomes from randomised controlled trials (RCTs), but to move from storytelling about products to an improvement mindset.
Implementing Learning Analytics at UTS
In the work we’ve been conducting at the University of Technology Sydney, we’ve taken a kind of improvement-based approach: looking at existing teaching practices and seeking to augment them, rather than simply dropping in a new technology without understanding the context, or requiring a particular type of teaching for the technology to be used. Our focus is improvement-oriented innovation. This approach is intended to improve adoption, and to support existing good practices by learning from them and amplifying them through the technology.
We believe it is important, when we think about the role of new technologies and approaches in education, to consider how we use evidence. Understanding the different approaches – emulation, storytelling, or improvement – and how they work to achieve impact can be invaluable to all stakeholders.
Simon Knight is a Senior Lecturer in the Faculty of Transdisciplinary Innovation at the University of Technology Sydney. His research investigates how people find and evaluate evidence, particularly in the context of learning and educator practices. Dr Knight received his Bachelor’s degree in Philosophy and Psychology from the University of Leeds before completing a teacher education programme and a Philosophy of Education MA at the UCL Institute of Education. After teaching high school social sciences, Dr Knight completed an MPhil in Educational Research Methods at Cambridge and a PhD in Learning Analytics at the UK Open University. Simon is on Twitter @sjgknight
Anissa Moeini is a doctoral candidate at the UCL Knowledge Lab, Institute of Education, University College London, UK. As a seasoned tech entrepreneur, Anissa identified the need to build research capacity in EdTech enterprises that is agile to their pace of change and adaptable to the rhythm of SMEs. Through her doctoral research she developed the Evidence-informed Learning Technology Enterprise Framework (ELTE) as a practical tool for EdTech companies and other non-academic stakeholders (investors, policymakers and education practitioners) both to evaluate the efficacy of EdTech enterprises (i.e. their products and services) and to build capacity to be evidence-informed. Anissa completed her MA at Teachers College, Columbia University in New York, USA, and her iBBA at the Schulich School of Business in Toronto, Canada. She will be defending her doctoral dissertation in 2020. Anissa is on Twitter @AnissaMoeini
Alison Clark-Wilson is a Principal Research Fellow at the UCL Knowledge Lab, UCL Institute of Education, London. Her research spans the EdTech sector, with a particular emphasis on the design, implementation and evaluation of technology in real school settings. Dr Clark-Wilson received a Bachelor’s degree in Chemical Engineering prior to becoming a secondary school mathematics teacher in the early 1990s. Her 30-year career has spanned school, university and industry-based education contexts. Dr Clark-Wilson completed an MA at the University of Chichester and a PhD at the UCL Institute of Education, both in mathematics education. Alison is on Twitter @Aliclarkwilson