Imagine you're the head of a high school Maths department. Every year, you face a challenge: a group of students will enter from primary school struggling with basic maths skills.
You know that these students won't just struggle in Maths but will find it difficult to access the curriculum in other subject areas, too. If you can help them close the achievement gap with their peers, they'll be much more likely to have success throughout high school and life.
Of course, there are a number of programs or approaches you could choose to meet this challenge. You might know about them from colleagues in your professional network, from the system or association you're part of, or from your online community. How do you know which one to choose?
The value of evidence
School leaders face challenges like this regularly. To choose wisely in the face of such challenges, they should look to high-quality evidence to support their professional judgement. Robust evidence can give them confidence that the approaches they select have the best chance of working for their students. Without rigorous evidence, they run the risk of wasting time and resources on approaches that are ineffective or, at worst, actually do harm.
For example, an independent evaluation of a pull-out reading program in the UK found that it raised students' reading levels over the course of a term, but that similar students who simply stayed in class raised their reading levels even more. Schools using this program would have achieved better outcomes by keeping students in class rather than pulling them out, and it took a well-designed study to pick this up.
Closer to home, a recent Australian study published in The Lancet showed that a school-based intervention designed to reduce teenage pregnancy rates actually increased them. So, when educators think about evidence-informed or evidence-based practice, they need to think carefully about the rigour of the evidence they consider.
One resource that can help them with this is the Teaching and Learning Toolkit – a free online summary of rigorous global research into educational approaches ranging from arts participation and feedback to reducing class size and repeating a year. It was commissioned by the UK's Education Endowment Foundation, and is hosted by Evidence for Learning (E4L) in Australia.
We hope the Toolkit offers a good introduction for educators who want to inform their decisions with strong evidence. For each approach, it provides three key pieces of information: how many extra months' progress students make in a year; how much the approach costs; and how secure the evidence is. Dedicated pages for each approach give further detail, including key things to think about if a school is considering implementing it.
Unfortunately, there is not always good evidence about particular programs or approaches schools might be considering. The developers of a program may have anecdotal data (or great testimonial quotes in their brochure) suggesting that the program works, and the school down the road may say that it worked for their students. Even the best evaluated programs may only have before-and-after data or correlational evidence. Often, this is the best evidence a school leader can find to suggest a program might be effective for their students.
Addressing the gap
At E4L, we are trying to address that gap through the Learning Impact Fund. The fund aims to identify promising Australian educational programs, fund them to deliver at a new level of scale, and commission rigorous, independent evaluations, typically in the form of a randomised controlled trial (RCT). The power of an RCT is that it allows researchers to say whether a program caused a particular outcome, because random assignment controls for the other factors that could explain the result (see the short sketch after this list). The two grantees in the Learning Impact Fund's launch round were announced in May this year:
- QuickSmart Numeracy, a one-to-one tuition program to increase fluency and automaticity in basic mathematics for students performing in the bottom third of their cohort; and
- Thinking Maths, a professional learning program that supports Year 7 and Year 8 teachers in the deep learning of mathematical content as outlined in the Australian Curriculum: Mathematics.
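For readers who want to see that logic made concrete, below is a minimal simulation sketch in Python. It is purely illustrative: the group sizes, scores and 'months of progress' figures are invented and have no connection to QuickSmart Numeracy, Thinking Maths or any real trial. It shows how a program can post healthy before-and-after gains even when it adds nothing beyond the progress students would have made anyway, and why a randomly assigned comparison group exposes this.

```python
# Purely illustrative simulation (all numbers invented): why a randomised
# comparison group gives a fairer estimate of a program's effect than
# before-and-after gains alone.
import random
from statistics import mean

random.seed(1)

# Baseline reading scores for 1,000 simulated students.
students = [random.gauss(100, 10) for _ in range(1000)]

# Random assignment: half to the (hypothetical) program, half to business as usual.
random.shuffle(students)
program_group, control_group = students[:500], students[500:]

def one_term_later(score):
    """Every student gains roughly 5 'months of progress' in a term, program or not."""
    return score + random.gauss(5, 2)

# Suppose the program adds nothing beyond normal growth (true effect = 0).
program_after = [one_term_later(s) for s in program_group]
control_after = [one_term_later(s) for s in control_group]

# Before-and-after view: the program group improved, so the program 'looks' effective.
print("Program group gain:", round(mean(program_after) - mean(program_group), 1))

# Randomised comparison: the control group improved just as much, so the
# estimated effect of the program itself is roughly zero.
print("Estimated program effect:", round(mean(program_after) - mean(control_after), 1))
```

The same logic applies to the pull-out reading program mentioned earlier: gains within the program group tell us little until they are compared with the gains of similar students who did not take part.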
We have selected these programs because they are already based on good evidence. The QuickSmart approach derives from cognitive psychology research that underscores how automatic recall of number facts reduces cognitive load when solving more complex problems. When students can easily remember basic number facts, they can devote their mental bandwidth to more complex work.
Thinking Maths draws on a New Zealand research synthesis by Helen Timperley and colleagues highlighting features of professional learning in mathematics that are effective in improving student outcomes. For example, Thinking Maths is coherent with policy – it is aligned to the South Australian Teaching for Effective Learning framework – and it deepens teachers' understanding of mathematics and how students learn mathematics.
Because of this strong evidence base, we are pleased to fund the expansion of these programs. We also want to ensure that these programs make good on the promise of their evidence base, so we have commissioned independent evaluations of the programs using RCTs. The results of these trials will be published on the Evidence for Learning website.
Schools' involvement in building new evidence
Of course, the launch round only covers two programs. To extend this approach to more programs across Australia, we have recently opened two rounds in the Learning Impact Fund: (1) a Resilience Round, co-funded with VicHealth, and (2) a General Round.
The Resilience Round offers grants for program delivery and evaluation: VicHealth has contributed $100,000 for program delivery and Evidence for Learning $50,000 for evaluation. The aim is to identify promising school-based initiatives that build social and emotional learning and resilience skills in Victorian students and also improve academic achievement.
The General Round is open to any education program designed to improve the academic achievement of children in Australia. We welcome applications from all education program developers, including non-profit and for-profit organisations, Catholic education offices, charities, universities, government bodies and social enterprises. Schools with programs that may be suitable for scaling to other schools are also welcome to enquire.
For more information, visit the Evidence for Learning website or contact John Bush: jbush@evidenceforlearning.org.au
John Bush writes: 'Robust evidence can give [educators] confidence that the approaches they select have the best chance of working for their students'. Where do you go to source 'robust evidence'?
As an educator, how often do you take the time to think about evidence-informed or evidence-based practice, as well as the rigour of the evidence you're presented with? How does this inform your decisions?