“This is the EEF toolkit 3.0.”
The words of Professor Steve Higgins, lead author of the Sutton Trust/EEF Teaching and Learning Toolkit, underline why the next time teachers visit the Education Endowment Foundation’s (EEF) toolkit pages something entirely new awaits them.
“They definitely won’t miss the changes,” adds EEF head of policy Jon Kay.
And this is not just because the toolkit has a new colour scheme.
As of today the entire scope of the toolkit has been expanded to offer a far wider array of insights on the different approaches schools can use to boost outcomes – from assessment and homework, to the use of technology, peer feedback, small study groups, behaviour and much more.
“We've basically taken every single underlying study in the toolkit and reviewed whether it is of high quality and whether it deserves to continue to be included,” says Kay.
“And then we've extracted a lot more information from each study – where it takes place, what the impact is, the methodology, the context and lots more details on the pedagogical approach.”
The updated EEF Teaching and Learning Toolkit: a huge task with a clear goal
As this suggests, the work involved has been exhaustive, with Kay and Higgins explaining that some 2,500 research studies were analysed and broken down to provide a very granular level of insight – by phase, setting and even subjects where possible.
“It’s taken three and a half years and in that time we have unpacked all the previous analysis and research and created a single database so we can compare effects by age, subject, school,” says Kay.
“It means schools can now dig in and look in more detail about the impact and the differences [between interventions].”
This is a notable development and should give teachers the opportunity to better assess how they can drive improvements in specific areas and see meaningful outcomes.
“There'll be a section which describes variation by phase, variation by subject – if there is a meaningful result that communicates that,” says Kay.
He adds that doing this work was the “responsible” thing to do as it means schools can be much more informed about how an intervention could help in their specific subject – and in doing so avoid investing time and effort into something that may not actually suit their subject that well.
“For example, with mastery learning there are higher impacts in maths than in English as English mastery is emerging as a practice,” says Kay, explaining that with these sorts of insights schools should be able to more accurately assess the right interventions for their needs.
The importance of implementation
Complementing this is an increased focus on advice for implementing the ideas being reviewed.
“Implementation is a key message the EEF have spoken about for a while now so [we will] be pulling out more information [from the reviews] about what are the key things that you should consider when implementing a high-quality peer tutoring approach or small group tuition approach, for example,” adds Kay.
Furthermore, as part of the more fine-grained approach the toolkit now offers, it has also been updated to provide a specific focus on how each intervention can help to tackle attainment gaps.
“The toolkit analyses evidence for all pupils, but the key focus of the EEF is closing the attainment gap, and primarily the toolkit is used for considering pupil premium funding,” says Kay.
“So there will be a specific focus that talks about where things might have a differential impact to pupils from a disadvantaged background.”
An example of how all this slots together to provide a more detailed view for teachers can be seen in the following text from the "small group tuition" page:
- Impact tends to be greater in primary schools (+4 months) than secondary schools, which have fewer studies overall and a lower impact (+2 months).
- Most of the research on small group tuition has been conducted on reading, where there is a greater impact, on average (+4 months). The studies in mathematics show a slightly smaller positive impact (+3 months).
As you can see, there are details on phase, subject and cohort size to give teachers more insights on how this intervention could help with their aims, while the new "closing the disadvantage gap" tab means teachers can quickly assess its impact on this area, too.
Embedding a digital focus
As well as these developments, another notable new area in the toolkit is around digital technology. In the old toolkit, digital technology was its own strand. This, though, meant it was too broad, says Kay.
“It was too divergent. What does digital technology mean – is it using an intelligent tutoring system or giving children an iPad?”
However, while the strand has been removed, advice on how to use digital technology has not disappeared. Quite the opposite: it has now been given greater prominence than ever, with digital technology a separate sub-section on almost all the strands outlining how it can be incorporated.
“What we've done is analysed the impact of digital technology in the constituent strands. So rather than having a strand that's focused on digital technology, we say, 'What's the impact of homework when delivered using digital technology?'” says Kay.
“So it might say that using digital technology for flipped learning has a more positive impact than average, but for feedback it has a lower overall impact.”
Higgins adds: “We want teachers to think about technology within teaching and learning approaches and so it makes much more sense to pull out studies within each area of the toolkit.”
Given the huge strides that almost all schools have made in using digital technologies during the pandemic, this could well prove to be highly useful if certain interventions are shown to be more effective through technology.
Room for teacher expertise
Of course, despite all these updates and the greater level of detail being provided, the information is not foolproof, and teachers will still have to apply their professional expertise to consider how different ideas may suit their setting.
“One of the challenges is that evidence changes and evolves as new studies come and the impacts go up or down,” says Higgins.
“The evidence can never tell you exactly what works in every setting. [Teachers] have to use their professional expertise when using the toolkit.”
Teachers won’t be left on their own, though. The update also makes the toolkit a more dynamic platform, able to incorporate new studies as they are published and further refine the information presented, ensuring it offers the most up-to-date insights.
“In practice, once you've got 100 or so studies in the database, adding one or two more is very unlikely to make much difference,” says Higgins.
“But over time, the precision will increase and the detail of what we can say will also improve.”
A fortuitous bit of timing
This should all be good news for schools seeking to address the damage done by the pandemic and catch up on the lost learning of the past 18 months, by helping them understand which interventions can best support that work.
What’s more, the arrival of this new souped-up toolkit should also help schools to meet a new government requirement that, from autumn, they must demonstrate how pupil premium funding decisions are “informed by research evidence”.
CEO of the EEF Professor Becky Francis says it was “serendipitous” the update arrived at the same time as this new requirement – but certainly no bad thing: “It’s very fortuitous the update comes as the government focuses even more tightly on evidence-led spending of pupil premium funding and we are able to offer this additional rigour and scale for the toolkit,” she says.
“It will help schools think about their specific needs and drill down into the evidence and zero in on the ‘best bets’ from the research for their priorities.”
And for her, this is what the toolkit and the latest update are truly about – helping schools to do the best for their pupils.
“It’s been such a challenging time for schools, so now, more than ever, they will be making difficult decisions on how to support pupils’ recovery – using research evidence can really help with that.”