Toolkit puts ‘best bets’ at teachers’ fingertips

The updated Sutton Trust/EEF Teaching and Learning Toolkit offers a far wider array of insights into the different approaches schools can use to boost outcomes
17th September 2021, 12:05am

“This is the EEF toolkit 3.0.” The words of Steve Higgins, lead author of the Sutton Trust/EEF Teaching and Learning Toolkit, underline just how different your next visit to the Education Endowment Foundation’s (EEF) pages will be.

“They definitely won’t miss the changes,” adds EEF head of policy John Kay. And this is not just because the toolkit has a new colour scheme.

As of today, the entire scope of the toolkit has been expanded to offer a far wider array of insights into the different approaches schools can use to boost outcomes - from assessment and homework to the use of technology, peer feedback, small study groups, behaviour and much more.

“We’ve basically taken every single underlying study in the toolkit and reviewed whether it is of high quality and whether it deserves to continue to be included,” says Kay.

“And then we’ve extracted a lot more information from each study - where it takes place, what the impact is, the methodology, the context and lots more details on the pedagogical approach.”

As this suggests, the work involved has been exhaustive, with Kay and Higgins explaining that some 2,500 research studies were analysed and broken down to provide a granular level of insight by phase, setting and subject where possible.

“It’s taken three and a half years and, in that time, we have unpacked all the previous analysis and research, and created a single database so we can compare effects by age, subject and school,” says Kay, adding that “it means schools can now dig in and look in more detail about the impact and the differences” between interventions.

Meaningful outcomes

This is a notable development and should give teachers the opportunity to better assess how they can drive improvements in specific areas and see meaningful outcomes.

“There’ll be a section which describes variation by phase, variation by subject - if there is a meaningful result that communicates that,” says Kay.

He adds that doing this work was the “responsible” thing to do as it means schools can be much more informed about how an intervention could help in their specific subject - and, in doing so, avoid investing time and effort into something that may not actually suit their subject that well.

“For example, with mastery learning, there are higher impacts in maths than in English, as English mastery is emerging as a practice,” says Kay, explaining that, with these sorts of insights, schools should be able to more accurately assess the right interventions for their needs.

Complementing this is an increased focus on advice for implementing the ideas being reviewed as well.

“Implementation is a key message the EEF has spoken about for a while now so [we will] be pulling out more information [from the reviews] about what the key things are that you should do when implementing a high-quality peer tutoring approach or small-group tuition approach, for example.”

Furthermore, as part of the more fine-grained approach, the toolkit has also been updated to provide a specific focus on how each intervention can help tackle attainment gaps.

“The toolkit analyses evidence for all pupils but a key focus of the EEF is closing the attainment gap and, primarily, the toolkit is used for considering pupil premium funding,” says Kay.

“So there will be a specific focus that talks about where things might have a differential impact to pupils from a disadvantaged background.”

An example of how all this slots together to provide a more detailed view for teachers can be seen in the following text from the “small-group tuition” page:

  • Impact tends to be greater in primary schools (+4 months) than in secondary schools, where there are fewer studies overall and a lower impact (+2 months).
  • Most of the research on small-group tuition has been conducted in reading, where there is a greater impact on average (+4 months). The studies in mathematics show a slightly smaller positive impact (+3 months).

 

As well as these developments, another notable new area in the toolkit is around digital technology. In the old toolkit, digital technology was its own strand. This, though, meant it was too broad, says Kay.

“It was too divergent. What does digital technology mean - is it using an intelligent tutoring system or giving children an iPad?”

Greater digital prominence

However, while the strand has been removed, advice on how to use digital technology has not disappeared. Quite the opposite, in fact: it has now been given greater prominence than ever, with a separate sub-section in almost all of the strands outlining how it can be incorporated.

“What we’ve done is analysed the impact of digital technology in the constituent strands. So, rather than having a strand that’s focused on digital technology, we ask, ‘what’s the impact of homework when delivered using digital technology?’” says Kay.

“So it might say that using digital technology for flipped learning has a more positive impact than average but, for feedback, it has a lower overall impact.”

Higgins adds: “We want teachers to think about technology within teaching and learning approaches, and so it makes much more sense to pull out studies within each area of the toolkit.”

Given the huge strides almost all schools have made in using digital technologies during the pandemic, this could well prove highly useful if certain interventions are shown to be more effective through technology.

Of course, despite all these updates and the greater level of detail being provided, it does not mean the information is foolproof, and teachers will still have to apply their professional expertise to consider how different ideas may suit their setting.

“One of the challenges is that evidence changes and evolves as new studies come, and the impacts go up or down,” says Higgins.

“The evidence can never tell you exactly what works in every setting. [Teachers] have to use their professional expertise when using the toolkit.”

Teachers won’t be left on their own, though, as the update to the toolkit also sees it become a more dynamic platform, with the ability to incorporate new studies as they are published and further refine the information presented to ensure it is offering the most up-to-date insights.

“In practice, once you’ve got 100 or so studies in the database, adding one or two more is very unlikely to make much difference,” says Higgins.

“But, over time, the precision will increase and the detail of what we can say will also improve.”

This should all be good news for schools as they seek to address the damage done by the pandemic, helping them understand which interventions can best support catching up on the lost learning of the past 18 months.

What’s more, the arrival of this new souped-up toolkit should help schools better adhere to a new government directive that, from autumn, schools must demonstrate how pupil premium funding decisions are “informed by research evidence”.

Becky Francis, chief executive of the EEF, says it is “serendipitous” that the update has arrived at the same time as this new requirement, but certainly no bad thing: “It’s very fortuitous that the update comes as the government focuses even more tightly on evidence-led spending of pupil premium funding, and we are able to offer this additional rigour and scale for the toolkit,” she says.

“It will help schools think about their specific needs, drill down into the evidence and zero in on the ‘best bets’ from the research for their priorities.”

And for Francis, this is what the toolkit, and the latest update, is truly about - helping schools to do the best for their pupils. “It’s been such a challenging time for schools so now, more than ever, they will be making difficult decisions on how to support pupils’ recovery - using research evidence can really help with that.”
