Tes talks to… Steve Higgins

6th July 2018
The academic who co-created the EEF teaching and learning toolkit talks to Chris Parr about the misapprehensions surrounding his invention and how educators can make the most of the information it contains for the benefit of students

“I guess I’m proud of what we have produced,” admits Steve Higgins, a fellow of the Wolfson Research Institute for Health and Wellbeing at Durham University and professor in the university’s School of Education. Higgins is referring to his role in the creation of the Education Endowment Foundation (EEF) teaching and learning toolkit, and his reticence to admit pride may confuse some.

The kit started life seven years ago as a mere 20-page Sutton Trust report, described as an “easily accessible guide for teachers detailing the approaches they should consider when allocating the government’s pupil premium”. However, this was no ordinary report. It purported to have distilled the learning from thousands of educational studies, involving millions of pupils across the world, into 20 different approaches to improving learning – estimating the extra progress over the course of a school year that the average student might expect to make if each strategy were adopted.

Fast-forward seven years and the toolkit has taken on a life of its own. It has been published in an interactive format online, translated into Spanish and Portuguese for the Latin American market, and special Scottish, Australian and early-years versions have been produced, too.

Yet despite all this success, Higgins – one of the authors of the original report and a consistent figure in the toolkit’s development – clearly has some reservations. “You can never quite predict when you plan and create something like this how it will be used,” he says. “You have your own intentions about what you hope it will achieve and then, of course, it goes wild and it is difficult to ensure that it is used in the way you intended.”

“Go wild” the toolkit has. It now summarises the findings of some 13,000 studies and, according to the Sutton Trust, is used by more than half of secondary school leaders. A report by the National Audit Office found that two-thirds of headteachers use it for guidance.

But such accessibility can come at a cost, and it has been argued that the clarity with which the toolkit displays research findings encourages the making of snap decisions. Spending just 20 seconds scrolling through the entries on the kit’s website, for example, is enough to learn that offering one-to-one tuition is expensive and only moderately impactful, having school uniform makes very little difference to attainment, and having pupils repeat a year is both very expensive and damaging – setting pupils back about four months.

Higgins agrees that the toolkit’s “somewhat simplistic” design, inspired by consumer information services such as Which?, has created problems.

“[Clearly displaying] a quality rating, cost rating and impact rating was definitely successful at getting people’s attention and getting them to notice that surface level information. But then, how do you get people to look deeper?” he asks.

Higgins recalls two examples of where what he feels was misinterpretation of the toolkit’s advice led to controversial decision-making by school leaders.

“The first was around the deployment of teaching assistants. In the initial version of the toolkit, the average impact of deployment in terms of general classroom support was, overall, at zero. I still think that was accurate in terms of what the evidence said in the UK and internationally, but what some people took from that was that schools shouldn’t employ teaching assistants – and that wasn’t what we were arguing at all.”

Rather, the evidence at the time suggested that if schools believed teaching assistants would directly benefit pupil attainment when used in the way that they were typically deployed, then they may be disappointed.

“Of course, teaching assistants are deployed for all types of reasons, particularly around special educational needs, and are certainly valuable at getting what I call the physical and social inclusion of children in classrooms – they are absolutely necessary for that,” Higgins says. “They also provide support for teachers and reduce stress in schools. Our key point was that if you are also expecting them to have an impact on outcomes for children, then you may need to look more closely at that deployment.”

Budget savings

Higgins knows now, however, that this nuance was missed by some – particularly school leaders looking for ways to make budget savings and a government looking to trim public sector spending.

“I know that some schools, as a result, made some decisions about teaching assistant deployment, as did the Treasury. I think we spend about £4 billion a year on teaching assistants in England, so you can see why they came under very close scrutiny. There was a lot of negative feedback from schools.”

Although Higgins heard about schools using the kit to justify getting rid of teaching assistants, he says that nobody ever came to him directly to discuss such actions. “Anecdotally, I heard that it was leading to schools reconsidering their decisions about teaching assistant employment, and obviously I would have challenged that,” he says, adding that he hopes additional guidance on the use of teaching assistants, published by the EEF, will help people “appreciate that the story is a bit more complex, that we were not arguing for [reducing the number of teaching assistants]”.

The second example was in Scotland, where there was reportedly discussion in some local councils about increasing class sizes because of evidence presented by the toolkit. “What we argue in the kit is that reducing class sizes has a low positive benefit but it is very expensive,” Higgins says. “To get a class size down to about 15, which it looks like is where you start to see really valuable improvement, would be hugely expensive. The research was included to get people to think about other approaches that might be more effective in terms of improving outcomes.”

Higgins says he found it “quite shocking” when he heard that some of the schools in Scotland were using the toolkit as a justification for increasing class sizes.

“It just shows that they haven’t actually read what the toolkit entry says. There is a fairly flat but linear relationship – overall, the smaller the class, the better it tends to do, with the benefits really increasing once you get below 20 pupils. There is no justification at all in the evidence for increasing class sizes.

“If they do creep up slowly, you are likely to see, over time, an overall disadvantage for the larger classes.”

He describes the interpretation as “a clear misreading of the toolkit – using evidence about the lack of cost benefit for reducing class sizes to argue the opposite and increase them wasn’t at all what we intended, and it clearly wasn’t what we were saying”. In response, Higgins says he produced some further summaries of the evidence with a bit more detail to send to people involved.

Regular criticisms

In addition to the misapplications detailed by Higgins, there have been regular criticisms of the toolkit since its inception. Last year, Terry Wrigley, a visiting professor at Northumbria University, and Gert Biesta, a professor of education and director of research in Brunel University London’s department of education, said that the toolkit was “bundling together” very different studies. They described it as “extremely misleading and utterly unhelpful”.

Higgins seems unmoved by the critics. “I acknowledge its limitations…but all we are trying to build is further understanding of how to bring about effective and positive change in schools. Otherwise, what do you do? Pick things randomly?

“The EEF is commissioning studies and then adding those studies to the toolkit, along with all of the other recent studies in education. So, it is not like it is a static evidence document – we are trying to create it as a sort of living review where, over time, hopefully the precision will increase.”

Higgins has big plans for the next phase. By adding more granular detail from studies, he hopes to produce a database that would allow more targeted toolkits, tailored for a secondary science teacher or an early-years teacher interested in literacy outcomes, for example. “If we can get sufficient interest in what we are doing, we may be able to create something that has greater precision. That, for me, is the exciting bit,” he says.

While that precision is honed, though, exactly how should schools and councils be using the information in the toolkit? And has presenting the findings in such a simple way meant that one aim – accessibility – has been achieved with the unintended consequence of ill-informed decision-making?

“I hope that what people are using it to do is to have active discussions about what they intend to do and how they intend to bring about improvement in schools, and if that is the case, then I am certainly all for that.”

He advocates that schools and teachers identify challenges or difficulties they want to address and then use the toolkit as a source of evidence to test out whether their ideas in this area are likely to be cost-effective and beneficial, rather than simply working through the different interventions.

“I see the toolkit more as a risk register: things [rated] at the top are likely to be good bets and things at the bottom are likely to be poor bets,” he says. “But how well a particular school implements a policy or approach – plus what the specific conditions and challenges are in that school – may mean that they benefit from something that doesn’t always perform particularly well.

“Also, things which are usually good bets might not actually work well for a school if it is something they are already good at. Getting people to make nuanced professional decisions around evidence is very hard with the current toolkit framework and structure.”

Higgins hopes that, in the next few years, teachers will begin to view the toolkit as “a guide to think about, rather than something that should determine what you want to do”.

“I think of it as a guide to what has worked, past tense. The idea that this is what will work in the future is a much riskier proposition. My personal view is, overall, it’s been useful – it’s helped us understand what we are trying to do and it will help us do it better in the future. But it is not perfect.”

Chris Parr is a freelance journalist
