Last week the government announced two major overhauls to the pupil premium (PP) conditions of funding.
The first was that from the academic year 2021-22, “schools must demonstrate how their spending decisions are informed by research evidence” and “make reference to a range of sources” in doing so.
The second was that by the end of December 2021, all schools must use templated documents to outline their 2021-22 pupil premium strategy, removing the option for schools to choose how they present their strategies.
These changes were announced with little fanfare, with the research requirement warranting just a few lines as an aside in the summer school funding catch-up plans.
So what do the new measures mean for schools and will they make a meaningful difference – or just create more paperwork?
The changes to pupil premium conditions of funding
1. Research requirements
Of the two changes, perhaps the more notable is the requirement for schools to reference research that has been used to justify PP strategies and spending.
Yet there is no direct guidance from the Department for Education on what sort of research this might be or what is acceptable – except to advise schools to use the Education Endowment Foundation’s toolkit. This document offers a breakdown of 35 different areas of school life – from school uniform to bullying, homework to class sizes – with each area including a subheading: "How secure is the research?"
Professor Becky Francis, CEO of the Education Endowment Foundation, says that consolidating this focus on research is a good move and should help to ensure past successes are built on in the future.
"Evidence of what has worked for schools in the past - from the EEF Toolkit as well as other external sources – puts us in a much better position for judging what is likely to work in the future," she says.
"So it is welcome that the DfE is recommending that schools use evidence to support their decision making. Combined with teachers’ professional judgement and knowledge of their own pupils, evidence can help schools use their resources in ways that really make a difference for their pupils.”
Furthermore, the DfE reiterates that the research should be used to inform three key areas of PP strategies:
- Support the quality of teaching, such as staff professional development.
- Provide targeted academic support, such as tutoring.
- Tackle non-academic barriers to success in school, such as attendance, behaviour and social and emotional support.
Steve Edmonds, director of advice and guidance at the National Governance Association (NGA), who has recently updated the organisation’s guidance to members on what these changes mean, says that, overall, the explicit move to focus on using research to inform spending is not surprising.
“That’s always been accepted, really – it’s certainly always been in our guidance that school leaders should point governing boards to research that supports decisions on pupil premium, so there are evidence-based strategies. It’s something governing boards expect to see.”
What’s more, he says that the guiding principles that should inform PP spending and its outcome are also nothing new, and so any research already used should align with these goals.
“Those pointers are clearly from the EEF document and reinforce a guiding principle that to raise attainment is to improve quality of teaching – this was something [education recovery commissioner] Sir Kevan Collins said last week,” Edmonds says.
“It’s not something that hasn’t been said many times before and from an NGA perspective, we have always advised schools to be evidence-based, with a high impact low-cost approach, based on research, so that does not appear to have changed too much.”
Julia Hinchcliffe, headteacher at Orchard School in Bristol, agrees that, overall, the requirement for research to be referenced in PP strategy documents makes sense and should not be too onerous.
“It is a useful guide, and we [SLT] all agree it’s sensible as professionals to make use of evidence-informed decisions on spending this premium. We are already trying to do so and have made use of the EEF Toolkit for years now.”
What is considered 'good' research?
However, she offers a caveat: what counts as acceptable research – and how often it needs to be updated – must be managed carefully to avoid it becoming burdensome.
“Where it’s ‘obvious’, I don’t want to spend time looking for evidence. For example, our school handed out over 300 laptops to students in receipt of PP during the most recent lockdown,” she says.
“I don’t think anyone would expect this to be ‘evidenced’ – it’s obvious that if a child doesn’t have a dongle or laptop that the quality of teaching and learning they can access will be limited, and gaps will become greater.”
Furthermore, she notes that leaving schools to become the arbiters of what is deemed acceptable research could be a burden.
“I received an email today informing me of research from the University of Leeds psychology department finding that skipping breakfast at KS4 impacted GCSE outcomes by almost two grades,” she notes.
“This research was sponsored by the Economic and Social Research Council, but I have no knowledge whether this is ‘good’ evidence or not. I don’t have access to the Frontiers in Public Health journal to read it for myself. So some independent support with collating research evidence would be helpful.”
Daniel Woodrow, headteacher of St Gregory CEVC Primary School in Sudbury, Suffolk, also has this concern.
“What research is the right research? Or does it matter who funded it? What if there is another piece of research that says the exact opposite? Or if it mentions a specific product in that research, might schools feel obliged to only use that product?
“If research came out that said the best thing we can do is not worry about academic attainment for the rest of this term, and we followed that and we explained that in our template, would that be OK?”
This last point is perhaps not meant as a serious scenario, but it underlines the real-world complications that an on-paper, sensible-sounding "research-informed" approach creates.
Researching how to research
It also shows that, understandably, there could be a requirement for teachers to become more skilled at understanding how to find the "right" research, discern if it is useful and how to interpret it – no mean feat.
One organisation that Hinchcliffe suggests could help tackle these challenges is the Chartered College of Teaching.
Cat Scutt, director of education and research at the Chartered College of Teaching, tells Tes there are no direct plans to create guidance on this new requirement but believes existing programmes can help to boost expertise in this area as the focus on research becomes a centralised requirement.
“We are not planning on anything in response to this requirement, but I think the wider work we are doing, such as the Certification in Evidence-Informed Practice, which helps teachers to know what to look for in research, to avoid cognitive biases and so forth, can help here,” she says.
Furthermore, she says this ongoing focus on research to inform practice in education is to be welcomed, and the new requirement is not a surprise from a policy point of view. However, Scutt worries that one unintended consequence could be that schools become overly focused on the exemplar template.
“It may be that people download the exemplar template and think, ‘This is how we should be spending the money.’ But there isn’t a single right answer when it comes to research. You have to use research evidence combined with your professional judgement and the context you’re in," she says.
“So we need to avoid there being an idea that there is an exact plan every school should be doing but instead make decisions in their context, using their knowledge alongside research.”
Pick your own?
This reference to a school using its own knowledge leads to an unanswered question, though – will a school’s own internal research count as acceptable?
The DfE hasn’t said anything explicitly on this, although further guidance is expected that may cover it.
If and when it does, the hope in schools will be that it is deemed permissible: “I assume research we have generated from within our own school is deemed sufficiently ‘valuable’, too,” says Hinchcliffe.
This is no small point either. After all, who knows better than a school’s leadership team and governors what challenges it faces and how money may be best spent, based on years of insights built up in that specific setting?
Julie Cassiano, the headteacher of Vernon Terrace Primary School in Northampton, sees it this way: “I would argue that there are schools who are successfully closing the [attainment] gap using internal action research of years of personal evaluative processes.”
As such, she says, there needs to be an understanding that “proving impact rather than how you are getting the impact” is important.
Woodrow also makes this point, noting that a big silver lining of the pandemic was schools getting to know their communities better and what families and pupils need and reacting accordingly – something that may now be hampered.
“One thing we are looking at doing is splitting one cohort into two classes – we have two experienced teachers to do this and have the budget to make it work. But now, do I have to go through and find research to justify splitting them into smaller classes?”
It remains to be seen if the DfE deems this acceptable – but Scutt offers a note of optimism: “I can’t imagine a school that has addressed something internally and has data from that has not also looked at wider evidence to help understand what might work well in this context.”
If this is the case, it may be that schools can legitimately “retrofit” external research to justify certain spending decisions based on internal research.
2. Mandatory templates
Meanwhile, the move to the mandatory use of templates to outline how PP money is being spent, and the strategies deployed, is another notable development.
This will come into force from December, and the templates are available now – both blank versions to fill in and examples of what they might look like on completion – for primary, secondary and special educational needs schools.
This is clearly a key focus for the DfE, which notes that strategies must be published annually and that monitoring checks will be carried out on a sample of schools’ published reports.
Edmonds says that in one way this is not that big a change: it simply brings more uniformity to something all schools were doing anyway.
“From a governing perspective, I don’t think it’s a major issue because stating how that money is being spent to improve the quality of teaching is something schools already do and boards are used to overseeing.”
As such, doing this through a single templated form may not be that difficult – and he notes that the forms have been available for some time, so many schools may have been using them anyway.
This is the case for Cassiano: “Personally, I do not have an issue as it is something I am working on anyway. Maybe having a template that all schools use could be useful for scrutiny purposes.”
Hinchcliffe also says she isn’t too worried about this template approach to outlining PP spending because it ultimately should contain the same content as any other means of presentation – which is more important than how it’s presented.
However, she says she would be wary if the requirement to outline these approaches every 12 months effectively meant that an entire new strategy had to be drawn up – not least because proving that a new approach has worked does not usually neatly line up with accountability time frames.
“We can update small elements of the plan each year as we understand what is working well, but certainly not a fresh strategy each year – that goes against evidence-informed approaches. Annual strategies may suit funding, but they don’t suit children’s needs.
“The best strategies to reduce gaps in outcomes don’t tend to be designed in handy 12-month cycles.”
A lack of trust
For some, though, the template move touches on something deeper across education: “The idea of a template is unnecessary and another indicator of a wider issue of not trusting leaders and schools,” says Woodrow.
“I don’t think it will change things too much and it will be a headache for schools and another thing to add to the list.”
He notes, too, that the templates are “not well-designed” as they contain no space for research to be included and so will likely have to be reproduced anyway.
Michael Tidd, headteacher at East Preston Junior School in West Sussex, has also made this observation in a column this week for Tes.
“There’s nowhere clear on the form to set out the evidence underpinning your decisions, despite the fact that this is to be required from this year. Indeed, the model report provided makes no attempt to link their actions to any evidence at all.”
The DfE will, surely, publish new templates to overcome these shortcomings, but, even so, it’s another example of a lack of “joined-up thinking”, according to Woodrow.
A loss of flexibility
Perhaps more fundamentally, though, by being forced into a template approach, there is a concern that schools will lose the capacity and confidence to outline exactly why they have taken certain spending decisions when they don’t fit into a neat, prepared form.
“Previously schools had scope to choose how their strategy was presented and that allowed them to be innovative with that and make it clear what the impact was,” notes Edmonds.
“So some schools may feel constrained by that and it will challenge them.”
This is exactly what Woodrow thinks: “We have worked hard to come up with the right forms and the idea this [templates] will be easier for parents and governors to understand does not make sense as we explain them to our community and why we are making certain decisions.”
Because of this, he says he can imagine the school ending up filling in both the templated version, to satisfy the government’s requirements, and its existing version, which it feels better outlines in full the decisions taken and why.
Which sounds a lot like more paperwork.
Dan Worth is senior editor at Tes